Books related to Artificial Intelligence, Machine Learning, Deep Learning and Neural Networks
- link to ONLINE book Neural Networks and Deep Learning by Michael Nielsen
- link to ONLINE book Deep Learning (MIT Press)
- link to Tools to Help Tensorflow Development
- link to PAIR Code - code repositories from the People + AI Research (PAIR) Initiative
- link to Tools & Platforms from PAIR
- Book - Math for AI - Basics of Linear Algebra for Machine Learning (Examples in Python Code) 212 Pages · 2017
- Book Artificial Intelligence A Modern Approach (3rd Edition) 1154 Pages 2010
- Book - Deep Learning - MIT PRESS - Book Online 2019
- Book Deep Learning - Fundamentals, Theory and Applications 168 Pages 2019
- Book Deep Learning with Applications Using Python - TensorFlow and Keras - Chatbots, Object and Speech Recognition 227 Pages 2018
- Book Deep Learning with Python 386 Pages 2017
- Book Deep Learning with R 341 Pages 2017
- Book Deep Learning - Adaptive Computation and Machine Learning 801 Pages 2016
- Book Unsupervised Deep Learning in Python 100 Pages 2016
- Book - Neural Networks and Deep Learning - Michael Nielsen - 281 pages Oct 2018
- Book Learn Keras for Deep Neural Networks - A Fast-Track Approach with Python 192 Pages 2019
- Book Convolutional Neural Networks in Python 75 Pages 2016
- Book Convolutional Neural Networks in Visual Computing 187 Pages 2018
- Book Learning Tensorflow - A Guide to Building Deep Learning Systems 242 Pages 2017
- Book - TensorFlow - Getting Started With TensorFlow 178 Pages
- Book Machine Learning with Python Cookbook - Practical Solutions 366 Pages 2018
- Book Reinforcement Learning - With Open AI, TensorFlow and Keras Using Python 174 Pages 2018
- Book Guide to Convolutional Neural Networks - Traffic-Sign Detection and Classification 303 Pages 2017
- Book Big Data SMACK - A Guide to Apache Spark, Mesos, Akka, Cassandra, and Kafka 277 Pages 2016
- Book Spark - The Definitive Guide - Big Data Processing Made Simple 601 Pages 2018
- Book Data Analysis From Scratch With Python, Pandas, NumPy, Scikit-Learn, IPython, TensorFlow and Matplotlib 104 pages 2018
- Book Advanced Data Analytics Using Python - With Machine Learning, Deep Learning and NLP Examples 195 Pages 2018
- Book Practical Data Science with R 417 Pages 2014
- Book R in Action - Data analysis and graphics with R 474 Pages 2011
- Book Learn R for Applied Statistics - With Data Visualizations, Regressions, and Statistics 254 Pages 2019
- Book R Markdown - The Definitive Guide 339 Pages 2018
In this tutorial, the basic steps of the Gauss elimination (or Gaussian elimination) method for solving a system of linear equations are explained in detail with examples, algorithms and Python code. Gauss elimination (after Carl Friedrich Gauss, 1777-1855) is the basis of all other elimination methods applied to solve systems of linear equations.
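The sketch below (not the tutorial's own listing) shows the two phases of the method, forward elimination followed by back substitution, on an illustrative 3x3 system; it assumes the matrix is nonsingular and that no row pivoting is needed.

```python
import numpy as np

def gauss_elimination(A, b):
    """Solve Ax = b by forward elimination followed by back substitution (no pivoting)."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    # Forward elimination: zero out the entries below the diagonal, column by column.
    for k in range(n - 1):
        for i in range(k + 1, n):
            factor = A[i, k] / A[k, k]
            A[i, k:] -= factor * A[k, k:]
            b[i] -= factor * b[k]
    # Back substitution on the resulting upper-triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# Illustrative system: 2x + y - z = 8, -3x - y + 2z = -11, -2x + y + 2z = -3
A = np.array([[2.0, 1.0, -1.0],
              [-3.0, -1.0, 2.0],
              [-2.0, 1.0, 2.0]])
b = np.array([8.0, -11.0, -3.0])
print(gauss_elimination(A, b))   # [ 2.  3. -1.]
```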
In this video, the Cholesky factorization method (after André-Louis Cholesky) is explained with examples. The tutorial covers the definitions of the LU decomposition and the Cholesky decomposition, the conditions for a Cholesky decomposition to exist, the use of NumPy eigenvalue functions to test positive definiteness, the derivation of the Cholesky algorithm, and the implementation in Python.
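As a rough sketch of the algorithm outlined above (not the tutorial's own code), the following snippet builds the lower-triangular factor entry by entry and uses NumPy's eigenvalue function to confirm positive definiteness first; the example matrix is illustrative.

```python
import numpy as np

def cholesky(A):
    """Return lower-triangular L with A = L @ L.T, assuming A is symmetric positive definite."""
    n = A.shape[0]
    L = np.zeros_like(A, dtype=float)
    for i in range(n):
        for j in range(i + 1):
            s = A[i, j] - L[i, :j] @ L[j, :j]
            if i == j:
                L[i, j] = np.sqrt(s)      # diagonal entry
            else:
                L[i, j] = s / L[j, j]     # below-diagonal entry
    return L

A = np.array([[4.0, 12.0, -16.0],
              [12.0, 37.0, -43.0],
              [-16.0, -43.0, 98.0]])

# Positive definiteness test via NumPy eigenvalues, as mentioned in the tutorial.
print(np.all(np.linalg.eigvals(A) > 0))       # True
L = cholesky(A)
print(np.allclose(L @ L.T, A))                # True
print(np.allclose(L, np.linalg.cholesky(A)))  # agrees with NumPy's built-in
```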
In this tutorial, the procedure of the Gauss-Jordan elimination method is explained step by step using symbolic and numeric examples. The general formulas and the Gauss-Jordan algorithm are then applied to write Python code that solves the numeric example.
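The following sketch (again, not the tutorial's own listing) applies Gauss-Jordan elimination to the augmented matrix, with partial pivoting added for robustness; the 3x3 system is illustrative.

```python
import numpy as np

def gauss_jordan(A, b):
    """Reduce the augmented matrix [A | b] to reduced row-echelon form and return x."""
    n = len(b)
    M = np.hstack([A.astype(float), b.astype(float).reshape(-1, 1)])
    for k in range(n):
        # Partial pivoting: bring the largest remaining entry of column k to the pivot row.
        p = k + np.argmax(np.abs(M[k:, k]))
        M[[k, p]] = M[[p, k]]
        # Normalize the pivot row, then clear column k in every other row.
        M[k] = M[k] / M[k, k]
        for i in range(n):
            if i != k:
                M[i] -= M[i, k] * M[k]
    return M[:, -1]

# Illustrative system: x + 2y + 3z = 9, 2x - y + z = 8, 3x - z = 3
A = np.array([[1.0, 2.0, 3.0],
              [2.0, -1.0, 1.0],
              [3.0, 0.0, -1.0]])
b = np.array([9.0, 8.0, 3.0])
print(gauss_jordan(A, b))   # [ 2. -1.  3.]
```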
The Lagrange interpolation (or Lagrangian interpolation) method is one of the most basic and common ways to construct an interpolation polynomial. It is named after the great mathematician Joseph-Louis Lagrange (1736-1813). This tutorial explains the Lagrangian polynomial form of the interpolation function and the algorithm of the method, then develops the Python code first with plain Python lists and basic for loops and then with NumPy arrays and conditional slicing, and finally plots the interpolation function against the given data points using the matplotlib.pyplot module.
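A minimal NumPy-based sketch of the Lagrange form described above (the data points and plotting details are illustrative, not the tutorial's own example):

```python
import numpy as np
import matplotlib.pyplot as plt

def lagrange_interp(x_data, y_data, x):
    """Evaluate the Lagrange interpolation polynomial through (x_data, y_data) at x."""
    x = np.asarray(x, dtype=float)
    result = np.zeros_like(x)
    for i in range(len(x_data)):
        # Build the i-th Lagrange basis polynomial L_i(x).
        L = np.ones_like(x)
        for j in range(len(x_data)):
            if j != i:
                L *= (x - x_data[j]) / (x_data[i] - x_data[j])
        result += y_data[i] * L
    return result

# Illustrative data points.
x_data = np.array([0.0, 1.0, 2.0, 4.0])
y_data = np.array([1.0, 2.0, 5.0, 17.0])

xs = np.linspace(x_data.min(), x_data.max(), 200)
plt.plot(xs, lagrange_interp(x_data, y_data, xs), label="Lagrange polynomial")
plt.plot(x_data, y_data, "o", label="data points")
plt.legend()
plt.show()
```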
The binomial distribution gives the probability of each possible number of successes in N independent trials, where each trial has probability π (the Greek letter pi) of success. For the coin-flip example, N = 2 and π = 0.5.
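A short check of the coin-flip example above using the binomial probability mass formula (standard-library Python, written here for illustration):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(exactly k successes in n independent trials, each with success probability p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Coin-flip example from the text: N = 2 trials, pi = 0.5
for k in range(3):
    print(k, binomial_pmf(k, 2, 0.5))   # 0.25, 0.5, 0.25
```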
This calculus video tutorial provides a basic introduction to the normal distribution and probability. It explains how to solve normal distribution problems using a simple chart and using calculus, by evaluating the definite integral of the probability density function for a bell-shaped (normal distribution) curve. The video contains one practice problem in the form of a word problem with many parts, giving you plenty of examples to master this topic, and shows how to evaluate the definite integral with Wolfram Alpha's online calculator for definite integrals. You need to determine the population mean mu and the standard deviation sigma, as well as the lower and upper limits of integration, in order to find the probability that a continuous random variable X falls within a certain range of values. You also need to be familiar with the 68-95-99.7 rule: approximately 68% of the population lies within 1 standard deviation of the population mean, 95% lies within 2 standard deviations, and 99.7% lies within 3 standard deviations.
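As a sketch of the calculation described above (the mean and standard deviation values are illustrative, not from the video), the snippet below integrates the normal probability density function numerically and reproduces the 68-95-99.7 rule:

```python
import numpy as np
from scipy.integrate import quad

def normal_pdf(x, mu, sigma):
    """Probability density function of a normal distribution with mean mu and std sigma."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

mu, sigma = 100.0, 15.0   # illustrative population mean and standard deviation

# Integrate the PDF over 1, 2 and 3 standard deviations around the mean.
for k in (1, 2, 3):
    prob, _ = quad(normal_pdf, mu - k * sigma, mu + k * sigma, args=(mu, sigma))
    print(f"P(mu - {k}*sigma < X < mu + {k}*sigma) = {prob:.4f}")
# ~0.6827, ~0.9545, ~0.9973
```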