Math for Deep Learning: What You Need to Know to Understand Neural Networks
Math for Deep Learning provides the essential math you need to understand deep learning discussions, explore more complex implementations, and make better use of the deep learning toolkits.
With Math for Deep Learning, you’ll learn the essential mathematics that deep learning uses directly and relies on as background.
You’ll work through Python examples to learn key deep learning–related topics in probability, statistics, linear algebra, differential calculus, and matrix calculus, as well as how to implement data flow in a neural network, backpropagation, and gradient descent. You’ll also use Python to work through the mathematics that underlies those algorithms and even build a fully functional neural network.
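As a taste of the kind of Python example the book works through, here is a minimal sketch (my own illustration, not code from the book) of data flow through one fully connected neural network layer, computing y = sigmoid(Wx + b) with NumPy:

```python
import numpy as np

def sigmoid(z):
    """Logistic activation, squashing values into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))   # weight matrix: 4 inputs -> 3 outputs
b = np.zeros(3)               # bias vector
x = rng.normal(size=4)        # a single input vector

y = sigmoid(W @ x + b)        # forward pass through the layer
print(y.shape)                # (3,)
```

Stacking layers like this one, each feeding its output to the next, gives the data flow of a traditional fully connected network.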
In addition, you’ll find coverage of gradient descent, including the variations commonly used by the deep learning community: SGD, Adam, RMSprop, and Adagrad/Adadelta.
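To illustrate what those variations change, here is a small sketch (again my own, not the book's code) comparing plain gradient descent with the momentum variant on the toy function f(x) = x², whose gradient is 2x:

```python
def grad(x):
    """Gradient of f(x) = x**2."""
    return 2.0 * x

lr = 0.1  # learning rate (step size)

# Plain gradient descent: x <- x - lr * grad(x)
x = 5.0
for _ in range(100):
    x -= lr * grad(x)

# Momentum: a velocity term accumulates past gradients,
#   v <- mu * v - lr * grad(x);  x <- x + v
xm, v, mu = 5.0, 0.0, 0.9
for _ in range(100):
    v = mu * v - lr * grad(xm)
    xm += v

print(abs(x), abs(xm))  # both approach the minimum at x = 0
```

Adaptive methods such as Adam, RMSprop, and Adagrad/Adadelta go a step further by scaling the learning rate per parameter using running statistics of past gradients.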
Cover Page
Title Page
Copyright Page
Dedication
About the Author
About the Technical Reviewer
BRIEF CONTENTS
CONTENTS IN DETAIL
FOREWORD
ACKNOWLEDGMENTS
INTRODUCTION
  Who Is This Book For?
  About This Book
1 SETTING THE STAGE
  Installing the Toolkits
  NumPy
  SciPy
  Matplotlib
  Scikit-Learn
  Summary
2 PROBABILITY
  Basic Concepts
  The Rules of Probability
  Joint and Marginal Probability
  Summary
3 MORE PROBABILITY
  Probability Distributions
  Bayes’ Theorem
  Summary
4 STATISTICS
  Types of Data
  Summary Statistics
  Quantiles and Box Plots
  Missing Data
  Correlation
  Hypothesis Testing
  Summary
5 LINEAR ALGEBRA
  Scalars, Vectors, Matrices, and Tensors
  Arithmetic with Tensors
  Summary
6 MORE LINEAR ALGEBRA
  Square Matrices
  Eigenvectors and Eigenvalues
  Vector Norms and Distance Metrics
  Principal Component Analysis
  Singular Value Decomposition and Pseudoinverse
  Summary
7 DIFFERENTIAL CALCULUS
  Slope
  Derivatives
  Minima and Maxima of Functions
  Partial Derivatives
  Gradients
  Summary
8 MATRIX CALCULUS
  The Formulas
  The Identities
  Jacobians and Hessians
  Some Examples of Matrix Calculus Derivatives
  Summary
9 DATA FLOW IN NEURAL NETWORKS
  Representing Data
  Data Flow in Traditional Neural Networks
  Data Flow in Convolutional Neural Networks
  Summary
10 BACKPROPAGATION
  What Is Backpropagation?
  Backpropagation by Hand
  Backpropagation for Fully Connected Networks
  Computational Graphs
  Summary
11 GRADIENT DESCENT
  The Basic Idea
  Stochastic Gradient Descent
  Momentum
  Adaptive Gradient Descent
  Summary
Epilogue
APPENDIX: GOING FURTHER
  Probability and Statistics
  Linear Algebra
  Calculus
  Deep Learning
INDEX
How to download the source code
1. Go to:
2. Search for the book title:
Math for Deep Learning: What You Need to Know to Understand Neural Networks (if you don’t get any results, search for the main title only)
3. Click the book title in the search results
4. Download the source code:
1. Disable the AdBlock plugin; otherwise, you may not get any links.
2. Solve the CAPTCHA.
3. Click the download link.
4. You will be taken to the download server, where the file will download.