Machine Learning for Beginners: Learn to Build Machine Learning Systems Using Python
- Length: 262 pages
- Edition: 1
- Language: English
- Publisher: BPB Publications
- Publication Date: 2020-08-21
- ISBN-10: 9389845424
- ISBN-13: 9789389845426
Get familiar with various Supervised, Unsupervised and Reinforcement learning algorithms
Key Features
- Understand the types of Machine learning.
- Get familiar with different Feature extraction methods.
- Get an overview of how Neural Network Algorithms work.
- Learn how to implement Decision Trees and Random Forests.
- The book not only explains the Classification algorithms but also discusses their derivations and the underlying mathematical modeling.
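As an illustration of the kind of workflow the book walks through for Decision Trees and Random Forests, here is a minimal sketch using scikit-learn; the Iris dataset and the specific hyperparameters are our choices, not the book's.

```python
# Fit a single decision tree and a random forest on the same train/test split.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# A depth-limited tree, then an ensemble of 100 trees.
tree = DecisionTreeClassifier(max_depth=3, random_state=42).fit(X_train, y_train)
forest = RandomForestClassifier(n_estimators=100, random_state=42).fit(X_train, y_train)

tree_acc = tree.score(X_test, y_test)
forest_acc = forest.score(X_test, y_test)
```

Limiting `max_depth` on the single tree is one way of "containing the depth of a tree", a topic the Decision Trees chapter covers.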
Description
This book covers important concepts and topics in Machine Learning. It begins with Data Cleansing, then discusses training and testing, and cross-validation. The book covers Feature Selection, presenting algorithms and implementations of the most common Feature Selection techniques. It then focuses on Linear Regression and Gradient Descent. Some of the important Classification techniques, such as K-nearest neighbors, Logistic Regression, Naïve Bayes, and Linear Discriminant Analysis, are covered next. The book then gives an overview of Neural Networks, explaining the biological background, the limitations of the perceptron, and the backpropagation model. Support Vector Machines and Kernel methods are also included, and the book then shows how to implement Decision Trees and Random Forests.
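To give a flavour of the Linear Regression and Gradient Descent material mentioned above, here is a from-scratch sketch of fitting a line by gradient descent on the mean-squared-error cost; the synthetic data, variable names, and learning rate are our own illustrative choices.

```python
# Fit y = w*x + b by gradient descent on synthetic data generated
# from the true line y = 3x + 2 with a little Gaussian noise.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=200)
y = 3.0 * X + 2.0 + rng.normal(0, 0.05, size=200)

w, b = 0.0, 0.0          # weight and bias, initialised to zero
lr = 0.5                 # learning rate
for _ in range(2000):
    error = (w * X + b) - y
    # Gradients of the mean-squared-error cost with respect to w and b.
    grad_w = 2 * np.mean(error * X)
    grad_b = 2 * np.mean(error)
    w -= lr * grad_w
    b -= lr * grad_b
```

After enough iterations `w` and `b` settle close to the true values 3 and 2; the "finding weights without iteration" section of the Regression chapter solves the same problem in closed form.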
Towards the end, the book gives a brief overview of Unsupervised Learning. Various Feature Extraction techniques, such as Fourier Transform, STFT, and Local Binary Patterns, are covered. The book also discusses Principal Component Analysis and its implementation.
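A short sketch of Principal Component Analysis with scikit-learn, in the spirit of the Feature Extraction chapter; the Iris dataset and the choice of two components are our own example, not the book's.

```python
# Project the 4-dimensional Iris measurements onto their two
# strongest principal directions.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)
pca = PCA(n_components=2)          # keep the two leading components
X_reduced = pca.fit_transform(X)   # shape (150, 2)

# Fraction of the original variance retained by the projection.
explained = pca.explained_variance_ratio_.sum()
```

For this dataset the first two components retain well over 90% of the variance, which is why PCA is a popular dimensionality-reduction step before classification.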
What will you learn
- Learn how to prepare Data for Machine Learning.
- Learn how to implement learning algorithms from scratch.
- Use scikit-learn to implement algorithms.
- Use various Feature Selection and Feature Extraction methods.
- Learn how to develop a Face recognition system.
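The book's two-track approach — implementing algorithms from scratch and then with scikit-learn — can be sketched for K-nearest neighbors as follows; the helper name `predict_knn` and the dataset are our own illustrative choices.

```python
# Compare a hand-rolled k-nearest-neighbours vote with
# scikit-learn's KNeighborsClassifier on the same sample.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

def predict_knn(X_train, y_train, x, k=5):
    """Label x by majority vote among its k nearest training points."""
    dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
    nearest = y_train[np.argsort(dists)[:k]]      # labels of k closest
    return int(np.bincount(nearest).argmax())     # majority vote

clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
sample = X[0]
scratch_label = predict_knn(X, y, sample)
sklearn_label = int(clf.predict([sample])[0])
```

Both routes label the sample identically, which is the point of writing the algorithm by hand first: the library call stops being a black box.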
Who this book is for
The book is designed for undergraduate and postgraduate Computer Science students and for professionals who intend to switch to the fascinating world of Machine Learning. It requires a basic knowledge of programming fundamentals, Python in particular.
About the Author
Harsh Bhasin is an Applied Machine Learning researcher. Mr. Bhasin worked as an Assistant Professor at Jamia Hamdard, New Delhi, and has taught as guest faculty at various institutes, including Delhi Technological University. Before that, he worked in C# client-side development and algorithm development.
Mr. Bhasin has authored papers published in renowned journals, including Soft Computing (Springer), BMC Medical Informatics and Decision Making, and AI and Society. He is a reviewer for prominent journals and has edited a few special issues. He is also a recipient of a distinguished fellowship.
Outside work, he is deeply interested in progressive-era Hindi poetry and in Hindustani classical music, particularly percussion instruments.
His areas of interest include Data Structures, Algorithm Analysis and Design, Theory of Computation, Python, Machine Learning, and Deep Learning.
LinkedIn Profile:
https://in.linkedin.com/in/harsh-bhasin-69134426
Table of Contents
Front matter: Cover Page, Title Page, Copyright Page, Dedication Page, About the Author, About the Reviewer, Acknowledgements, Preface, Errata, Table of Contents
Each chapter opens with an Introduction, Structure, and Objective and closes with a Conclusion and Exercises (Multiple Choice Questions plus Theory and Programming/Numerical problems).
1. An Introduction to Machine Learning: Conventional algorithm and machine learning; Types of learning (Supervised machine learning, Unsupervised learning); Working (Data; Train-test-validation data; Rest of the steps); Applications (Natural Language Processing, Weather forecasting, Robot control, Speech recognition, Business Intelligence); History; Explore
2. The Beginning: Pre-Processing and Feature Selection: Dealing with missing values and ‘NaN’; Converting a continuous variable to a categorical variable; Feature selection (Chi-Squared test, Pearson correlation, Variance threshold)
3. Regression: The line of best fit; Gradient descent method; Implementation; Linear regression using SKLearn; Experiments (Boston Housing Dataset with linear regression under 10-fold validation and under a train-test split); Finding weights without iteration; Regression using K-nearest neighbors
4. Classification: Basics; Classification using K-nearest neighbors (Algorithm; Implementation; The KNeighborsClassifier in SKLearn; Experiments); Logistic regression (Logistic regression using SKLearn; Experiments); Naïve Bayes classifier (The GaussianNB classifier of SKLearn; Implementation of Gaussian Naïve Bayes)
5. Neural Network I – The Perceptron: The brain; The neuron; The McCulloch Pitts model and its limitations; The Rosenblatt perceptron model; Algorithm; Activation functions (Unit step, sgn, Sigmoid and its derivative, tan-hyperbolic); Implementation; Learning; Perceptron using sklearn; Experiments (Fisher Iris data with and without a train-test split; Breast Cancer data with and without 10-fold validation)
6. Neural Network II – The Multi-Layer Perceptron: History; Introduction to multi-layer perceptrons; Architecture; Backpropagation algorithm; Learning; Implementation; Multilayer perceptron using sklearn; Experiments
7. Support Vector Machines: The Maximum Margin Classifier; Maximizing the margins; The non-separable patterns and the cost parameter; The kernel trick; SKLEARN.SVM.SVC; Experiments
8. Decision Trees: Basics; Discretization; Containing the depth of a tree; Implementation of a decision tree using sklearn; Experiments (Iris dataset with three classes; Breast Cancer dataset with two classes)
9. Clustering: K-means (algorithm, implementation, and experiments); Spectral clustering (algorithm, implementation, and experiments); Hierarchical/Agglomerative clustering (implementation and experiments); DBSCAN
10. Feature Extraction: Fourier Transform; Patches (sklearn.feature_extraction.image.extract_patches_2d); Histogram of oriented gradients; Principal component analysis
Appendix 1. Cheat Sheet – Pandas: Creating a Pandas Series (using a List, a NumPy array, or a Dictionary); Indexing; Slicing; Common methods; Boolean indexing; DataFrame creation; Adding and deleting columns and rows; unique and nunique; Iterating a Pandas DataFrame
Appendix 2. Face Classification: Data; Conversion to grayscale; Methods; Feature extraction; Splitting of data; Feature Selection (Forward Feature Selection); Classifier; Observation and Conclusion
Bibliography: General; Nearest Neighbors; Neural Networks; Support Vector Machines; Decision Trees; Clustering; Fourier Transform; Principal Component Analysis; Histogram of Oriented Gradients