Deep Learning with C#, .Net and Kelp.Net: The Ultimate Kelp.Net Deep Learning Guide
- Length: 414 pages
- Edition: 1
- Language: English
- Publisher: BPB Publications
- Publication Date: 2019-05-10
- ISBN-10: 9388511018
- ISBN-13: 9789388511018
- Sales Rank: #3924939
Get hands-on with Kelp.Net, a deep learning framework written entirely in C# for .Net
Key Features
- Deep Learning basics
- A comprehensive Kelp.Net reference guide
- C# Deep Learning code: develop state-of-the-art models with minimal code
- Loading and saving Deep Learning models
- Many sample models and tests
- An OpenCL reference
- An intuitive, user-friendly API that makes it easy to add deep learning to your applications
Description
Deep Learning with Kelp.Net is the ultimate reference for C# .Net developers who are passionate about deep learning. Readers will learn all the skills necessary to develop powerful, scalable, and flexible deep learning models with a fluid, easy-to-use API. Upon completing the book, the reader will have all the tools needed to add powerful deep learning capabilities to new or existing applications.
What you will learn
- In-depth knowledge of Kelp.Net
- How to develop Deep Learning models
- C# Deep Learning programming
- The Open Computing Language (OpenCL)
- Loading and saving Deep Learning models
- How to develop and use activation functions
- How to test Deep Learning models
Who This Book Is For
This book targets C# .Net developers who are passionate about deep learning and want an easy, intuitive API for building it.
Table of Contents
- Introduction
- ML/DL Terms and Concepts
- Deep Instrumentation
- Kelp.Net Reference
- Loading and Saving Models
- Model Testing and Training
- Sample Deep Learning Tests
- Creating Your Own Deep Learning Tests
- Appendix A: Evaluation Metrics
- Appendix B: OpenCL
About the Author
Matt R. Cole is a seasoned developer and published author with over 30 years’ experience in Microsoft Windows, C, C++, C# and .Net.
He is the owner of Evolved AI Solutions, a premier provider of advanced Machine Learning/Bio-AI technologies.
He developed the first enterprise-grade MicroService framework written completely in C# and .Net, which is used in production by a major hedge fund in NYC.
He also developed the first Bio Artificial Intelligence framework that fully integrates mirror and canonical neurons. He continues to push the limits of Machine Learning, Biological Artificial Intelligence, Deep Learning and MicroServices.
In his spare time Matt loves to continue his education and contribute to open source efforts such as Kelp.Net.
His Website: www.evolvedaisolutions.com
His LinkedIn Profile: www.linkedin.com/in/evolvedai/
His Blog: www.evolvedaisolutions.com/blog.html
Detailed Contents
- Front matter: Cover; Copyright; About the Author; Reviewer; Preface; Acknowledgement; Errata; Table of Contents
- 1. Take This ___ and ___ It: Objectives of this book; Neural network overview; Machine learning overview; Deep learning overview; Complexity; Machine and deep learning differences; Summary
- 2. Machine Learning/Deep Learning Terms and Concepts: Overview; Neuron/Perceptron; Multi-Layer Perceptron (MLP); Features; Weights; Bias; Activation Function (Sigmoid, ReLU (Rectified Linear Units), Softmax); Neural network; Input/Output/Hidden Layers; Forward propagation; Back propagation; The No Free Lunch theorem; The Curse of Dimensionality; More neurons versus more layers; Cost function; Gradient descent; Learning rate; Batches/Batch size; Epochs; Iterations; Dropout; Batch Normalization; CNN (Convolutional Neural Network); Pooling; Padding; Recurrent neuron; RNN (Recurrent Neural Network); Vanishing gradient problem; Exploding gradient problem; Logistic neurons; Hidden layers; Types of neural networks; Generalization; Regularization; Loss; Loss over time; Loss versus learning curve; Supervised learning; Bias-Variance Trade-off (overfitting and underfitting); Bias; Variance; Overfitting; Is your model overfitting or underfitting?; Prevention of overfitting and underfitting; Amount of training data; Input space dimensionality; Incorrect output values; Data heterogeneity; Unsupervised learning; Reinforcement learning; Manifold learning; Types of manifolds in deep learning (Topological, Differentiable, Riemannian); Principal Component Analysis (PCA); Hyperparameter training; Approaches to hyperparameter tuning (Grid search, Random search, Bayesian optimization, Gradient-based optimization, Evolutionary optimization); Summary; References
- 3. Deep Instrumentation Using ReflectInsight: Next-generation logging viewers (Message log, Message details, Message properties, Bookmarks, Call Stack, Message Navigation, Advanced Search, User-Defined Views and Filtering, Auto Save/Purge rolling log files, Watches, Time zone formatting); Router; Log viewer; Live viewer; SDK; Configuration editor (Overview, XML configuration, Dynamic configuration); Message type logging reference (Assertions, Assigned variables, Attachments, Audit failure and success, Checkmarks, Checkpoints, Collections, Comments, Currency, Data, DataSet, DataSetSchema, DataTable, DataTableSchema, DataView, Date/Time, Debug, Desktop Image, Errors, Exceptions, Fatal Errors, Generations, Images, Information, Levels, Linq queries and results, Loaded assemblies, Loaded processes, Memory status, Messages, Notes, Process Information, Reminders, Serialized Objects, SQL strings, Stack Traces, System Information, Text files, Thread Information, Typed collections, Warning, XML, XML files); Tracing method calls; Attaching message properties (to one request, to all requests, to a single message); Watches; Using custom data; Output; Summary
- 4. Kelp.Net Reference: Let us be honest; Downloading Kelp.Net; Building the source code; What is Kelp.Net?; N-dimensional arrays; Optimizers (AdaDelta, AdaGrad, Adam, GradientClipping, MomentumSGD, RMSprop, SGD); Poolings (MaxPooling, AveragePooling); FunctionStack; FunctionDictionary; SplitFunction; SortedList; SortedFunctionStack; Activation Functions (Activation plots, ArcSinH, ArcTan, ELU, Gaussian, LeakyReLU, LeakyReLUShifted, LogisticFunction, MaxMinusOne, PolynomialApproximantSteep, QuadraticSigmoid, RbfGaussian, ReLU, ReLuTanh, ScaledELU, Sigmoid, Sine, Softmax, Softplus, SReLU, SReLUShifted, Swish, Tanh); Connections (Convolution2D, Deconvolution2D, EmbedID, Linear, LSTM); Normalization (BatchNormalization, Local Response Normalization); Noise (Dropout, StochasticDepth); Loss (MeanSquaredError, SoftmaxCrossEntropy); Datasets (CIFAR-10, CIFAR-100, MNIST, Street View House Numbers (SVHN)); Summary; References
- 5. Model Testing and Training: Accuracy; Timing; Common stacks; Summary
- 6. Loading and Saving Models: Loading models; Saving models; Model size; Summary
- 7. Sample Deep Learning Tests (each with complete source code and output): A simple XOR problem; A penny for your thoughts; A simple XOR problem (part 2); Recurrent Neural Network Language Models (RNNLM) (with Vocabulary); Word prediction test; Decoupled Neural Interfaces using Synthetic Gradients; MNIST accuracy tester; Massively Deep Network Test; Image prediction test; Function benchmarking; MNIST (handwritten characters) learning test; LeakyReLU and PolynomialApproximantSteep Combination Network; FunctionStack navigation tests; Learning Rate Hyperparameter tester; Model scoring; Summary
- 8. Creating Your Own Deep Learning Tests: Example; Implementing the Run function; Create a FunctionStack with your functions; Set the optimizer; Make your predictions; Save the model; Loading models; Summary; Thank You
- Appendix A. Evaluation Metrics: Metrics terminology; Confusion matrix
- Appendix B. OpenCL: OpenCL hierarchy
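Chapter 8's workflow (create a FunctionStack, set the optimizer, train, predict, save) can be sketched roughly as below. This is a hypothetical illustration assuming the KelpNet-style API the book documents (FunctionStack, SetOptimizer, Trainer.Train, ModelIO); exact class names and method signatures may differ between Kelp.Net versions, so treat them as assumptions rather than a definitive listing.

```csharp
// Hypothetical sketch of a minimal Kelp.Net test following the
// chapter 8 outline; all names and signatures are assumptions.
using KelpNet;

public static class XorTest
{
    public static void Run()
    {
        // Training data for the XOR problem (chapter 7's first sample)
        Real[][] input = { new Real[] { 0, 0 }, new Real[] { 0, 1 },
                           new Real[] { 1, 0 }, new Real[] { 1, 1 } };
        Real[][] teach = { new Real[] { 0 }, new Real[] { 1 },
                           new Real[] { 1 }, new Real[] { 0 } };

        // 1. Create a FunctionStack with your functions
        FunctionStack nn = new FunctionStack(
            new Linear(2, 2, name: "l1"),
            new Sigmoid(name: "sig1"),
            new Linear(2, 1, name: "l2"));

        // 2. Set the optimizer
        nn.SetOptimizer(new MomentumSGD());

        // 3. Train, then make your predictions
        for (int epoch = 0; epoch < 1000; epoch++)
        {
            for (int i = 0; i < input.Length; i++)
            {
                Trainer.Train(nn, input[i], teach[i], new MeanSquaredError());
            }
        }
        NdArray result = nn.Predict(input[0])[0];

        // 4. Save the model (load it back later with ModelIO.Load)
        ModelIO.Save(nn, "xor.nn");
    }
}
```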