Big Data Analytics: Cluster Analysis And Pattern Recognition. Examples With Matlab
- Length: 574 pages
- Edition: 1
- Language: English
- Publisher: lulu.com
- Publication Date: 2020-06-01
- ASIN: B08VND2L8Z
- ISBN-13: 9781716876868
- Sales Rank: #4373310
Big data analytics examines large volumes of data to uncover hidden patterns, correlations, and other insights. MATLAB includes the Neural Network Toolbox (renamed Deep Learning Toolbox in later releases), which provides algorithms, functions, and apps to create, train, visualize, and simulate neural networks. With it you can perform classification, regression, clustering, dimensionality reduction, time-series forecasting, and dynamic system modeling and control. The toolbox includes convolutional neural network and autoencoder deep learning algorithms for image classification and feature learning tasks. To speed up training on large data sets, you can distribute computations and data across multicore processors, GPUs, and computer clusters using big data tools (Parallel Computing Toolbox). The toolbox also offers unsupervised learning algorithms, including self-organizing maps and competitive layers; apps for data fitting, pattern recognition, and clustering; and preprocessing, postprocessing, and network visualization tools for improving training efficiency and assessing network performance. This book develops cluster analysis and pattern recognition with these tools, illustrated throughout with MATLAB examples.
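As a taste of the kind of workflow the book covers, here is a minimal self-organizing-map clustering sketch. It is an illustration only, assuming the Neural Network / Deep Learning Toolbox is installed; `simplecluster_dataset` is one of the sample data sets shipped with the toolbox, and the 8-by-8 map size is an arbitrary choice.

```matlab
% Cluster 2-D sample points with a self-organizing map (SOM)
x = simplecluster_dataset;      % 2-by-1000 matrix of sample points
net = selforgmap([8 8]);        % SOM with an 8-by-8 grid of neurons
net = train(net, x);            % train with the batch SOM algorithm
y = net(x);                     % one-hot output: winning neuron per sample
classes = vec2ind(y);           % cluster index assigned to each sample
plotsomhits(net, x)             % how many samples map to each neuron
```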
Contents

1. BIG DATA AND ANALYTICS. NEURAL NETWORKS TOOLS
   1.1 INTRODUCTION TO BIG DATA AND ANALYTICS
   1.2 MATLAB AND BIG DATA
       1.2.1 Access Data
       1.2.2 Explore, Process, and Analyze Data
       1.2.3 Develop Predictive Models
   1.3 NEURAL NETWORKS WITH MATLAB
2. CLUSTER ANALYSIS WITH NEURAL NETWORKS
   2.1 INTRODUCTION
   2.2 Using the Neural Network Clustering Tool
   2.3 Using Command-Line Functions
3. CLUSTER ANALYSIS WITH NEURAL NETWORKS. Self-Organizing Map
   3.7.1 One-Dimensional Self-Organizing Map
   3.7.2 Two-Dimensional Self-Organizing Map
   3.7.3 Training with the Batch Algorithm
4. CLUSTER ANALYSIS WITH NEURAL NETWORKS. Self-Organizing Maps FUNCTIONS
   4.1 FUNCTIONS
   4.2 nnstart
   4.3 view
   4.4 selforgmap
   4.5 train
   4.6 plotsomhits
   4.7 plotsomnc
   4.8 plotsomnd
   4.9 plotsomplanes
   4.10 plotsompos
   4.11 plotsomtop
   4.12 genFunction
5. COMPETITIVE NEURAL NETWORKS
   5.1 Create a Competitive Neural Network
       5.1.1 Kohonen Learning Rule (learnk)
       5.1.2 Bias Learning Rule (learncon)
       5.1.3 Training
       5.1.4 Graphical Example
   5.2 Competitive Layers
       5.2.1 Functions
   5.3 nnstart
   5.4 view
   5.5 selforgmap
   5.6 train
   5.7 plotsomhits
   5.8 plotsomnc
   5.9 plotsomnd
   5.10 plotsomplanes
   5.11 plotsompos
   5.12 plotsomtop
   5.13 genFunction
6. Competitive Layer FUNCTIONS
   6.1 FUNCTIONS
   6.2 competlayer
   6.3 view
   6.4 trainru
   6.5 learnk
   6.6 learncon
7. BIG DATA CLASSIFICATION. Classify Patterns with a Neural Network
   7.1 INTRODUCTION
   7.2 Using the Neural Network Pattern Recognition Tool
   7.3 Using Command-Line Functions
8. BIG DATA CLASSIFICATION. WORKFLOW FOR NEURAL NETWORK DESIGN
   8.1 INTRODUCTION
   8.2 Four Levels of Neural Network Design
   8.3 Multilayer Neural Networks and Backpropagation Training
   8.4 Multilayer Neural Network Architecture
       8.4.1 Neuron Model (logsig, tansig, purelin)
       8.4.2 Feedforward Neural Network
   8.5 Understanding Neural Network Toolbox Data Structures
       8.5.1 Simulation with Concurrent Inputs in a Static Network
       8.5.2 Simulation with Sequential Inputs in a Dynamic Network
       8.5.3 Simulation with Concurrent Inputs in a Dynamic Network
   8.6 Neural Network Object Properties
       8.6.1 General
       8.6.2 Architecture
       8.6.3 Subobject Structures
       8.6.4 Functions
       8.6.5 Weight and Bias Values
   8.7 Neural Network Subobject Properties
       8.7.1 Inputs
       8.7.2 Layers
       8.7.3 Outputs
       8.7.4 Biases
       8.7.5 Input Weights
       8.7.6 Layer Weights
9. BIG DATA TOOLS. FUNCTIONS FOR PATTERN RECOGNITION AND CLASSIFICATION
   9.1 INTRODUCTION
   9.2 view NEURAL NETWORK
   9.3 Pattern Recognition and Learning Vector Quantization
       9.3.1 Pattern recognition network: patternnet
       9.3.2 Learning vector quantization neural network: lvqnet
   9.4 Training Options and Network Performance
       9.4.1 Receiver operating characteristic: roc
       9.4.2 Plot receiver operating characteristic: plotroc
       9.4.3 Plot classification confusion matrix: plotconfusion
       9.4.4 Neural network performance: crossentropy
       9.4.5 Construct and Train a Function Fitting Network
       9.4.6 Create and Train a Feedforward Neural Network
       9.4.7 Create and Train a Cascade Network
   9.5 Network Performance
       9.5.1 Description
       9.5.2 Examples
   9.6 Fit Regression Model and Plot Fitted Values versus Targets
       9.6.1 Description
       9.6.2 Examples
   9.7 Plot Output and Target Values
       9.7.1 Description
       9.7.2 Examples
   9.8 Plot Training State Values
   9.9 Plot Performances
   9.10 Plot Histogram of Error Values
       9.10.1 Syntax
       9.10.2 Description
       9.10.3 Examples
   9.11 Generate MATLAB Function for Simulating Neural Network
       9.11.1 Create Functions from Static Neural Network
       9.11.2 Create Functions from Dynamic Neural Network
   9.12 A COMPLETE EXAMPLE: House Price Estimation
       9.12.1 The Problem: Estimate House Values
       9.12.2 Why Neural Networks?
       9.12.3 Preparing the Data
       9.12.4 Fitting a Function with a Neural Network
       9.12.5 Testing the Neural Network
   9.13 Autoencoder Class
       9.13.1 trainAutoencoder
       9.13.2 Construct Deep Network Using Autoencoders
       9.13.3 decode
       9.13.4 encode
       9.13.5 predict
       9.13.6 stack
10. BIG DATA TOOLS. MULTILAYER Neural Network
   10.1 Create, Configure, and Initialize Multilayer Neural Networks
       10.1.1 Other Related Architectures
   10.2 FUNCTIONS TO Create, Configure, and Initialize Multilayer Neural Networks
       10.2.1 Initializing Weights (init)
       10.2.2 feedforwardnet
              Syntax: feedforwardnet(hiddenSizes, trainFcn)
              Description: Feedforward networks consist of a series of layers. The first layer has a connection from the network input. Each subsequent layer has a connection from the previous layer. The final layer produces the network's output.
              Examples: Feedforward Neural Network
       10.2.3 configure
       10.2.4 init (To Get Help; Algorithms)
       10.2.5 train
       10.2.6 trainlm
       10.2.7 tansig
       10.2.8 purelin
       10.2.9 cascadeforwardnet
       10.2.10 patternnet
   10.3 Train and Apply Multilayer Neural Networks
       10.3.1 Training Algorithms
       10.3.2 Training Example
       10.3.3 Use the Network
   10.4 TRAINING ALGORITHMS IN Multilayer Neural Networks
       10.4.1 trainbr: Bayesian Regularization
       10.4.2 trainscg: Scaled conjugate gradient backpropagation
       10.4.3 trainrp: Resilient backpropagation
       10.4.4 trainbfg: BFGS quasi-Newton backpropagation
       10.4.5 traincgb: Conjugate gradient backpropagation with Powell-Beale restarts
       10.4.6 traincgf: Conjugate gradient backpropagation with Fletcher-Reeves updates
       10.4.7 traincgp: Conjugate gradient backpropagation with Polak-Ribière updates
       10.4.8 trainoss: One-step secant backpropagation
       10.4.9 traingdx: Gradient descent with momentum and adaptive learning rate backpropagation
       10.4.10 traingdm: Gradient descent with momentum backpropagation
       10.4.11 traingd: Gradient descent backpropagation
11. ANALYZE AND DEPLOY TRAINED NEURAL NETWORKS
   11.1 ANALYZE NEURAL NETWORK PERFORMANCE
   11.2 Improving Results
   11.3 Deployment Functions and Tools for Trained Networks
   11.4 Generate Neural Network Functions for Application Deployment
   11.5 Deploy Neural Network Simulink Diagrams
       11.5.1 Example
       11.5.2 Suggested Exercises
   11.6 Deploy Training of Neural Networks
12. BIG DATA PARALLEL COMPUTING. TRAINING SCALABILITY AND EFFICIENCY
   12.1 Neural Networks with Parallel and GPU Computing
       12.1.1 Modes of Parallelism
       12.1.2 Distributed Computing
       12.1.3 Single GPU Computing
       12.1.4 Distributed GPU Computing
       12.1.5 Deep Learning
       12.1.6 Parallel Time Series
       12.1.7 Parallel Availability, Fallbacks, and Feedback
   12.2 Automatically Save Checkpoints During Neural Network Training
   12.3 Optimize Neural Network Training Speed and Memory
       12.3.1 Memory Reduction
       12.3.2 Fast Elliot Sigmoid
13. OPTIMAL SOLUTIONS
   13.1 Representing Unknown or Don't-Care Targets
       13.1.1 Choose Neural Network Input-Output Processing Functions
       13.1.2 Representing Unknown or Don't-Care Targets
   13.2 Configure Neural Network Inputs and Outputs
   13.3 Divide Data for Optimal Neural Network Training
   13.4 Choose a Multilayer Neural Network Training Function
       13.4.1 SIN Data Set
       13.4.2 PARITY Data Set
       13.4.3 ENGINE Data Set
       13.4.4 CANCER Data Set
       13.4.5 CHOLESTEROL Data Set
       13.4.6 DIABETES Data Set
       13.4.7 Summary
   13.5 Improve Neural Network Generalization and Avoid Overfitting
       13.5.1 Retraining Neural Networks
       13.5.2 Multiple Neural Networks
       13.5.3 Early Stopping
       13.5.4 Index Data Division (divideind)
       13.5.5 Random Data Division (dividerand)
       13.5.6 Block Data Division (divideblock)
       13.5.7 Interleaved Data Division (divideint)
       13.5.8 Regularization
       13.5.9 Modified Performance Function
       13.5.10 Automated Regularization (trainbr)
       13.5.11 Summary and Discussion of Early Stopping and Regularization
       13.5.12 Posttraining Analysis (regression)
   13.6 Train Neural Networks with Error Weights
   13.7 Normalize Errors of Multiple Outputs
14. CLASSIFICATION WITH NEURAL NETWORKS. EXAMPLES
   14.1 Crab Classification
       14.1.1 Why Neural Networks?
       14.1.2 Preparing the Data
       14.1.3 Building the Neural Network Classifier
       14.1.4 Testing the Classifier
   14.2 Wine Classification
       14.2.1 The Problem: Classify Wines
       14.2.2 Why Neural Networks?
       14.2.3 Preparing the Data
       14.2.4 Pattern Recognition with a Neural Network
       14.2.5 Testing the Neural Network
   14.3 Cancer Detection
       14.3.1 Formatting the Data
       14.3.2 Ranking Key Features
       14.3.3 Classification Using a Feedforward Neural Network
   14.4 Character Recognition
       14.4.1 Creating the First Neural Network
       14.4.2 Training the First Neural Network
       14.4.3 Training the Second Neural Network
       14.4.4 Testing Both Neural Networks
15. AUTOENCODERS AND CLUSTERING WITH NEURAL NETWORKS. EXAMPLES
   15.1 Train Stacked Autoencoders for Image Classification
       15.1.1 Data Set
       15.1.2 Training the First Autoencoder
       15.1.3 Visualizing the Weights of the First Autoencoder
       15.1.4 Training the Second Autoencoder
       15.1.5 Training the Final Softmax Layer
       15.1.6 Forming a Stacked Neural Network
       15.1.7 Fine-Tuning the Deep Neural Network
       15.1.8 Summary
   15.2 Transfer Learning Using Convolutional Neural Networks
   15.3 Iris Clustering
       15.3.1 Why Self-Organizing Map Neural Networks?
       15.3.2 Preparing the Data
       15.3.3 Clustering with a Neural Network
   15.4 Gene Expression Analysis
       15.4.1 The Problem: Analyzing Gene Expressions in Baker's Yeast (Saccharomyces cerevisiae)
       15.4.2 The Data
       15.4.3 Filtering the Genes
       15.4.4 Principal Component Analysis
       15.4.5 Cluster Analysis: Self-Organizing Maps
16. SELF-ORGANIZING NETWORKS. EXAMPLES
   16.1 Competitive Learning
   16.2 One-Dimensional Self-Organizing Map
   16.3 Two-Dimensional Self-Organizing Map
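In the same spirit, a minimal pattern-recognition (classification) sketch of the kind developed in the classification chapters. Again this is an illustration only, assuming the toolbox is installed; `iris_dataset` is a sample data set shipped with the toolbox, and the hidden-layer size of 10 is an arbitrary choice.

```matlab
% Classify iris flowers with a pattern recognition network
[x, t] = iris_dataset;          % 4-by-150 inputs, 3-by-150 one-hot targets
net = patternnet(10);           % one hidden layer with 10 neurons
net = train(net, x, t);         % scaled conjugate gradient training by default
y = net(x);                     % class scores for each sample
plotconfusion(t, y)             % confusion matrix: targets vs. outputs
perf = crossentropy(net, t, y)  % cross-entropy performance measure
```

For larger data sets, the same `train` call can distribute work across a parallel pool, for example `train(net, x, t, 'useParallel', 'yes')`, provided Parallel Computing Toolbox is available.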