Reservoir Computing: Theory, Physical Implementations, and Applications
- Length: 477 pages
- Edition: 1
- Language: English
- Publisher: Springer
- Publication Date: 2021-08-06
- ISBN-10: 9811316864
- ISBN-13: 9789811316869
This is the first comprehensive book on reservoir computing (RC). RC is a powerful and broadly applicable computational framework based on recurrent neural networks. Its advantages include small training-data requirements, fast training, inherent memory, and high flexibility across hardware implementations. It originated in computational neuroscience and machine learning but has in recent years spread dramatically into a wide variety of fields, including complex systems science, physics, materials science, biological science, quantum machine learning, optical communication systems, and robotics. Reviewing the current state of the art and providing a concise guide to the field, this book introduces readers to RC's basic concepts, theory, techniques, physical implementations, and applications.
The book is structured into two major parts, theory and physical implementations, each a compilation of chapters authored by leading experts in their respective fields. The first part is devoted to theoretical developments of RC, extending the framework from its conventional recurrent-neural-network context to a more general dynamical-systems context. With this broadened perspective, RC is no longer restricted to machine learning but connects to a much wider class of systems. The second part focuses on the use of physical dynamical systems as reservoirs, a framework referred to as physical reservoir computing. A variety of physical systems and substrates have already been proposed and used to implement reservoir computing. These systems, which cover a wide range of spatial and temporal scales, include mechanical and optical systems, nanomaterials, spintronics, and quantum many-body systems.
This book offers a valuable resource for researchers (Ph.D. students and experts alike) and practitioners working in the fields of machine learning, artificial intelligence, robotics, neuromorphic computing, complex systems, and physics.
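The RC paradigm described above — a fixed recurrent network whose rich transient dynamics are tapped by a trained linear readout — can be sketched in a few lines of NumPy. The reservoir size, spectral radius, regularization strength, and sine-prediction task below are illustrative choices, not taken from the book:

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir: fixed random recurrent weights, rescaled so the spectral
# radius is below 1 (a common heuristic for the echo state property).
n_res, n_in = 100, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Illustrative task: one-step-ahead prediction of a sine wave.
t = np.arange(0, 60, 0.1)
u = np.sin(t)
X = run_reservoir(u[:-1])   # reservoir states
y = u[1:]                   # targets: next input value

# Training touches only the linear readout (ridge regression);
# the first 100 "washout" states are discarded as transients.
washout = 100
Xw, yw = X[washout:], y[washout:]
W_out = np.linalg.solve(Xw.T @ Xw + 1e-6 * np.eye(n_res), Xw.T @ yw)

pred = X @ W_out
mse = np.mean((pred[washout:] - y[washout:]) ** 2)
print(f"readout MSE: {mse:.2e}")
```

Because only the readout weights are trained, training reduces to a single linear solve — this is the source of the fast training and small-data requirements noted above, and the reason the recurrent part can be replaced by a physical dynamical system.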
Foreword
Preface
Contents

Part I Fundamental Aspects and New Developments in Reservoir Computing

- The Cerebral Cortex: A Delay-Coupled Recurrent Oscillator Network?
  1 Introduction; 2 Strategies for the Evaluation and Encoding of Relations; 3 Learning Mechanisms; 4 Association of Signals Lacking Temporal Structure; 5 The Generation of Temporally Structured Activity; 6 Synchrony as a Common Signature of Relatedness; 7 Critical Issues; 8 Computing in High-Dimensional State Space; 9 Information Processing in Natural Recurrent Networks, a Proposal; 10 Experimental Evidence; 11 Learning Mechanisms; 12 Concluding Remarks; References
- Cortico-Striatal Origins of Reservoir Computing, Mixed Selectivity, and Higher Cognitive Function
  1 Introduction; 1.1 Initial Motivation—Barone and Joseph; 1.2 Further Motivation—Temporal Structure; 1.3 Abstract Structure and Language; 1.4 Renaissance: Neural Dynamics, Mixed Selectivity, and Higher Cognitive Function; 2 Corticostriatal Foundations; 2.1 Barone and Joseph, and the Primate Behavioral Neurophysiology of Sequence Learning; 2.2 Requirements on the System—Mixture of Inhibition and Excitation, Modulated by Time; 2.3 Primate Cortical Neuroanatomy; 2.4 Primate Corticostriatal System—Plasticity; 3 Corticostriatal Reservoir Computing; 3.1 Model Implementation—Birth of the Reservoir; 3.2 Modeling Barone and Joseph (1989); 3.3 Complex Sequence Learning; 3.4 Temporal Structure; 3.5 Abstract Structure; 4 Higher Cognitive Function I—Language; 4.1 Language Acquisition and Grammatical Constructions; 4.2 The Reservoir Convergence; 4.3 Narrative; 5 Higher Cognitive Function II: Executive Function, Mixed Selectivity; 5.1 Revelation of Mixed Selectivity in Primate Cortex; 5.2 Explore Exploit; 5.3 Modeling Explore Exploit; 5.4 Adaptation in the Face of Diversity; 5.5 From Main Effects to Mixed Selectivity; 6 Cortico-Hippocampal Interaction for Synthesis of Novel Adaptive Experiences; 7 Discussion; 8 Conclusions; References
- Reservoirs Learn to Learn
  1 Introduction; 2 Optimizing Reservoirs to Learn; 3 Reservoirs Can Also Learn Without Changing Synaptic Weights to Readout Neurons; 4 Methods; 4.1 Leaky Integrate-and-Fire Neurons; 4.2 Backpropagation Through Time; 4.3 Optimizing Reservoirs to Learn; 4.4 Reservoirs Can Also Learn Without Changing Synaptic Weights to Readout Neurons; 5 Discussion; References
- Deep Reservoir Computing
  1 Introduction; 2 Deep Echo State Network; 2.1 Architecture; 2.2 Dynamics of Deep Reservoirs and Echo State Property; 3 Advances; 4 Other Hierarchical Reservoir Computing Models; 5 Conclusions; References
- On the Characteristics and Structures of Dynamical Systems Suitable for Reservoir Computing
  1 Introduction; 2 Mathematical Formulation of Reservoir Computing; 2.1 A Class of Dynamical System Usable for Reservoir Computing: Common-Signal-Induced Synchronization; 2.2 Concrete Example: Echo State Network Model; 2.3 Geometrical Interpretation of Reservoir Computing; 3 Characteristics of Dynamical Systems Suitable for Reservoir Computing; 3.1 Edge of Chaos; 3.2 Memory-Nonlinearity Trade-Off; 4 Dynamical Structure Suitable for Reservoir Computing; 5 Conclusions and Future Works; References
- Reservoir Computing for Forecasting Large Spatiotemporal Dynamical Systems
  1 Motivation; 2 Background: Prediction of `Small' Chaotic Systems; 3 Machine Learning and the Forecasting of Large, Complex, Spatiotemporally Chaotic Systems; 4 Distributed Parallel Prediction; 4.1 Partitioning the Spatial Grid; 4.2 Training; 4.3 Prediction; 4.4 Re-synchronization; 4.5 An Example: A Lorenz 96 Model; 4.6 Another Example: The Kuramoto–Sivashinsky Equation; 5 Hybrid Forecasting; 5.1 Training; 5.2 Prediction; 5.3 An Example: Kuramoto–Sivashinsky Equations; 6 Parallel/Hybrid Forecasting; 7 Conclusion; References

Part II Physical Implementations of Reservoir Computing

- Reservoir Computing in Material Substrates
  1 Introduction; 2 Computing with Physical Systems; 2.1 Unconventional Computing; 2.2 Configuring Physical Systems to Compute; 3 Reservoir Computing with Physical Systems; 3.1 Encoding and Representation in Reservoir Computers; 3.2 Abstraction/Representation (A/R) Theory; 3.3 Observing Reservoir States; 4 The Search for Reservoirs; 5 What Makes a Good Physical Reservoir?; 5.1 Framework Outline; 5.2 Characterising Substrate Quality; 5.3 Using Quality to Assess Substrate Design; 5.4 CHARC Conclusions; 6 Conclusion; References
- Physical Reservoir Computing in Robotics
  1 Introduction; 2 Theoretical Models; 2.1 Feedforward Setup; 2.2 Feedback Setup; 2.3 Connecting Theoretical Models to the Real World; 3 Example Cases from Robotics; 4 Advantages of Physical Reservoir Computing in Robotics; 5 Limitations of Physical Reservoir Computing in Robotics; 6 Connection to Soft Robotics; 7 The Future of Physical Reservoir Computing in Robotics; References
- Reservoir Computing in MEMS
  1 Introduction; 2 Microelectromechanical Systems; 2.1 MEMS Fabrication; 2.2 Sensing and Driving Methods; 2.3 MEMS Dynamics and Nonlinearity; 3 Driven Oscillators with Duffing Nonlinearities; 3.1 Duffing Oscillator; 3.2 Clamped–Clamped Beams; 4 Reservoir Computing in a MEMS; 4.1 The MEMS Nonlinear Node; 4.2 Training with Delayed Feedback; 4.3 Performance Metrics; 4.4 Hyperparameter Optimization; 5 Conclusion; References

Part IV Physical Implementations: Neuromorphic Devices and Nanotechnology

- Neuromorphic Electronic Systems for Reservoir Computing
  1 Introduction; 2 RC on Digital Neuromorphic Processors; 3 RC on Analog Neuromorphic Microchips; 4 RC on Mixed Digital/Analog Neuromorphic Systems; 5 Conclusion; References
- Reservoir Computing Using Autonomous Boolean Networks Realized on Field-Programmable Gate Arrays
  1 Introduction; 2 Reservoir Computers Based on Autonomous Boolean Networks; 3 Dynamics of Random Autonomous Boolean Networks; 3.1 Dynamics of Clocked Random Boolean Networks; 3.2 Dynamics of ABNs on an FPGA; 4 Reservoir Computing with ABNs on an FPGA; 4.1 The MNIST Classification Task; 4.2 Dynamics of ABNs with Data Injection: The Consistency Window; 4.3 Realizing an RC for the MNIST Classification Task; 5 Discussion and Conclusions; References
- Programmable Fading Memory in Atomic Switch Systems for Error Checking Applications
  1 Introduction; 1.1 The Atomic Switch; 1.2 Neuromorphic Atomic Switch Networks; 2 Theoretical Constraints and Consideration; 2.1 Nonlinear Circuits; 2.2 Characterization: Power-Law Dynamics; 2.3 Atomic Switch Plasticity; 2.4 Resistance Training; 2.5 Simulation of Atomic Switch Network; 2.6 Implementation: Error Checking; 2.7 Simulated ASN Error Checking Results; 2.8 Neuromorphic ASN Device Error Checking Results; 3 Outlook; 4 Methods; 4.1 Design Optimization Results; 4.2 Hardware and Instrumentation; 4.3 Reservoir Computing Implementation; References

Part V Physical Implementations: Spintronics Reservoir Computing

- Reservoir Computing Leveraging the Transient Non-linear Dynamics of Spin-Torque Nano-Oscillators
  1 Context; 2 Hardware Implementation; 2.1 Measurement Set-Up; 2.2 Physical Properties of the Oscillator Used for Computation; 3 Results on Classification Tasks; 3.1 Results on Sine/Square Recognition Task; 3.2 Results on Spoken-Digit Recognition; 3.3 Conclusion; 4 Optimizing the Experimental Parameters and Data Processing for Improved Classification; 4.1 Input Sampling Rate and Amplitude; 4.2 Magnetic Field and DC Current Dependence; 4.3 Conclusion; 5 General Conclusion; References
- Reservoir Computing Based on Spintronics Technology
  1 Introduction; 1.1 Recurrent Neural Network and Reservoir Computing; 1.2 History of Spintronics and Key Technologies; 1.3 Brain-Inspired Computing Based on Spintronics; 1.4 Spin-Torque Oscillator (STO); 2 Methods; 2.1 Landau–Lifshitz–Gilbert Equation; 2.2 Micro- and Macro-Structures of Nanomagnet; 2.3 Recurrent Neural Network Based on STO; 2.4 Methods for Evaluating Memory Capacities; 3 Reservoir Computing with STO (Experiment); 3.1 Experimental Methods; 3.2 Short-Term Memory Capacity in Single STO; 3.3 Contributions from Other Circuit Components; 3.4 Future Directions; 4 Reservoir Computing with STO (Simulation); 4.1 Simulation Methods; 4.2 Short-Term Memory and Parity Check Capacities in Single STO; 4.3 Short-Term Memory and Parity Check Capacities in Multiple STOs; 4.4 Comparison with Echo-State Network; 5 Conclusion; References
- Reservoir Computing with Dipole-Coupled Nanomagnets
  1 Spin-Glass Model and Spin-Glass Reservoir Computing; 2 Historical Design of the Magnetic Boolean Calculator; 3 Linear and Nonlinear Calculation Using Nanomagnets; 4 First Trial of the Dipole-Coupled Nanomagnet Reservoir; 5 Guiding Principle for the Future; References

Part VI Physical Implementations: Photonic Reservoir Computing

- Performance Improvement of Delay-Based Photonic Reservoir Computing
  1 Introduction; 2 Performance Improvement of Reservoir Computing Using an Input Chaos Mask Signal; 2.1 Scheme; 2.2 Numerical Model; 2.3 Results of Chaos Mask Signal; 2.4 Comparison of Digital and Analog Mask Signals; 3 Miniaturization of Reservoir Computing with a Photonic Integrated Circuit; 3.1 Scheme; 3.2 Experimental Results of Time-Series Prediction Task; 3.3 N-Step-Ahead Prediction Task; 4 Mutually Coupled Electro-Optic System; 4.1 RC Based on Electro-Optic Feedback System; 4.2 Scheme for Mutually Coupled Electro-Optic System; 4.3 Results; 5 Discussions; 6 Conclusions; References
- Computing with Integrated Photonic Reservoirs
  1 Introduction; 2 From Ideas to First Prototypes; 2.1 Coherent Light and Planar Topologies; 2.2 Readout and Training; 2.3 Delays, Non-linearity and Power; 2.4 Next Generation Reservoir Architectures; 3 Training Reservoirs with Integrated Optical Readouts; 3.1 Motivation; 3.2 Limited Observability; 3.3 Non-linearities and Complex-Valued Regression; 3.4 Limited Precision; 4 From Modules to Systems; 4.1 The Future of Single Reservoir Modules; 4.2 Making Ensembles of Reservoir Modules; 5 Conclusion and Perspectives; References

Part VII Physical Implementations: Quantum Reservoir Computing

- Quantum Reservoir Computing: A Reservoir Approach Toward Quantum Machine Learning on Near-Term Quantum Devices
  1 Introduction; 2 Pedagogical Introduction to Quantum Mechanics; 2.1 Quantum State; 2.2 Time Evolution; 2.3 Qubits; 2.4 Density Operator; 2.5 Vector Representation of Density Operators; 3 Machine Learning and Reservoir Approach; 3.1 Linear and Nonlinear Regression; 3.2 Temporal Task; 3.3 Reservoir Approach; 4 Quantum Machine Learning on Near-Term Quantum Devices; 4.1 Quantum Extreme Learning Machine; 4.2 Quantum Circuit Learning; 4.3 Quantum Reservoir Computing; 4.4 Emulating Chaotic Attractors Using Quantum Dynamics; 5 Conclusion and Discussion; References
- Toward NMR Quantum Reservoir Computing
  References