[2020-2024, continuously updated] Echo State Network and Reservoir Computing Paper Collection: Classic ESN, DeepESN, Combined ESN, and Surveys!

Summary: This post collects research from 2020 to 2024 on Echo State Networks (ESN) and reservoir computing, organized into surveys, model categories (classic ESN, DeepESN, combined ESN), papers with open-source code, reservoir computing research, and applications of ESN across domains.

Keywords: ESN, Echo State Network, Reservoir Computing
Last updated: 2024

Contents

  • 1 Surveys
  • 2 ESN Model Categories
    • 2.1 ESN
    • 2.2 DeepESN
    • 2.3 Combined ESN
  • 3 Open-Source Papers
  • 4 Reservoir Computing Research
  • 5 Applications

1 Surveys

  1. Gallicchio, Claudio and Alessio Micheli. “Deep Echo State Network (DeepESN): A Brief Survey.” ArXiv abs/1712.04323 (2017): n. pag.
  2. Sun, Chenxi et al. “A Systematic Review of Echo State Networks From Design to Application.” IEEE Transactions on Artificial Intelligence 5 (2024): 23-37.
  3. Soltani, Rebh et al. “Echo State Network Optimization: A Systematic Literature Review.” Neural Processing Letters 55 (2023): 10251-10285.
  4. Xu Y. A review of machine learning with echo state networks[J]. Proj. Rep, 2020.
  5. Margin D A, Dobrota V. Overview of Echo State Networks using Different Reservoirs and Activation Functions[C]//2021 20th RoEduNet Conference: Networking in Education and Research (RoEduNet). IEEE, 2021: 1-6.
  6. Sun, Chenxi et al. “A Review of Designs and Applications of Echo State Networks.” ArXiv abs/2012.02974 (2020): n. pag.

2 ESN Model Categories

2.1 ESN

A typical ESN consists of an input layer, a recurrent layer (the reservoir, a large pool of sparsely connected neurons), and an output layer. This section collects papers on the classic ESN and on structural improvements to it.
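As a concrete illustration of this structure, here is a minimal NumPy sketch of a classic ESN. All sizes, the sine task, and every hyperparameter are illustrative assumptions, not taken from any paper below: a sparse random reservoir rescaled to a target spectral radius, states driven by the input, and a ridge-regression readout.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy task (assumed for illustration): one-step-ahead prediction of a sine wave.
T = 500
u = np.sin(0.2 * np.arange(T + 1))           # signal
X_in, y = u[:-1], u[1:]                      # input, next-step target

n_res, sparsity, rho = 100, 0.1, 0.9         # reservoir size, density, spectral radius

# Fixed random input weights and a sparse reservoir matrix, rescaled so its
# largest absolute eigenvalue equals rho (a common heuristic for the echo
# state property).
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= rng.random((n_res, n_res)) < sparsity
W *= rho / max(abs(np.linalg.eigvals(W)))

# Drive the reservoir and collect its states.
states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W_in * X_in[t] + W @ x)
    states[t] = x

# Train only the linear readout by ridge regression, after a washout period.
washout, ridge = 100, 1e-6
S, Y = states[washout:], y[washout:]
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ Y)

pred = states @ W_out
print("train MSE:", np.mean((pred[washout:] - y[washout:]) ** 2))
```

Only `W_out` is learned; the reservoir stays fixed, which is what keeps ESN training cheap relative to backpropagation through time.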

  1. Manneschi, Luca et al. “Exploiting Multiple Timescales in Hierarchical Echo State Networks.” Frontiers in Applied Mathematics and Statistics (2021).
  2. Fourati R, Ammar B, Jin Y, et al. EEG feature learning with intrinsic plasticity based deep echo state network[C]//2020 international joint conference on neural networks (IJCNN). IEEE, 2020: 1-8.
  3. Liu, Qianwen et al. “Memory augmented echo state network for time series prediction.” Neural Computing and Applications (2023): 1-16.
  4. Akrami, Abbas et al. “Design of a reservoir for cloud-enabled echo state network with high clustering coefficient.” EURASIP Journal on Wireless Communications and Networking 2020 (2020): 1-14.
  5. Arroyo, Diana Carolina Roca. “A Modified Echo State Network Model Using Non-Random Topology.” (2023).
  6. Fu, Jun et al. “A double-cycle echo state network topology for time series prediction.” Chaos 33 9 (2023): n. pag.
  7. Yang, Cuili and Zhanhong Wu. “Multi-objective sparse echo state network.” Neural Computing and Applications 35 (2022): 2867-2882.
  8. Tortorella, Domenico et al. “Spectral Bounds for Graph Echo State Network Stability.” 2022 International Joint Conference on Neural Networks (IJCNN) (2022): 1-8.
  9. Zheng, Shoujing et al. “Improved Echo State Network With Multiple Activation Functions.” 2022 China Automation Congress (CAC) (2022): 346-350.
  10. Morra, Jacob and Mark Daley. “Imposing Connectome-Derived Topology on an Echo State Network.” 2022 International Joint Conference on Neural Networks (IJCNN) (2022): 1-6.
  11. McDaniel, Shane et al. “Investigating Echo State Network Performance with Biologically-Inspired Hierarchical Network Structure.” 2022 International Joint Conference on Neural Networks (IJCNN) (2022): 1-8.
  12. Yao, Xianshuang et al. “A stability criterion for discrete-time fractional-order echo state network and its application.” Soft Computing 25 (2021): 4823-4831.
  13. Mu, Xiaohui and Lixiang Li. “Memristor-based Echo State Network and Prediction for Time Series.” 2021 International Conference on Neuromorphic Computing (ICNC) (2021): 153-158.
  14. Mahmoud, Tarek A. and Lamiaa M. Elshenawy. “TSK fuzzy echo state neural network: a hybrid structure for black-box nonlinear systems identification.” Neural Computing and Applications 34 (2022): 7033-7051.
  15. Maksymov, Ivan S. et al. “Neural Echo State Network using oscillations of gas bubbles in water: Computational validation by Mackey-Glass time series forecasting.” Physical Review E 105 4-1 (2021): 044206.
  16. Wang, Lei et al. “Design of sparse Bayesian echo state network for time series prediction.” Neural Computing and Applications 33 (2020): 7089-7102.
  17. Gong, Shangfu et al. “An Improved Small-World Topology for Optimizing the Performance of Echo State Network.” 2020 IEEE Intl Conf on Parallel & Distributed Processing with Applications, Big Data & Cloud Computing, Sustainable Computing & Communications, Social Computing & Networking (ISPA/BDCloud/SocialCom/SustainCom) (2020): 1413-1419.
  18. Iacob, Stefan et al. “Delay-Sensitive Local Plasticity in Echo State Networks.” 2023 International Joint Conference on Neural Networks (IJCNN) (2023): 1-8.
  19. Jordanou, Jean P. et al. “Investigation of Proper Orthogonal Decomposition for Echo State Networks.” Neurocomputing 548 (2022): 126395.
  20. Paassen, Benjamin et al. “Tree Echo State Autoencoders with Grammars.” 2020 International Joint Conference on Neural Networks (IJCNN) (2020): 1-8. (code available)
  21. Liu, Junxiu, et al. “Echo state network optimization using binary grey wolf algorithm.” Neurocomputing 385 (2020): 310-318.
  22. Trouvain, Nathan, et al. “Reservoirpy: an efficient and user-friendly library to design echo state networks.” International Conference on Artificial Neural Networks. Cham: Springer International Publishing, 2020. (code available)
  23. Hart, Allen, James Hook, and Jonathan Dawes. “Embedding and approximation theorems for echo state networks.” Neural Networks 128 (2020): 234-247.
  24. Na, Xiaodong, Weijie Ren, and Xinghan Xu. “Hierarchical delay-memory echo state network: A model designed for multi-step chaotic time series prediction.” Engineering Applications of Artificial Intelligence 102 (2021): 104229.

2.2 DeepESN

Deep Echo State Network
A DeepESN stacks multiple ESN reservoirs in a deep-learning fashion: it consists of an input layer, a hierarchy of stacked reservoir components whose dynamics feed into one another, and an output layer.
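The layer-wise composition can be sketched as follows. This is a toy NumPy illustration with assumed sizes (three layers of 50 units each, a sine input): layer 0 receives the external input, each deeper reservoir is driven by the state of the layer below, and a ridge readout sees the concatenated states of all layers.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, rho=0.9, density=0.1):
    """Random input weights and a sparse reservoir matrix rescaled to spectral radius rho."""
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res)) * (rng.random((n_res, n_res)) < density)
    W *= rho / max(abs(np.linalg.eigvals(W)))
    return W_in, W

T, n_res, n_layers = 300, 50, 3
u = np.sin(0.3 * np.arange(T))               # assumed toy input signal

# Layer 0 is driven by the external input; deeper layers by the layer below.
layers = [make_reservoir(1 if i == 0 else n_res, n_res) for i in range(n_layers)]

states = [np.zeros(n_res) for _ in range(n_layers)]
history = np.zeros((T, n_layers * n_res))
for t in range(T):
    drive = np.array([u[t]])
    for i, (W_in, W) in enumerate(layers):
        states[i] = np.tanh(W_in @ drive + W @ states[i])
        drive = states[i]                     # feed this layer's state upward
    history[t] = np.concatenate(states)       # readout features: all layers

# One-step-ahead ridge readout over the concatenated deep states.
washout, ridge = 50, 1e-6
S, Y = history[washout:-1], u[washout + 1:]
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_layers * n_res), S.T @ Y)
print("train MSE:", np.mean((S @ W_out - Y) ** 2))
```

Because each layer filters the dynamics of the one below, deeper layers tend to develop slower, more abstract timescales, which is the main motivation several of the papers below investigate.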

  1. Bouazizi, Samar et al. “Enhancing EEG-based emotion recognition using PSD-Grouped Deep Echo State Network.” JUCS - Journal of Universal Computer Science (2023): n. pag.
  2. Margin, Dan-Andrei et al. “Deep Reservoir Computing using Echo State Networks and Liquid State Machine.” 2022 IEEE International Black Sea Conference on Communications and Networking (BlackSeaCom) (2022): 208-213.
  3. Wang, Yuanhui et al. “A Weight Optimization Method of Deep Echo State Network Based on Improved Knowledge Evolution.” 2022 China Automation Congress (CAC) (2022): 395-400.
  4. Yang, Xiaojian et al. “An improved deep echo state network inspired by tissue-like P system forecasting for non-stationary time series.” Journal of Membrane Computing 4 (2022): 222 - 231.
  5. Kanda, Keiko and Sou Nobukawa. “Feature Extraction Mechanism for Each Layer of Deep Echo State Network.” 2022 International Conference on Emerging Techniques in Computational Intelligence (ICETCI) (2022): 65-70.
  6. Kim, Taehwan and Brian R. King. “Time series prediction using deep echo state networks.” Neural Computing and Applications (2020): 1-19.
  7. Hu, Ruihan et al. “Ensemble echo network with deep architecture for time-series modeling.” Neural Computing and Applications 33 (2020): 4997 - 5010.
  8. Ma, Qianli, Lifeng Shen, and Garrison W. Cottrell. “DeePr-ESN: A deep projection-encoding echo-state network.” Information Sciences 511 (2020): 152-171.
  9. Song, Zuohua, Keyu Wu, and Jie Shao. “Destination prediction using deep echo state network.” Neurocomputing 406 (2020): 343-353.
  10. Barredo Arrieta, Alejandro, et al. “On the post-hoc explainability of deep echo state networks for time series forecasting, image and video classification.” Neural Computing and Applications (2022): 1-21.(有源码)

2.3 Combined ESN

Papers that combine ESN with deep learning, other machine-learning models, or special data structures.

  1. Lien, Justin. “Hypergraph Echo State Network.” ArXiv abs/2310.10177 (2023): n. pag.
  2. Deng, Lichi and Yuewei Pan. “Machine Learning Assisted Closed-Loop Reservoir Management using Echo State Network.” (2020).
  3. Trierweiler Ribeiro, Gabriel, et al. “Bayesian optimized echo state network applied to short-term load forecasting.” Energies 13.9 (2020): 2390.

3 Open-Source Papers

Research on ESN and reservoir computing with publicly available code; no restriction on publication year.

  1. Cerina L, Santambrogio M D, Franco G, et al. EchoBay: design and optimization of echo state networks under memory and time constraints[J]. ACM Transactions on Architecture and Code Optimization (TACO), 2020, 17(3): 1-24.
  2. Lukoševičius, Mantas and Arnas Uselis. “Efficient Implementations of Echo State Network Cross-Validation.” Cognitive Computation 15 (2020): 1470-1484.
  3. Sun C, Hong S, Song M, et al. TE-ESN: Time encoding echo state network for prediction based on irregularly sampled time series data[J]. arXiv preprint arXiv:2105.00412, 2021.
  4. Özdemir A, Scerri M, Barron A B, et al. EchoVPR: Echo state networks for visual place recognition[J]. IEEE Robotics and Automation Letters, 2022, 7(2): 4520-4527.
  5. Li Z, Liu Y, Tanaka G. Multi-Reservoir Echo State Networks with Hodrick–Prescott Filter for nonlinear time-series prediction[J]. Applied Soft Computing, 2023, 135: 110021.
  6. Barredo Arrieta A, Gil-Lopez S, Laña I, et al. On the post-hoc explainability of deep echo state networks for time series forecasting, image and video classification[J]. Neural Computing and Applications, 2022: 1-21.
  7. Robust optimization and validation of echo state networks for learning chaotic dynamics.
  8. Gallicchio, Claudio and Alessio Micheli. “Deep Echo State Network (DeepESN): A Brief Survey.” ArXiv abs/1712.04323 (2017): n. pag.
  9. Steiner, Peter, Azarakhsh Jalalvand, and Peter Birkholz. “Cluster-based input weight initialization for echo state networks.” IEEE Transactions on Neural Networks and Learning Systems (2022).
  10. Bianchi, Filippo Maria et al. “Bidirectional deep-readout echo state networks.” The European Symposium on Artificial Neural Networks (2017).
  11. Maat, Jacob Reinier et al. “Efficient Optimization of Echo State Networks for Time Series Datasets.” 2018 International Joint Conference on Neural Networks (IJCNN) (2018): 1-7.
  12. Heim, Niklas and James E. Avery. “Adaptive Anomaly Detection in Chaotic Time Series with a Spatially Aware Echo State Network.” ArXiv abs/1909.01709 (2019): n. pag.
  13. Bianchi, Filippo Maria et al. “Reservoir Computing Approaches for Representation and Classification of Multivariate Time Series.” IEEE Transactions on Neural Networks and Learning Systems 32 (2018): 2169-2179.
  14. Lukoševičius, Mantas and Arnas Uselis. “Efficient Cross-Validation of Echo State Networks.” International Conference on Artificial Neural Networks (2019).
  15. Verzelli, Pietro et al. “Echo State Networks with Self-Normalizing Activations on the Hyper-Sphere.” Scientific Reports 9 (2019): n. pag.
  16. Rodriguez, Nathaniel et al. “Optimal modularity and memory capacity of neural reservoirs.” Network Neuroscience 3 (2017): 551-566.
  17. Chattopadhyay, Ashesh et al. “Data-driven prediction of a multi-scale Lorenz 96 chaotic system using deep learning methods: Reservoir computing, ANN, and RNN-LSTM.” (2019).
  18. Steiner, Peter, et al. “PyRCN: A toolbox for exploration and application of Reservoir Computing Networks.” Engineering Applications of Artificial Intelligence 113 (2022): 104964.
  19. Strock, Anthony et al. “A Simple Reservoir Model of Working Memory with Real Values.” 2018 International Joint Conference on Neural Networks (IJCNN) (2018): 1-8.
  20. Zhang, Yuanzhao and Sean P. Cornelius. “Catch-22s of reservoir computing.” Physical Review Research (2022): n. pag.
  21. Gao, Ruobin et al. “Dynamic ensemble deep echo state network for significant wave height forecasting.” Applied Energy (2023): n. pag.
  22. Gallicchio, Claudio and Alessio Micheli. “Reservoir Topology in Deep Echo State Networks.” International Conference on Artificial Neural Networks (2019).
  23. Mattheakis, Marios et al. “Unsupervised Reservoir Computing for Solving Ordinary Differential Equations.” ArXiv abs/2108.11417 (2021): n. pag.
  24. Paassen, Benjamin et al. “Tree Echo State Autoencoders with Grammars.” 2020 International Joint Conference on Neural Networks (IJCNN) (2020): 1-8.
  25. Evanusa, Matthew et al. “Hybrid Backpropagation Parallel Reservoir Networks.” ArXiv abs/2010.14611 (2020): n. pag.
  26. Trouvain, Nathan, et al. “Reservoirpy: an efficient and user-friendly library to design echo state networks.” International Conference on Artificial Neural Networks. Cham: Springer International Publishing, 2020.
  27. Cossu, Andrea, et al. “Continual learning with echo state networks.” arXiv preprint arXiv:2105.07674 (2021).
  28. Gauthier, Daniel J., et al. “Next generation reservoir computing.” Nature Communications 12.1 (2021): 5564.
  29. Vlachas, Pantelis R., et al. “Backpropagation algorithms and reservoir computing in recurrent neural networks for the forecasting of complex spatiotemporal dynamics.” Neural Networks 126 (2020): 191-217.
  30. Cucchi, Matteo, et al. “Hands-on reservoir computing: a tutorial for practical implementation.” Neuromorphic Computing and Engineering 2.3 (2022): 032002. (hands-on reservoir computing tutorial)

4 Reservoir Computing Research

  1. Margin D A, Ivanciu I A, Dobrota V. Deep Reservoir Computing using Echo State Networks and Liquid State Machine[C]//2022 IEEE International Black Sea Conference on Communications and Networking (BlackSeaCom). IEEE, 2022: 208-213.
  2. Bianchi, Filippo Maria et al. “Reservoir Computing Approaches for Representation and Classification of Multivariate Time Series.” IEEE Transactions on Neural Networks and Learning Systems 32 (2018): 2169-2179.
  3. Chattopadhyay, Ashesh et al. “Data-driven prediction of a multi-scale Lorenz 96 chaotic system using deep learning methods: Reservoir computing, ANN, and RNN-LSTM.” (2019).
  4. Steiner, Peter, et al. “PyRCN: A toolbox for exploration and application of Reservoir Computing Networks.” Engineering Applications of Artificial Intelligence 113 (2022): 104964.
  5. Zhang, Yuanzhao and Sean P. Cornelius. “Catch-22s of reservoir computing.” Physical Review Research (2022): n. pag.
  6. Gallicchio, Claudio and Alessio Micheli. “Reservoir Topology in Deep Echo State Networks.” International Conference on Artificial Neural Networks (2019).
  7. Manjunath, G. “Memory-Loss is Fundamental for Stability and Distinguishes the Echo State Property Threshold in Reservoir Computing & Beyond.” ArXiv abs/2001.00766 (2020): n. pag.
  8. Gonon, Lukas et al. “Infinite-dimensional reservoir computing.” ArXiv abs/2304.00490 (2023): n. pag.
  9. Sun, Xiaochuan et al. “Towards Fault Tolerance of Reservoir Computing in Time Series Prediction.” Information 14 (2023): 266.
  10. Lee, Kundo and Tomoki Hamagami. “Reservoir Computing for Scalable Hardware with Block‐Based Neural Network.” IEEJ Transactions on Electrical and Electronic Engineering 16 (2021): n. pag.
  11. Ren, Bin and Huanfei Ma. “Global optimization of hyper-parameters in reservoir computing.” Electronic Research Archive (2022): n. pag.
  12. Storm, Lance et al. “Constraints on parameter choices for successful reservoir computing.” ArXiv abs/2206.02575 (2022): n. pag.
  13. Bendali, Wadie et al. “Optimization of Deep Reservoir Computing with Binary Genetic Algorithm for Multi-Time Horizon Forecasting of Power Consumption.” Journal Européen des Systèmes Automatisés (2022): n. pag.
  14. Bacciu, Davide et al. “Federated Reservoir Computing Neural Networks.” 2021 International Joint Conference on Neural Networks (IJCNN) (2021): 1-7.
  15. Mattheakis, Marios et al. “Unsupervised Reservoir Computing for Solving Ordinary Differential Equations.” ArXiv abs/2108.11417 (2021): n. pag. (code available)
  16. Love, Jake et al. “Task Agnostic Metrics for Reservoir Computing.” ArXiv abs/2108.01512 (2021): n. pag.
  17. Heyder, Florian et al. “Generalizability of reservoir computing for flux-driven two-dimensional convection.” Physical Review E 106 5-2 (2021): 055303.
  18. Honda, Hirotada. “A novel framework for reservoir computing with inertial manifolds.” 2021 International Conference on Artificial Intelligence in Information and Communication (ICAIIC) (2021): 347-352.
  19. Hart, Allen G. “(Thesis) Reservoir Computing With Dynamical Systems.” (2021). (nicely illustrated)
  20. Doan, Nguyen Anh Khoa et al. “Auto-Encoded Reservoir Computing for Turbulence Learning.” ArXiv abs/2012.10968 (2020): n. pag.
  21. Gallicchio, Claudio et al. “Frontiers in Reservoir Computing.” The European Symposium on Artificial Neural Networks (2020).
  22. Evanusa, Matthew et al. “Hybrid Backpropagation Parallel Reservoir Networks.” ArXiv abs/2010.14611 (2020): n. pag. (code available)
  23. Kleyko, Denis, et al. “Integer echo state networks: Efficient reservoir computing for digital hardware.” IEEE Transactions on Neural Networks and Learning Systems 33.4 (2020): 1688-1701.
  24. Huhn, Francisco, and Luca Magri. “Gradient-free optimization of chaotic acoustics with reservoir computing.” Physical Review Fluids 7.1 (2022): 014402.
  25. Alomar, Miquel L., et al. “Efficient parallel implementation of reservoir computing systems.” Neural Computing and Applications 32 (2020): 2299-2313.
  26. Manneschi, Luca, Andrew C. Lin, and Eleni Vasilaki. “SpaRCe: Improved learning of reservoir computing systems through sparse representations.” IEEE Transactions on Neural Networks and Learning Systems (2021).
  27. Damicelli, Fabrizio, Claus C. Hilgetag, and Alexandros Goulas. “Brain connectivity meets reservoir computing.” PLoS Computational Biology 18.11 (2022): e1010639.
  28. Gauthier, Daniel J., et al. “Next generation reservoir computing.” Nature Communications 12.1 (2021): 5564. (code available)
  29. Gallicchio, Claudio. “Sparsity in reservoir computing neural networks.” 2020 International Conference on INnovations in Intelligent SysTems and Applications (INISTA). IEEE, 2020.
  30. Vlachas, Pantelis R., et al. “Backpropagation algorithms and reservoir computing in recurrent neural networks for the forecasting of complex spatiotemporal dynamics.” Neural Networks 126 (2020): 191-217.
  31. Cucchi, Matteo, et al. “Hands-on reservoir computing: a tutorial for practical implementation.” Neuromorphic Computing and Engineering 2.3 (2022): 032002. (code available; hands-on reservoir computing tutorial)
  32. Lim, Soon Hoe, et al. “Predicting critical transitions in multiscale dynamical systems using reservoir computing.” Chaos: An Interdisciplinary Journal of Nonlinear Science 30.12 (2020).

5 Applications

  1. Bouazizi S, Benmohamed E, Ltifi H. Enhancing EEG-based emotion recognition using PSD-Grouped Deep Echo State Network[J]. JUCS: Journal of Universal Computer Science, 2023, 29(10).
  2. Valencia C H, Vellasco M M B R, Figueiredo K. Echo State Networks: Novel reservoir selection and hyperparameter optimization model for time series forecasting[J]. Neurocomputing, 2023, 545: 126317.
  3. Viehweg J, Worthmann K, Mäder P. Parameterizing echo state networks for multi-step time series prediction[J]. Neurocomputing, 2023, 522: 214-228.
  4. Bai, Yu-ting et al. “Nonstationary Time Series Prediction Based on Deep Echo State Network Tuned by Bayesian Optimization.” Mathematics (2023): n. pag.
  5. Bianchi, Filippo Maria et al. “Reservoir Computing Approaches for Representation and Classification of Multivariate Time Series.” IEEE Transactions on Neural Networks and Learning Systems 32 (2018): 2169-2179.
  6. Özdemir, Anil et al. “EchoVPR: Echo State Networks for Visual Place Recognition.” IEEE Robotics and Automation Letters PP (2021): 1-1.
  7. Gao, Ruobin et al. “Dynamic ensemble deep echo state network for significant wave height forecasting.” Applied Energy (2023): n. pag.
  8. Liu, Qianwen et al. “Memory augmented echo state network for time series prediction.” Neural Computing and Applications (2023): 1-16.
  9. Deng, Lichi and Yuewei Pan. “Machine-Learning-Assisted Closed-Loop Reservoir Management Using Echo State Network for Mature Fields under Waterflood.” SPE Reservoir Evaluation & Engineering 23 (2020): n. pag.
  10. Mandal, Swarnendu and Manish Dev Shrimali. “Learning unidirectional coupling using echo-state network.” Physical Review E 107 6-1 (2023): 064205.
  11. Koprinkova-Hristova, Petia D. et al. “Echo state network for features extraction and segmentation of tomography images.” Computer Science and Information Systems (2023): n. pag.
  12. Soltani, Rebh et al. “Optimized Echo State Network based on PSO and Gradient Descent for Chaotic Time Series Prediction.” 2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI) (2022): 747-754.
  13. Caremel, Cedric et al. “Echo State Network for Soft Actuator Control.” J. Robotics Mechatronics 34 (2022): 413-421.
  14. Ren, Weijie et al. “Time series prediction based on echo state network tuned by divided adaptive multi-objective differential evolution algorithm.” Soft Computing 25 (2021): 4489-4502.
  15. Na, Yongsu et al. “Near real-time predictions of tropical cyclone trajectory and intensity in the northwestern Pacific Ocean using echo state network.” Climate Dynamics 58 (2021): 651-667.
  16. Gandhi, Manjunath. “An Echo State Network Imparts a Curve Fitting.” IEEE Transactions on Neural Networks and Learning Systems 33 (2021): 2596-2604.
  17. Jere, Shashank et al. “Channel Equalization Through Reservoir Computing: A Theoretical Perspective.” IEEE Wireless Communications Letters 12 (2023): 774-778.
  18. Jordanou, Jean P. et al. “Echo State Networks for Practical Nonlinear Model Predictive Control of Unknown Dynamic Systems.” IEEE Transactions on Neural Networks and Learning Systems 33 (2021): 2615-2629.
  19. Kim, Taehwan and Brian R. King. “Time series prediction using deep echo state networks.” Neural Computing and Applications (2020): 1-19.
  20. Simov, Kiril Ivanov et al. “A Reservoir Computing Approach to Word Sense Disambiguation.” Cognitive Computation 15 (2020): 1409-1418.
  21. Cossu, Andrea, et al. “Continual learning with echo state networks.” arXiv preprint arXiv:2105.07674 (2021). (code available)
  22. Fourati, Rahma, et al. “EEG feature learning with intrinsic plasticity based deep echo state network.” 2020 international joint conference on neural networks (IJCNN). IEEE, 2020.
  23. Fourati, Rahma, et al. “Unsupervised learning in reservoir computing for eeg-based emotion recognition.” IEEE Transactions on Affective Computing 13.2 (2020): 972-984.
