Clustered Echo State Networks for Signal Observation and Frequency Filtering

  • Laércio Oliveira Junior USP
  • Florian Stelzer Humboldt University of Berlin / Technical University of Berlin
  • Liang Zhao USP

Abstract


Echo State Networks (ESNs) are recurrent neural networks that map an input signal into a high-dimensional dynamical system, called the reservoir, and possess adaptive output weights. The output weights are trained so that the ESN's output signal fits the desired target signal. Classical reservoirs are sparse, randomly connected networks. In this article, we investigate the effect of different network topologies on the performance of ESNs. Specifically, we use two types of networks to construct clustered reservoirs for ESNs: the clustered Erdős–Rényi model and the clustered Barabási–Albert model. Moreover, we compare the performance of these clustered ESNs (CESNs) against classical ESNs with random reservoirs by applying them to two different tasks: frequency filtering and the reconstruction of chaotic signals. By using a clustered topology, one can achieve a significant increase in the ESN's performance.
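To make the setup concrete, the sketch below builds an ESN whose reservoir is a clustered Erdős–Rényi network (dense connections within clusters, sparse connections between them) and trains only the output weights by ridge regression. The cluster count, connection probabilities, spectral radius, and toy signals are illustrative assumptions, not the values used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical clustered Erdős–Rényi reservoir: 4 clusters of 50 units,
# dense within clusters, sparse between them (illustrative parameters).
n_clusters, cluster_size = 4, 50
n = n_clusters * cluster_size
p_intra, p_inter = 0.10, 0.01

W = np.zeros((n, n))
for i in range(n_clusters):
    for j in range(n_clusters):
        p = p_intra if i == j else p_inter
        mask = rng.random((cluster_size, cluster_size)) < p
        block = mask * rng.uniform(-1.0, 1.0, (cluster_size, cluster_size))
        W[i * cluster_size:(i + 1) * cluster_size,
          j * cluster_size:(j + 1) * cluster_size] = block

# Rescale so the spectral radius is below 1 (common echo-state heuristic).
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

# Drive the reservoir with a scalar toy input and fit a toy target signal.
W_in = rng.uniform(-0.5, 0.5, n)
T = 1000
u = np.sin(0.1 * np.arange(T))            # input signal
target = np.sin(0.1 * np.arange(T) + 1.0) # desired output signal

X = np.zeros((T, n))                      # collected reservoir states
x = np.zeros(n)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    X[t] = x

# Train only the output weights (ridge regression), discarding a washout.
washout, ridge = 100, 1e-6
A, y = X[washout:], target[washout:]
W_out = np.linalg.solve(A.T @ A + ridge * np.eye(n), A.T @ y)
prediction = X @ W_out
```

Only `W_out` is learned; the reservoir weights `W` and input weights `W_in` stay fixed after generation, which is what makes ESN training a simple linear regression.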

Keywords: clustered networks, complex networks, echo state networks, machine learning, neural networks

References

Appeltant, L., Soriano, M., Van Der Sande, G., Danckaert, J., Massar, S., Dambre, J., Schrauwen, B., Mirasso, C., and Fischer, I. Information processing using a single dynamical node as complex system. Nature Communications vol. 2, pp. 1–6, 9, 2011.

Barabási, A. L. and Albert, R. Emergence of scaling in random networks. Science 286 (5439): 509–512, 1999.

Bollobás, B., Borgs, C., Chayes, J., and Riordan, O. Directed scale-free graphs. In Proceedings of the 14th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA). pp. 132–139, 2003.

Deng, Z. and Zhang, Y. Collective behavior of a small-world recurrent neural system with scale-free distribution. IEEE Transactions on Neural Networks 18 (5): 1364–1375, Sep., 2007.

Erdős, P. and Rényi, A. On the strength of connectedness of a random graph. Acta Mathematica Hungarica vol. 12, pp. 261–267, 1961.

Jaeger, H. The "echo state" approach to analysing and training recurrent neural networks — with an erratum note. Bonn, Germany: German National Research Center for Information Technology GMD Technical Report vol. 148, 2001.

Li, X., Zhong, L., Xue, F., and Zhang, A. A priori data-driven multi-clustered reservoir generation algorithm for echo state network. PLoS ONE 10 (4), 2015.

Lin, X., Yang, Z., and Song, Y. The application of echo state network in stock data mining. In Advances in Knowledge Discovery and Data Mining, T. Washio, E. Suzuki, K. M. Ting, and A. Inokuchi (Eds.). Springer Berlin Heidelberg, Berlin, Heidelberg, pp. 932–937, 2008.

Lu, Z., Pathak, J., Hunt, B., Girvan, M., Brockett, R., and Ott, E. Reservoir observers: Model-free inference of unmeasured variables in chaotic systems. Chaos: An Interdisciplinary Journal of Nonlinear Science 27 (4): 041102, 2017.

Tanaka, G., Yamane, T., Héroux, J. B., Nakane, R., Kanazawa, N., Takeda, S., Numata, H., Nakano, D., and Hirose, A. Recent advances in physical reservoir computing: A review. Neural Networks vol. 115, pp. 100–123, 2019.

Wen, G., Li, H., and Li, D. An ensemble convolutional echo state networks for facial expression recognition. In 2015 International Conference on Affective Computing and Intelligent Interaction (ACII). pp. 873–878, 2015.
Published
20/10/2020
OLIVEIRA JUNIOR, Laércio; STELZER, Florian; ZHAO, Liang. Clustered Echo State Networks for Signal Observation and Frequency Filtering. In: SYMPOSIUM ON KNOWLEDGE DISCOVERY, MINING AND LEARNING (KDMILE), 8. , 2020, Evento Online. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2020 . p. 25-32. ISSN 2763-8944. DOI: https://doi.org/10.5753/kdmile.2020.11955.