Echo State Incremental Gaussian Mixture Network for Spatio-Temporal Pattern Processing
Abstract
This work introduces a new neural network algorithm for online processing of spatio-temporal patterns, called the Echo State Incremental Gaussian Mixture Network (ESIGMN). The proposed algorithm is a hybrid of two state-of-the-art algorithms: the Echo State Network (ESN), used for spatio-temporal pattern processing, and the Incremental Gaussian Mixture Network (IGMN), applied to aggressive learning in online tasks. The algorithm is compared against the conventional ESN in order to highlight the advantages of the IGMN approach as a supervised output layer.

References
Buehner, M. and Young, P. (2006). A tighter bound for the echo state property. Neural Networks, IEEE Transactions on, 17(3):820–824.
Dempster, A., Laird, N., Rubin, D., et al. (1977). Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society. Series B (Methodological), 39(1):1–38.
Dominey, P. F. (1995). Complex sensory-motor sequence learning based on recurrent state representation and reinforcement learning. Biological Cybernetics, 73(3):265–74.
Elman, J. (1990). Finding structure in time. Cognitive Science, 14(2):179–211.
Engel, P. and Heinen, M. (2011). Incremental learning of multivariate gaussian mixture models. Advances in Artificial Intelligence–SBIA 2010, pages 82–91.
Hajnal, M. and Lőrincz, A. (2006). Critical echo state networks. Artificial Neural Networks–ICANN 2006, pages 658–667.
Hayes, M. (1996). "9.4: Recursive Least Squares". Statistical Digital Signal Processing and Modeling.
Heinen, M. (2011). A Connectionist Approach for Incremental Function Approximation and On-line Tasks. PhD thesis, Universidade Federal do Rio Grande do Sul. Instituto de Informática. Programa de Pós-Graduação em Computação.
Heinen, M. and Engel, P. (2010). An Incremental Probabilistic Neural Network for Regression and Reinforcement Learning Tasks. Artificial Neural Networks–ICANN 2010, pages 170–179.
Jaeger, H. (2001). The "echo state" approach to analysing and training recurrent neural networks, with an erratum note. Technical Report GMD Report 148, German National Research Center for Information Technology.
Jaeger, H. (2003). Adaptive nonlinear system identification with echo state networks. Advances in Neural Information Processing Systems, 15:593–600.
Jaeger, H. and Haas, H. (2004). Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication. Science, 304(5667):78.
Jaeger, H., Lukosevicius, M., Popovici, D., and Siewert, U. (2007). Optimization and applications of echo state networks with leaky-integrator neurons. Neural Networks, 20(3):335–352.
Kangas, J. (1991). Phoneme recognition using time-dependent versions of self-organizing maps. In ICASSP, pages 101–104. IEEE.
Lukoševičius, M. and Jaeger, H. (2009). Reservoir computing approaches to recurrent neural network training. Computer Science Review, 3(3):127–149.
Moser, L. (2004). Modelo de um neurônio diferenciador-integrador para representação temporal em arquiteturas conexionistas. Universidade Federal do Rio Grande do Sul. Instituto de Informática. Programa de Pós-Graduação em Computação.
Natschläger, T., Maass, W., and Markram, H. (2002). The "liquid computer": A novel strategy for real-time computing on time series. Special Issue on Foundations of Information Processing of TELEMATIK, 8(1):39–43.
Pinto, R. (2010). Um Estudo de Redes Neurais Não-Supervisionadas Temporais. Universidade Federal do Rio Grande do Sul. Instituto de Informática. Programa de Pós-Graduação em Computação.
Schrauwen, B., Wardermann, M., Verstraeten, D., Steil, J., and Stroobandt, D. (2008). Improving reservoirs using intrinsic plasticity. Neurocomputing, 71(7-9):1159–1171.
Steil, J. J. (2004). Backpropagation-decorrelation: online recurrent learning with O(N) complexity. In IJCNN.
Verstraeten, D., Schrauwen, B., and Stroobandt, D. (2006). Reservoir-based techniques for speech recognition. In Proceedings of the world conference on computational intelligence, pages 1050–1053.
White, O., Lee, D., and Sompolinsky, H. (2004). Short-term memory in orthogonal neural networks. Physical review letters, 92(14):148102.
Widrow, B. (1966). Adaptive filters I: fundamentals (TR 6764-6).
Published
19/07/2011
How to Cite
PINTO, Rafael C.; ENGEL, Paulo M.; HEINEN, Milton R. Echo State Incremental Gaussian Mixture Network for Spatio-Temporal Pattern Processing. In: ENCONTRO NACIONAL DE INTELIGÊNCIA ARTIFICIAL E COMPUTACIONAL (ENIAC), 8., 2011, Natal/RN. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2011. p. 454-465. ISSN 2763-9061.