Tessarine and Quaternion-Valued Deep Neural Networks for Image Classification

  • Fernando Ribeiro de Senna UNICAMP
  • Marcos Eduardo Valle UNICAMP

Abstract


Many image processing and analysis tasks are performed with deep neural networks. Although the vast majority of advances have been made with real numbers, recent works have shown that complex and hypercomplex-valued networks may achieve better results. In this paper, we address quaternion-valued deep neural networks and introduce tessarine-valued ones, including tessarine-valued 2D convolutions. We also address initialization schemes and hypercomplex batch normalization. Finally, a tessarine-valued ResNet model with hypercomplex batch normalization outperformed the corresponding real and quaternion-valued networks on the CIFAR dataset.
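As an illustrative sketch (not the paper's implementation), a tessarine-valued 2D convolution can be expressed as a fixed combination of real 2D convolutions, assuming the usual tessarine algebra with commutative units satisfying i² = −1, j² = +1, k = ij, k² = −1. The function names below are hypothetical; the layout stores each tessarine quantity as a 4-tuple of its real components (a, b, c, d).

```python
import numpy as np
from scipy.signal import convolve2d


def tessarine_mult(p, q):
    """Product of two tessarines p = a + bi + cj + dk and q, each given
    as a 4-tuple (a, b, c, d). Uses i^2 = -1, j^2 = +1, k = ij, k^2 = -1;
    the product is commutative."""
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return (
        a1 * a2 - b1 * b2 + c1 * c2 - d1 * d2,  # real part
        a1 * b2 + b1 * a2 + c1 * d2 + d1 * c2,  # i part
        a1 * c2 + c1 * a2 - b1 * d2 - d1 * b2,  # j part
        a1 * d2 + d1 * a2 + b1 * c2 + c1 * b2,  # k part
    )


def tessarine_conv2d(x, w):
    """'Valid' 2D tessarine convolution. x and w are 4-tuples of real 2D
    arrays (the real, i, j, k component maps). Each output component is a
    signed sum of real 2D convolutions, mirroring tessarine_mult."""
    xa, xb, xc, xd = x
    wa, wb, wc, wd = w
    conv = lambda u, v: convolve2d(u, v, mode="valid")
    return (
        conv(xa, wa) - conv(xb, wb) + conv(xc, wc) - conv(xd, wd),
        conv(xa, wb) + conv(xb, wa) + conv(xc, wd) + conv(xd, wc),
        conv(xa, wc) + conv(xc, wa) - conv(xb, wd) - conv(xd, wb),
        conv(xa, wd) + conv(xd, wa) + conv(xb, wc) + conv(xc, wb),
    )
```

Because the tessarine product is commutative, the component-wise expansion above is symmetric in its two arguments, unlike the quaternion case, where the non-commutative Hamilton product yields different sign patterns for left and right multiplication.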

Published
29/11/2021
How to Cite

SENNA, Fernando Ribeiro de; VALLE, Marcos Eduardo. Tessarine and Quaternion-Valued Deep Neural Networks for Image Classification. In: ENCONTRO NACIONAL DE INTELIGÊNCIA ARTIFICIAL E COMPUTACIONAL (ENIAC), 18., 2021, Evento Online. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2021. p. 350-361. DOI: https://doi.org/10.5753/eniac.2021.18266.