Characterization and identification of twelve-tone composers

  • Lucas F. P. Costa UTFPR
  • Andrés E. Coca S. UTFPR


Each composer's individuality is inherently shaped by their personality, and composers seek recognition of a particular style through their own works. It is therefore possible to categorize a musical subgenre at a deeper level by identifying the composer of a given work. However, the characteristics of each composer are so varied that they are difficult to identify. This paper proposes using machine learning to classify works of twelve-tone music according to their composer, under the hypothesis that part of a composer's signature is reflected in the choice of the twelve-tone series. Experimental results showed promising performance and confirmed the existence of a relation between composer and series.
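As a rough illustration of the hypothesis, a twelve-tone series (a permutation of the 12 pitch classes) can be encoded as a feature vector for a classifier. The sketch below is an assumption for illustration only, not the paper's actual feature set or model: it uses the sequence of directed pitch-class intervals, which is invariant under transposition, so all transposed forms of a series share the same features.

```python
# Hedged sketch: encoding a twelve-tone series as interval features.
# The feature choice is an illustrative assumption, not the method
# used in the paper.

def interval_features(row):
    """Directed pitch-class intervals (mod 12) between successive notes.

    The resulting 11-dimensional vector is transposition-invariant:
    transposing every pitch class by a constant leaves it unchanged.
    """
    assert sorted(row) == list(range(12)), "a series must use each of the 12 pitch classes exactly once"
    return [(b - a) % 12 for a, b in zip(row, row[1:])]

# Example: the well-known row of Schoenberg's Suite op. 25
# (E F G Db Gb Eb Ab D B C A Bb as pitch-class numbers).
schoenberg_op25 = [4, 5, 7, 1, 6, 3, 8, 2, 11, 0, 9, 10]
print(interval_features(schoenberg_op25))
```

Vectors like these could then be fed to any standard classifier (e.g., a random forest or k-nearest neighbors) with the composer as the target label.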



COSTA, Lucas F. P.; COCA S., Andrés E. Characterization and identification of twelve-tone composers. In: ENCONTRO NACIONAL DE INTELIGÊNCIA ARTIFICIAL E COMPUTACIONAL (ENIAC), 15., 2018, São Paulo. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2018. p. 763-774. ISSN 2763-9061.