Transfer Learning for Synthetic Examples Selection in Meta-learning

  • Regina R. Parente UFPE
  • Ricardo B. C. Prudencio UFPE

Abstract


In meta-learning, training examples are generated from experiments performed with a pool of candidate algorithms on a number of problems (real or synthetic). Generating a good set of examples can be difficult due to the low availability of real datasets in some domains and the high computational cost of labeling. In this paper, we focus on the selection of training meta-examples by combining data manipulation and transfer learning via one-class classification, so that only the most relevant examples are selected for labeling. Our experiments revealed that it is possible to reduce the computational cost of generating meta-examples while maintaining meta-learning performance.
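The selection idea described above can be illustrated with a minimal sketch. The paper's actual method is not specified in this abstract; the snippet below only shows the general one-class-classification pattern it alludes to, using a simple distance-based one-class rule (a standard OCC approach; see the Khan and Madden survey cited below). All function and variable names (`one_class_select`, `real_meta`, `synth_meta`, `quantile`) are illustrative assumptions, not the authors' implementation: synthetic meta-examples are kept for labeling only if their meta-feature vectors fall inside the region occupied by the real datasets.

```python
import numpy as np

def one_class_select(real_meta, synth_meta, quantile=0.9):
    """Select synthetic meta-examples whose meta-feature vectors lie
    near the real datasets, using a distance-to-nearest-real-example
    one-class rule (a simple stand-in for any one-class classifier)."""
    # Distance from each synthetic example to its nearest real example.
    d_synth = np.linalg.norm(
        synth_meta[:, None, :] - real_meta[None, :, :], axis=2
    ).min(axis=1)
    # Acceptance threshold: a quantile of the leave-one-out
    # nearest-neighbour distances among the real examples themselves.
    d_real = np.linalg.norm(
        real_meta[:, None, :] - real_meta[None, :, :], axis=2
    )
    np.fill_diagonal(d_real, np.inf)  # ignore self-distances
    tau = np.quantile(d_real.min(axis=1), quantile)
    # Indices of synthetic examples accepted by the one-class rule;
    # only these would be labeled (i.e., have algorithms run on them).
    return np.flatnonzero(d_synth <= tau)

# Toy usage: synthetic meta-features drawn near the real ones are kept,
# while a far-away cluster is rejected before the costly labeling step.
rng = np.random.default_rng(0)
real_meta = rng.normal(0.0, 1.0, size=(50, 4))
near = rng.normal(0.0, 1.0, size=(20, 4))   # resembles real datasets
far = rng.normal(10.0, 1.0, size=(20, 4))   # unlike any real dataset
synth_meta = np.vstack([near, far])
selected = one_class_select(real_meta, synth_meta)
```

The design point is that the one-class model is trained only on the "positive" class (real datasets), which matches the PU-learning setting cited in the references: no explicit examples of irrelevant synthetic datasets are needed.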

References


Alcalá, J., Fernández, A., Luengo, J., Derrac, J., García, S., Sánchez, L., and Herrera, F. (2010). KEEL data-mining software tool: Data set repository, integration of algorithms and experimental analysis framework. Journal of Multiple-Valued Logic and Soft Computing, 17(2-3), 255-287.

Brazdil, P., Giraud-Carrier, C., Soares, C., and Vilalta, R. (2009). Metalearning: Applications to Data Mining. Springer, Berlin, Heidelberg.

Witten, I. H. and Frank, E. (2005). Data Mining: Practical Machine Learning Tools and Techniques. Morgan Kaufmann.

Khan, S. and Madden, M. (2010). A survey of recent trends in one class classification. In Coyle, L. and Freyne, J., editors, Artificial Intelligence and Cognitive Science (AICS 2009), Lecture Notes in Computer Science, vol. 6206, pages 188-197. Springer, Berlin, Heidelberg.

Lichman, M. (2013). UCI Machine Learning Repository. http://archive.ics.uci.edu/ml.

Liu, B., Lee, W. S., Yu, P. S., and Li, X. (2002). Partially supervised classification of text documents. In Proceedings of the 19th International Conference on Machine Learning (ICML-02).

Macià, N., Orriols-Puig, A., and Bernadó-Mansilla, E. (2008). Genetic-based synthetic data sets for the analysis of classifiers behavior. In Proceedings of the International Conference on Hybrid Intelligent Systems, pages 507-512.

Nagy, T., Farkas, R., and Csirik, J. (2011). On positive and unlabeled learning for text classification. In Habernal, I. and Matoušek, V., editors, Text, Speech and Dialogue (TSD 2011), Lecture Notes in Computer Science, vol. 6836. Springer, Berlin, Heidelberg.

Olvera-López, J., Carrasco-Ochoa, J., and Martínez-Trinidad, J. (2010). A review of instance selection methods. Artificial Intelligence Review, 34(2), 133-143. https://doi.org/10.1007/s10462-010-9165-y.

Parente, R. R., Canuto, A. M. P., and Xavier-Junior, J. C. (2013). Characterization measures of ensemble systems using a meta-learning approach. In Proceedings of the International Joint Conference on Neural Networks (IJCNN), Dallas, Texas, USA, August 4-9.

Prudêncio, R. and Ludermir, T. (2004). Meta-learning approaches to selecting time series models. Neurocomputing, 61, 121-137.

Shimodaira, H. (2000). Improving predictive inference under covariate shift by weighting the log-likelihood function. Journal of Statistical Planning and Inference, 90, 227-244.

Soares, C. (2009). UCI++: Improved support for algorithm selection using datasetoids. Lecture Notes in Computer Science, vol. 5476, pages 499-506. Springer.

Xia, R., Hu, X., Lu, J., Yang, J., and Zong, C. (2013). Instance selection and instance weighting for cross-domain sentiment classification via PU learning. In Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence (IJCAI).

Pan, S. J. and Yang, Q. (2010). A survey on transfer learning. IEEE Transactions on Knowledge and Data Engineering, 22(10), 1345-1359.

Zadrozny, B. (2004). Learning and evaluating classifiers under sample selection bias. In Proceedings of the 21st International Conference on Machine Learning (ICML), July.

Published
22/10/2018
How to Cite

PARENTE, Regina R.; PRUDENCIO, Ricardo B. C. Transfer Learning for Synthetic Examples Selection in Meta-learning. In: ENCONTRO NACIONAL DE INTELIGÊNCIA ARTIFICIAL E COMPUTACIONAL (ENIAC), 15., 2018, São Paulo. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2018. p. 811-822. ISSN 2763-9061. DOI: https://doi.org/10.5753/eniac.2018.4469.