Surrogate Methods Applied to Hyperparameter Optimization Problem

  • José Ilmar Cruz Freire Neto (UFS)
  • André Britto (UFS)

Abstract

Hyperparameters affect the performance of machine learning models. Hyperparameter optimization aims to find the best hyperparameter configuration, but doing so requires a considerable number of machine learning training runs, which can be slow. Surrogates can therefore be used to speed up this process. This paper evaluates the performance of two surrogate methods, M1 and MARSAOP, applied to hyperparameter optimization. The surrogates are compared against six hyperparameter optimization algorithms from the literature on classification and regression problems. Results indicate that the surrogate methods are faster than the traditional algorithms.
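As context for the abstract, the general idea behind surrogate-assisted hyperparameter optimization can be sketched in a few lines: a cheap regression model is fitted to the hyperparameter configurations evaluated so far and used to screen many candidates, so the expensive train-and-validate step runs only on the most promising one. The sketch below is a generic illustration under assumed choices (a Gaussian-process surrogate, an SVM objective on a toy dataset, and a simple mean-plus-std acquisition rule); it is not the paper's M1 or MARSAOP method.

```python
# Minimal sketch of a surrogate-assisted hyperparameter optimization loop.
# Assumptions (not from the paper): Gaussian-process surrogate, SVM objective
# on the Iris dataset, and a simple mean-plus-std (UCB-style) acquisition.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)

def expensive_objective(log_c, log_gamma):
    """True objective: cross-validated accuracy of an SVM (the slow part)."""
    model = SVC(C=10.0 ** log_c, gamma=10.0 ** log_gamma)
    return cross_val_score(model, X, y, cv=5).mean()

# Evaluate a small initial design with the real (expensive) objective.
evaluated = rng.uniform(-3, 3, size=(8, 2))        # columns: log_C, log_gamma
scores = np.array([expensive_objective(*p) for p in evaluated])

for _ in range(10):
    # Fit the cheap surrogate on every point evaluated so far.
    surrogate = GaussianProcessRegressor().fit(evaluated, scores)
    # Screen many random candidates with the surrogate (cheap) ...
    candidates = rng.uniform(-3, 3, size=(500, 2))
    mean, std = surrogate.predict(candidates, return_std=True)
    best = candidates[np.argmax(mean + std)]       # optimistic (UCB-style) pick
    # ... and spend a real training run only on the most promising candidate.
    evaluated = np.vstack([evaluated, best])
    scores = np.append(scores, expensive_objective(*best))

print("best accuracy:", scores.max(), "at", evaluated[np.argmax(scores)])
```

In this toy run only 18 configurations are ever trained for real; the other 5,000 candidate evaluations are surrogate predictions, which is the source of the speed-up that surrogate methods aim for.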

References

Bergstra, J., Bardenet, R., Bengio, Y., and Kégl, B. (2011). Algorithms for hyper-parameter optimization. Advances in Neural Information Processing Systems, 24.

Bishop, C. M. and Nasrabadi, N. M. (2006). Pattern recognition and machine learning, volume 4. Springer.

Coello, C. A. C. (2017). Recent results and open problems in evolutionary multiobjective optimization. In International Conference on Theory and Practice of Natural Computing, pages 3-21. Springer.

Deb, K., Hussein, R., Roy, P. C., and Toscano-Pulido, G. (2019). A taxonomy for metamodeling frameworks for evolutionary multiobjective optimization. IEEE Transactions on Evolutionary Computation, 23(1):104-116.

Duo, M., Qi, Y., Lina, G., and Xu, E. (2017). A short-term traffic flow prediction model based on EMD and GPSO-SVM. In 2017 IEEE 2nd Advanced Information Technology, Electronic and Automation Control Conference (IAEAC), pages 2554-2558. IEEE.

Goodfellow, I., Bengio, Y., and Courville, A. (2016). Deep Learning. MIT Press. http://www.deeplearningbook.org.

Grama, L., Tuns, L., and Rusu, C. (2017). On the optimization of SVM kernel parameters for improving audio classification accuracy. In 2017 14th International Conference on Engineering of Modern Electric Systems (EMES), pages 224-227. IEEE.

Hutter, F., Hoos, H. H., and Leyton-Brown, K. (2011). Sequential model-based optimization for general algorithm configuration. In International Conference on Learning and Intelligent Optimization, pages 507-523. Springer.

Jordan, M. I. and Mitchell, T. M. (2015). Machine learning: Trends, perspectives, and prospects. Science, 349(6245):255-260.

Kamath, C. (2022). Intelligent sampling for surrogate modeling, hyperparameter optimization, and data analysis. Machine Learning with Applications, 9:100373.

Kennedy, J. (2010). Particle swarm optimization. In Sammut, C. and Webb, G. I., editors, Encyclopedia of Machine Learning, pages 760-766. Springer US, Boston, MA.

Lessmann, S., Stahlbock, R., and Crone, S. F. (2005). Optimizing hyperparameters of support vector machines by genetic algorithms. In IC-AI, volume 74, page 82.

Li, Y., Liu, G., Lu, G., Jiao, L., Marturi, N., and Shang, R. (2019). Hyper-parameter optimization using MARS surrogate for machine-learning algorithms. IEEE Transactions on Emerging Topics in Computational Intelligence, 4(3):287-297.

Poli, R., Kennedy, J., and Blackwell, T. (2007). Particle swarm optimization. Swarm Intelligence, 1(1):33-57.

Ranftl, S. and von der Linden, W. (2021). Bayesian surrogate analysis and uncertainty propagation. In Physical Sciences Forum, volume 3, page 6. MDPI.

Seeger, M. (2004). Gaussian processes for machine learning. International Journal of Neural Systems, 14(2):69-106.

Tenne, Y. and Goh, C.-K. (2010). Computational Intelligence in Expensive Optimization Problems. Springer Berlin Heidelberg, 1st edition.

Yang, L. and Shami, A. (2020). On hyperparameter optimization of machine learning algorithms: Theory and practice. Neurocomputing, 415:295-316.
Published
28/11/2022
How to Cite

FREIRE NETO, José Ilmar Cruz; BRITTO, André. Surrogate Methods Applied to Hyperparameter Optimization Problem. In: ENCONTRO NACIONAL DE INTELIGÊNCIA ARTIFICIAL E COMPUTACIONAL (ENIAC), 19., 2022, Campinas/SP. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2022. p. 521-531. ISSN 2763-9061. DOI: https://doi.org/10.5753/eniac.2022.227594.