Conditional density estimation using Fourier series and neural networks
Abstract
Most machine learning tools aim at producing good predictions for new samples. However, perfect predictive accuracy is unattainable in most problems, and therefore modeling the uncertainty of such predictions becomes necessary in many applications. This can be achieved by estimating conditional densities. In this work, we propose a novel method of conditional density estimation based on Fourier series and artificial neural networks, and compare it to other estimators on five distinct datasets. We conclude that our proposed method outperforms the other tested methods.
Keywords:
conditional density estimation, neural networks, Fourier series, PyTorch
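The abstract does not spell out the estimator, but a minimal, hypothetical PyTorch sketch of one way a Fourier basis can be combined with a neural network for conditional density estimation is given below: a network maps the covariates x to coefficients of a cosine basis in the response y (rescaled to [0, 1]), the density is reconstructed as 1 + sum_j beta_j(x) phi_j(y), and training minimizes an empirical L2-type loss. The basis size, architecture, ELU activation, and the crude nonnegativity correction are illustrative assumptions, not the authors' exact design (a proper correction is discussed by Glad et al. 2003, cited below).

# Hypothetical sketch, not the authors' code: a neural network outputs the
# coefficients of a Fourier (cosine) series in y, conditional on x.
import math
import torch
import torch.nn as nn

K = 30  # number of basis functions (arbitrary choice for illustration)

def cosine_basis(y, K):
    """phi_j(y) = sqrt(2) * cos(j * pi * y), j = 1..K, for y in [0, 1]."""
    j = torch.arange(1, K + 1, dtype=y.dtype, device=y.device)
    return math.sqrt(2.0) * torch.cos(math.pi * j * y.unsqueeze(-1))

class FourierCDE(nn.Module):
    def __init__(self, x_dim, K, hidden=100):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim, hidden), nn.ELU(),
            nn.Linear(hidden, hidden), nn.ELU(),
            nn.Linear(hidden, K),
        )

    def forward(self, x):
        # Estimated coefficients beta_1(x), ..., beta_K(x).
        return self.net(x)

    def density(self, x, y_grid):
        """f_hat(y | x) = 1 + sum_j beta_j(x) phi_j(y), crudely clipped at zero."""
        beta = self.forward(x)                      # (n, K)
        phi = cosine_basis(y_grid, beta.shape[1])   # (m, K)
        return torch.clamp(1.0 + beta @ phi.T, min=0.0)  # (n, m)

def l2_cde_loss(beta, y):
    """Empirical L2 loss, up to an additive constant:
    integral of f_hat(.|x)^2 dy  minus  2 * f_hat(y|x),
    using orthonormality of {1, phi_1, ..., phi_K} on [0, 1]."""
    phi = cosine_basis(y, beta.shape[1])            # (n, K)
    return (beta.pow(2).sum(1) - 2.0 * (beta * phi).sum(1)).mean()

# Illustrative training loop on synthetic data (y assumed rescaled to [0, 1]).
x = torch.rand(512, 5)
y = torch.clamp(0.5 + 0.2 * x[:, 0] + 0.1 * torch.randn(512), 0.0, 1.0)
model = FourierCDE(x_dim=5, K=K)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(200):
    opt.zero_grad()
    loss = l2_cde_loss(model(x), y)
    loss.backward()
    opt.step()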
References
Bertin, K., Lacour, C., and Rivoirard, V. Adaptive pointwise estimation of conditional density function. In Annales de l’Institut Henri Poincaré, Probabilités et Statistiques. Vol. 52. Institut Henri Poincaré, pp. 939–980, 2016.
Clevert, D.-A., Unterthiner, T., and Hochreiter, S. Fast and accurate deep network learning by exponential linear units (ELUs). CoRR vol. abs/1511.07289, 2015.
Efromovich, S. Dimension reduction and adaptation in conditional density estimation. Journal of the American Statistical Association 105 (490): 761–774, 2010.
Fan, J., Yao, Q., and Tong, H. Estimation of conditional densities and sensitivity measures in nonlinear dynamical systems. Biometrika 83 (1): 189–206, 1996.
Glad, I. K., Hjort, N. L., and Ushakov, N. G. Correction of density estimators that are not densities. Scandinavian Journal of Statistics 30 (2): 415–427, 2003.
Glorot, X. and Bengio, Y. Understanding the difficulty of training deep feedforward neural networks. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics. Proceedings of Machine Learning Research, vol. 9. PMLR, pp. 249–256, 2010.
Hall, P., Racine, J., and Li, Q. Cross-validation and the estimation of conditional probability densities. Journal of the American Statistical Association 99 (468): 1015–1026, 2004.
Hinton, G. E., Srivastava, N., Krizhevsky, A., Sutskever, I., and Salakhutdinov, R. R. Improving neural networks by preventing co-adaptation of feature detectors. CoRR vol. abs/1207.0580, 2012.
Inacio, M., Izbicki, R., and Salasar, L. Comparing two populations using Bayesian Fourier series density estimation. Communications in Statistics - Simulation and Computation, in press.
Ioffe, S. and Szegedy, C. Batch normalization: Accelerating deep network training by reducing internal covariate shift. In Proceedings of the 32nd International Conference on Machine Learning, F. Bach and D. Blei (Eds.). Proceedings of Machine Learning Research, vol. 37. PMLR, Lille, France, pp. 448–456, 2015.
Izbicki, R. and Lee, A. B. Converting high-dimensional regression to high-dimensional conditional density estimation. Electronic Journal of Statistics 11 (2): 2800–2831, 2017.
Izbicki, R. and Lee, A. B. Nonparametric conditional density estimation in a high-dimensional regression setting. Journal of Computational and Graphical Statistics 25 (4): 1297–1316, 2016.
Izbicki, R., Lee, A. B., and Freeman, P. E. Photo-z estimation: An example of nonparametric conditional density estimation under selection bias. The Annals of Applied Statistics 11 (2): 698–724, 2017.
Kingma, D. P. and Ba, J. Adam: A method for stochastic optimization. CoRR vol. abs/1412.6980, 2014.
Kreyszig, E. Introductory Functional Analysis with Applications. Wiley, 1989.
Nugteren, C. and Codreanu, V. CLTune: A generic auto-tuner for OpenCL kernels. In 2015 IEEE 9th International Symposium on Embedded Multicore/Many-core Systems-on-Chip. pp. 195–202, 2015.
Scricciolo, C. Convergence rates for Bayesian density estimation of infinite-dimensional exponential families. The Annals of Statistics 34 (6): 2897–2920, 2006.
Sugiyama, M., Takeuchi, I., Suzuki, T., Kanamori, T., Hachiya, H., and Okanohara, D. Conditional density estimation via least-squares density ratio estimation. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics. pp. 781–788, 2010.
Takeuchi, I., Nomura, K., and Kanamori, T. Nonparametric conditional density estimation using piecewise-linear solution path of kernel quantile regression. Neural Computation 21 (2): 533–559, 2009.
Wasserman, L. All of Nonparametric Statistics. Springer, New York, 2006.
Zhang, C., Bengio, S., Hardt, M., Recht, B., and Vinyals, O. Understanding deep learning requires rethinking generalization. CoRR vol. abs/1611.03530, 2016.
Published
22/10/2018
How to Cite
INÁCIO, M. H. de A.; IZBICKI, Rafael. Conditional density estimation using Fourier series and neural networks. In: SYMPOSIUM ON KNOWLEDGE DISCOVERY, MINING AND LEARNING (KDMILE), 6., 2018, São Paulo/SP. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2018. p. 41–48. ISSN 2763-8944. DOI: https://doi.org/10.5753/kdmile.2018.27383.