A New Risk-Sensitive Deep Learning Optimization Function for Ranking Tasks

  • Pedro H. S. Rodrigues Federal University of Minas Gerais (UFMG)
  • Daniel X. de Sousa Federal Institute of Goiás (IFG)
  • Marcos A. Gonçalves Federal University of Minas Gerais (UFMG)

Abstract


This master's thesis proposes the RiskLoss function to tackle the (hard) problem of incorporating risk-sensitive measures into Deep Neural Networks (DNNs). It includes two adaptations for neural ranking in ad-hoc retrieval and Recommender Systems (RSs): a differentiable loss function and the use of networks' sub-portions, obtained via dropout, as baseline systems for optimizing risk sensitiveness. We empirically demonstrate significant achievements of the RiskLoss function when used with recent DNN methods. In ad-hoc retrieval, RiskLoss presents the most consistent risk-sensitive behavior, reducing losses by 28% over the best evaluated baselines and significantly improving over the risk-sensitive state-of-the-art non-DNN method (by up to 13%), while keeping (or even increasing) overall effectiveness. In RSs, RiskLoss reduced the number of bad recommendations by over 11% for "hard-to-recommend" users. This dissertation produced a full paper in the most important worldwide Information Retrieval (IR) conference (ACM SIGIR Conference on Research and Development in IR 2022, Qualis A1) and a journal paper submitted to Elsevier's Information Processing & Management (Qualis A1), currently in the second round of revisions.
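To make the idea concrete, the sketch below shows a generic risk-sensitive ranking objective in the style the abstract describes: a per-query effectiveness gain over a baseline, with losing queries penalized more heavily (as in the trade-off of Wang et al., 2012). This is an illustrative assumption, not the thesis's actual RiskLoss; the function name `risk_sensitive_loss` and the use of plain numpy are our own simplifications, and in the thesis the baseline effectiveness would come from dropout-derived sub-portions of the network rather than an external system.

```python
import numpy as np

def risk_sensitive_loss(model_eff, baseline_eff, alpha=1.0):
    """Illustrative risk-sensitive objective (hypothetical, not RiskLoss).

    model_eff, baseline_eff: per-query effectiveness values (e.g., NDCG)
    for the full model and for a baseline. In the dropout idea sketched
    in the abstract, baseline_eff would be the effectiveness of a
    sub-network obtained by applying dropout to the same model.
    alpha: how much more heavily losses against the baseline are
    penalized than wins are rewarded.
    """
    delta = model_eff - baseline_eff          # per-query win (+) or loss (-)
    losses = np.clip(-delta, 0.0, None)       # keep only the losing queries
    # Reward average gain, penalize average loss alpha times more;
    # negate so the value can be minimized by gradient descent.
    return -(delta.mean() - alpha * losses.mean())
```

In a DNN setting, the hard part (and the thesis's contribution) is making this kind of objective differentiable end to end, since ranking metrics such as NDCG are not; the sketch above only illustrates the risk/reward trade-off itself.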

Keywords: Risk-sensitiveness, Deep Neural Network, Loss Function

Published: 2024-10-14

RODRIGUES, Pedro H. S.; SOUSA, Daniel X. de; GONÇALVES, Marcos A. A New Risk-Sensitive Deep Learning Optimization Function for Ranking Tasks. In: THESIS AND DISSERTATION CONTEST (CTDBD) - BRAZILIAN SYMPOSIUM ON DATABASES (SBBD), 39., 2024, Florianópolis/SC. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2024. p. 207-211. DOI: https://doi.org/10.5753/sbbd_estendido.2024.240610.