LeanDL HPC Challenge 2025: Applying Large-Scale Model Adaptation Techniques

  • Kenzo Miranda Sakiyama (USP)
  • Magaly Lika Fujimoto (USP)
  • René Vieira Santin (USP)
  • Solange Oliveira Rezende (USP)

Abstract

The rapid expansion of academic literature presents significant challenges for manual analysis and categorization, making it difficult to identify key research gaps. In this context, the LeanDL-HPC 2025 challenge aims to automatically classify Brazilian theses and dissertations according to their adherence to state-level strategic themes, while also respecting strict constraints on computational resources such as memory, runtime, and energy. Given these challenges, this work compares several efficient approaches for adapting modern models, including large language models (LLMs) and BERT-style encoders, under resource constraints. Specifically, it explores Parameter-Efficient Fine-Tuning (PEFT) through QLoRA, which reduces memory consumption by combining 4-bit quantization with low-rank adapters, and ModernBERT, a recent improvement on the traditional transformer encoder. In addition, the experiments employ a Balanced Loss Function to mitigate class imbalance by penalizing the misclassification of minority labels more heavily.
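The QLoRA recipe named above can be illustrated with a short sketch. The following is a minimal, hypothetical configuration using the Hugging Face transformers, peft, and bitsandbytes libraries; the base model name, label count, and LoRA hyperparameters (rank, alpha, target modules) are illustrative assumptions, not the authors' reported setup.

```python
# Minimal QLoRA sketch: 4-bit quantized base model + trainable low-rank adapters.
# Model name, num_labels, and LoRA hyperparameters are illustrative assumptions.
import torch
from transformers import AutoModelForSequenceClassification, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# 4-bit NF4 quantization of the frozen base weights (the "Q" in QLoRA).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base = AutoModelForSequenceClassification.from_pretrained(
    "meta-llama/Llama-3.2-1B",   # hypothetical base LLM
    num_labels=2,                # adherent vs. non-adherent (assumed binary task)
    quantization_config=bnb_config,
)
base = prepare_model_for_kbit_training(base)

# Low-rank adapters are the only trainable parameters (the "LoRA" part).
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections, a common choice
    task_type="SEQ_CLS",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # typically well under 1% of the base model
```

Because only the adapter weights receive gradients and the base weights are stored in 4 bits, both optimizer state and parameter memory shrink substantially, which is what makes the approach viable under the challenge's resource limits.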
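The Balanced Loss Function is described only at a high level; one common realization is inverse-frequency class weighting in the cross-entropy loss. The sketch below assumes that interpretation, with hypothetical label counts; the paper's exact formulation may differ.

```python
# Sketch of a class-weighted ("balanced") cross-entropy loss.
# Label counts are hypothetical; weights follow inverse-frequency weighting.
import torch
import torch.nn as nn

counts = torch.tensor([9000.0, 1000.0])          # majority vs. minority class
weights = counts.sum() / (len(counts) * counts)  # rarer class -> larger weight

criterion = nn.CrossEntropyLoss(weight=weights)

logits = torch.randn(4, 2)            # batch of 4 examples, 2 classes
labels = torch.tensor([0, 1, 1, 0])
loss = criterion(logits, labels)      # minority-class errors cost more
```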
Keywords: Large Language Models, Transformers, Fine-tuning, Parameter-Efficient Fine-Tuning, QLoRA, Quantization, Balanced Loss Function, ModernBERT
Published
28/10/2025
SAKIYAMA, Kenzo Miranda; FUJIMOTO, Magaly Lika; SANTIN, René Vieira; REZENDE, Solange Oliveira. LeanDL HPC Challenge 2025: Applying Large-Scale Model Adaptation Techniques. In: WORKSHOP ON LIGHTWEIGHT EFFICIENT DEEP LEARNING IN HPC ENVIRONMENTS (LEANDL-HPC) - INTERNATIONAL SYMPOSIUM ON COMPUTER ARCHITECTURE AND HIGH PERFORMANCE COMPUTING (SBAC-PAD), 37., 2025, Bonito/MS. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2025. p. 139-146.