Improving Medical Image Segmentation with Semantic Mask-Guided Diffusion Models

  • Marcelo Clasen Ribeiro (UFPel)
  • Marilton Sanchotene de Aguiar (UFPel)

Abstract


Image segmentation plays a fundamental role in medical image analysis, especially in Computer-Aided Diagnosis (CAD) systems, by highlighting anatomical structures and assisting physicians in making accurate clinical decisions. Despite the remarkable success of deep learning-based segmentation models, their performance relies heavily on large and diverse annotated datasets, which are often limited in the medical domain due to the high cost and complexity of data acquisition and labeling. To address this limitation, this article proposes a data augmentation pipeline based on semantic mask-guided diffusion models, aiming to synthetically enhance medical image datasets with anatomically consistent samples. The proposed method was evaluated using annotated abdominal CT images from the MICCAI FLARE 2022 Challenge, employing the YOLO11 segmentation network to assess the impact of the augmented data. Experimental results show that the method improved segmentation performance, achieving an mAP@0.5 of 0.792 and an mAP@0.5:0.95 of 0.467, outperforming the baseline model trained solely on real data, which obtained 0.788 and 0.456, respectively. These findings highlight the effectiveness of diffusion-based augmentation in improving the generalization and robustness of segmentation networks. Moreover, the results suggest that this strategy is particularly beneficial for segmenting larger, high-contrast anatomical structures, such as the liver and kidneys, which may guide future research toward structure-aware data synthesis techniques in medical imaging.
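At a high level, the augmentation strategy described in the abstract conditions a generative model on existing segmentation masks to synthesize new, anatomically consistent training pairs, which are then pooled with the real data before training the segmentation network. A minimal sketch of that pipeline is shown below; the diffusion model itself is stubbed out by a hypothetical `synthesize_from_mask` function (a real implementation would run mask-conditioned reverse diffusion), and YOLO11 training is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

def synthesize_from_mask(mask: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a semantic mask-guided diffusion model.

    A real pipeline would denoise an image conditioned on `mask`; here we
    simply emit a distinct mean intensity per label plus noise so the
    sketch stays runnable and self-contained.
    """
    image = np.zeros(mask.shape, dtype=np.float32)
    for label in np.unique(mask):
        image[mask == label] = label / (mask.max() + 1)
    return image + rng.normal(0.0, 0.05, mask.shape).astype(np.float32)

def augment_dataset(real_pairs, n_synthetic):
    """Append synthetic (image, mask) pairs generated from existing masks."""
    augmented = list(real_pairs)
    for i in range(n_synthetic):
        _, mask = real_pairs[i % len(real_pairs)]
        augmented.append((synthesize_from_mask(mask), mask))
    return augmented

# Toy 64x64 "CT slice" with two organ labels (0 = background).
mask = np.zeros((64, 64), dtype=np.int32)
mask[10:30, 10:30] = 1   # e.g. a liver-like region
mask[40:55, 40:55] = 2   # e.g. a kidney-like region
real = [(synthesize_from_mask(mask), mask)]

train_set = augment_dataset(real, n_synthetic=3)
print(len(train_set))  # 1 real pair + 3 synthetic pairs
```

Because each synthetic image reuses a real mask as its annotation, the augmented pairs come labeled for free, which is the key economy of mask-guided synthesis in low-annotation medical settings.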
Published
29/09/2025
RIBEIRO, Marcelo Clasen; AGUIAR, Marilton Sanchotene de. Improving Medical Image Segmentation with Semantic Mask-Guided Diffusion Models. In: BRAZILIAN CONFERENCE ON INTELLIGENT SYSTEMS (BRACIS), 35., 2025, Fortaleza/CE. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2025. p. 49-63. ISSN 2643-6264.