Image Denoising using Attention-Residual Convolutional Neural Networks

  • Rafael Gonçalves Pires (Unesp)
  • Daniel Felipe S. Santos (Unesp)
  • Cláudio Santos (UFSCar)
  • Marcos C. Santana (Unesp)
  • João P. Papa (Unesp)

Abstract


During the image acquisition process, noise is usually added to the data, mainly due to physical limitations of the acquisition sensor and to imprecisions during data transmission and manipulation. The resulting image therefore needs to be processed to attenuate its noise without losing detail. Non-learning-based strategies, such as filter-based methods and noise prior modeling, have been adopted to solve the image denoising problem. Nowadays, learning-based denoising techniques, such as Residual Convolutional Neural Networks, have shown to be much more effective and flexible. Here, we propose a new learning-based non-blind denoising technique named Attention Residual Convolutional Neural Network (ARCNN), and its extension to blind denoising named Flexible Attention Residual Convolutional Neural Network (FARCNN). The proposed methods try to learn the underlying noise expectation using an Attention-Residual mechanism. Experiments on public datasets corrupted by different levels of Gaussian and Poisson noise support the effectiveness of the proposed approaches against some state-of-the-art image denoising methods. ARCNN achieved overall average PSNR gains of around 0.44dB and 0.96dB for Gaussian and Poisson denoising, respectively, while FARCNN presented very consistent results, even with slightly worse performance compared to ARCNN.
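The abstract gives no implementation details, so the sketch below is only a plausible illustration of the general idea it describes: a residual CNN that predicts the noise component of the input, with a learned attention mask modulating that residual before it is subtracted. All layer counts, channel widths, and the attention design are assumptions for illustration, not the authors' ARCNN architecture.

```python
import torch
import torch.nn as nn

class AttentionResidualDenoiser(nn.Module):
    """Hypothetical sketch of an attention-residual denoiser.

    The body estimates the noise residual; a sigmoid-gated spatial
    attention mask weights that residual before it is subtracted
    from the noisy input (the residual connection).
    """

    def __init__(self, channels=1, features=64, depth=5):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(features, features, 3, padding=1),
                       nn.BatchNorm2d(features), nn.ReLU(inplace=True)]
        layers += [nn.Conv2d(features, channels, 3, padding=1)]
        self.body = nn.Sequential(*layers)       # predicts the noise map
        self.attention = nn.Sequential(          # spatial attention over the residual
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Sigmoid(),                        # mask values in [0, 1]
        )

    def forward(self, noisy):
        residual = self.body(noisy)              # estimated noise component
        mask = self.attention(residual)          # where to trust the estimate
        return noisy - mask * residual           # attenuated-noise output

# Usage: denoise a batch of grayscale patches
model = AttentionResidualDenoiser()
noisy = torch.randn(4, 1, 64, 64)
clean_estimate = model(noisy)
```

A non-blind variant like this would be trained per noise level; a blind extension in the spirit of FARCNN would instead be trained across a range of noise levels, with the attention mask helping the network adapt to the unknown corruption strength.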
Keywords: image restoration, deep learning, machine learning
Published
07/11/2020
How to Cite

PIRES, Rafael Gonçalves; SANTOS, Daniel Felipe S.; SANTOS, Cláudio; SANTANA, Marcos C.; PAPA, João P. Image Denoising using Attention-Residual Convolutional Neural Networks. In: CONFERENCE ON GRAPHICS, PATTERNS AND IMAGES (SIBGRAPI), 33., 2020, Online Event. Proceedings [...]. Porto Alegre: Sociedade Brasileira de Computação, 2020. p. 81-87.