A Convolutional Neural Network-based Mobile Application to Bedside Neonatal Pain Assessment
Abstract
More than 500 painful interventions are carried out during the hospitalisation of a newborn baby in a neonatal intensive care unit. Since neonates are not able to verbally communicate pain, several studies have sought to identify the presence and intensity of pain through behavioural analysis, mainly of facial expression. These studies allow a better understanding of the painful experience faced by the neonate. In this context, this work proposes and implements a mobile application for smartphones that uses Artificial Intelligence (AI) techniques to automatically identify the facial expression of pain in neonates, demonstrating feasibility in real clinical situations. First, a Convolutional Neural Network architecture was adapted and trained with face images captured before and after painful clinical procedures carried out routinely. Then, this computational model was optimised for a mobile environment to make it practical for everyday use. Moreover, we used an explainable AI method to identify facial regions that might be relevant to pain assessment. Our results showed that it is possible to classify the facial expression of pain in neonates with high accuracy. Additionally, our methodology produced novel results, highlighting facial regions that agree with pain scales used by neonatologists and with the visual perception of adults, whether health professionals or not, when assessing pain in neonates.
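The abstract outlines three technical steps: training a CNN classifier on neonatal face images, optimising the model for mobile deployment, and applying an explainable AI method to highlight relevant facial regions. The sketch below is not the authors' implementation; it is a minimal illustration of that kind of pipeline in TensorFlow/Keras, using an assumed small CNN, TensorFlow Lite conversion as one possible mobile optimisation, and Grad-CAM as one possible explainability method. All layer names, sizes, and file names are hypothetical.

```python
# Hedged sketch of a pain/no-pain pipeline: small CNN, TFLite export, Grad-CAM.
# Architecture, input resolution, and layer names are assumptions for illustration.
import numpy as np
import tensorflow as tf

IMG_SIZE = 128  # assumed input resolution


def build_model() -> tf.keras.Model:
    """Small CNN for binary (pain / no pain) classification of face crops."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(IMG_SIZE, IMG_SIZE, 3)),
        tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu", padding="same"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(128, 3, activation="relu", padding="same",
                               name="last_conv"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])


def to_tflite(model: tf.keras.Model) -> bytes:
    """Convert the trained model to TensorFlow Lite for on-device inference."""
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]  # weight quantisation
    return converter.convert()


def grad_cam(model: tf.keras.Model, image: np.ndarray,
             conv_layer: str = "last_conv") -> np.ndarray:
    """Grad-CAM heatmap: which facial regions contributed most to the pain score."""
    grad_model = tf.keras.Model(
        model.inputs, [model.get_layer(conv_layer).output, model.output])
    x = tf.convert_to_tensor(image[np.newaxis], dtype=tf.float32)
    with tf.GradientTape() as tape:
        conv_out, pred = grad_model(x)
        score = pred[:, 0]
    grads = tape.gradient(score, conv_out)        # d(score) / d(feature map)
    weights = tf.reduce_mean(grads, axis=(1, 2))  # per-channel importance
    cam = tf.nn.relu(tf.reduce_sum(conv_out[0] * weights[0], axis=-1))
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()  # normalised to [0, 1]


if __name__ == "__main__":
    model = build_model()
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    # Placeholder random data; the paper uses face images captured before and
    # after routine painful procedures.
    x = np.random.rand(8, IMG_SIZE, IMG_SIZE, 3).astype("float32")
    y = np.random.randint(0, 2, size=(8, 1))
    model.fit(x, y, epochs=1, verbose=0)
    with open("pain_classifier.tflite", "wb") as f:  # hypothetical output file
        f.write(to_tflite(model))
    print("Grad-CAM shape:", grad_cam(model, x[0]).shape)
```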
Keywords:
Graphics, Pediatrics, Pain, Computer architecture, Mobile applications, Convolutional neural networks, Artificial intelligence
Published
18/10/2021
How to Cite
CARLINI, Lucas P. et al. A Convolutional Neural Network-based Mobile Application to Bedside Neonatal Pain Assessment. In: CONFERENCE ON GRAPHICS, PATTERNS AND IMAGES (SIBGRAPI), 34., 2021, Online. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2021.