University of South Florida

Medicine, engineering researchers use facial expression software to help measure pain felt by newborns

For generations, nurses tending to newborns have been able to tell the subtle difference between a baby’s cry of hunger and that of pain.

That ability to distinguish those cries is now being combined with continuous facial expression recognition software, in hopes of giving health care providers a new, more precise way to gauge whether a baby is experiencing pain or simply needs a diaper change.

Neonatal experts in the USF Health Morsani College of Medicine are partnering with facial expression recognition experts in the USF College of Engineering to build a dataset that combines measurements captured by facial expression recognition software with assessments from nurses who have years of training and on-the-job experience using the Neonatal Infant Pain Scale (NIPS).

“Our intent is to develop a methodology and technology to allow us to better detect when the patients we are caring for experience pain,” said Terri Ashmeade, MD, professor of pediatrics in the USF Health Morsani College of Medicine and chief quality officer for USF Health.

“Babies hospitalized in the NICU experience many painful procedures, and research has shown that these painful experiences are associated with altered development of the infants’ brains and can impact them long term. Babies cannot tell us when they are experiencing pain, or how intense their pain might be. So the most important thing about this research is that, by coupling computer vision technology with vocal responses, we can have a fuller understanding of what our patients are experiencing and know when we should intervene. And that precision in knowing when they are feeling pain would prevent us from exposing babies to medications they don’t need.”

Dr. Terri Ashmeade at Tampa General Hospital’s Jennifer Leigh Muma NICU.

The preliminary study looked at 53 infants in the Jennifer Leigh Muma Neonatal Intensive Care Unit at Tampa General Hospital. Using small video cameras attached to infant incubators, the researchers collected footage of the young patients before, during and after scheduled procedures and interventions. The footage was later examined with facial expression analysis software and coupled with vital signs measured in sync with the video, with the audio collected alongside it, and with near-infrared spectroscopy (NIRS), which measures oxygen levels in the brain.
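The article does not publish the team’s data format, but the core idea of coupling each video frame with the signals measured in sync with it can be pictured as a time-aligned record. The following Python sketch is purely illustrative; every field name is an assumption, not the study’s actual schema.

    from dataclasses import dataclass
    from typing import Optional
    import numpy as np

    # One time-aligned record per sampling instant. Optional fields
    # reflect that a sensor may drop out (e.g., a NIRS probe adjusted
    # during routine care).
    @dataclass
    class InfantObservation:
        timestamp_s: float                   # seconds from start of recording
        video_frame: np.ndarray              # frame from the incubator camera
        audio_chunk: np.ndarray              # synchronized audio samples
        heart_rate: Optional[float] = None   # vital sign, beats per minute
        nirs_oxygen: Optional[float] = None  # NIRS brain-oxygenation reading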

All of those datasets – facial expressions, body movements, sounds of crying and vital signs – were combined and then matched against the nurses’ own professional judgment of what particular cries and facial expressions mean, expressed as a NIPS score. The resulting system could provide a NICU tool that constantly monitors a baby and alerts the health care team when there is evidence the baby is in distress from pain. Currently, these NICU-skilled nurses typically assess the infants hourly to assign a NIPS score; the new technology would offer round-the-clock monitoring.
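One plausible way to combine several signals into a single alert is late fusion: each modality produces its own pain estimate, and the estimates are averaged. The Python sketch below assumes each model emits a probability in [0, 1]; the weights, threshold, and modality names are placeholders, since the article does not describe the team’s actual fusion rule.

    from typing import Mapping, Optional

    # Placeholder weights and alert threshold (assumptions, not the
    # study's values).
    WEIGHTS = {"face": 0.4, "cry": 0.2, "body": 0.2, "vitals": 0.2}
    ALERT_THRESHOLD = 0.6

    def fused_pain_score(scores: Mapping[str, Optional[float]]) -> Optional[float]:
        """Weighted average over the modalities that produced a score,
        so a missing signal (e.g., an occluded face) degrades gracefully."""
        available = {m: s for m, s in scores.items() if s is not None}
        if not available:
            return None
        total_weight = sum(WEIGHTS[m] for m in available)
        return sum(WEIGHTS[m] * s for m, s in available.items()) / total_weight

    def should_alert(scores: Mapping[str, Optional[float]]) -> bool:
        fused = fused_pain_score(scores)
        return fused is not None and fused >= ALERT_THRESHOLD

    # Example: the body-movement model produced no score for this window.
    print(should_alert({"face": 0.8, "cry": 0.7, "body": None, "vitals": 0.5}))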

Cameras continually monitor the newborns.

This new use of computer vision and pattern recognition adds another dimension to existing software, said Rangachar Kasturi, PhD, the Douglas W. Hood Professor in the Department of Computer Science and Engineering, USF College of Engineering.

“USF’s expertise in computer vision and pattern recognition is well known, so naturally we have a strong interest in using it to help this population,” Dr. Kasturi said.

“The key difference here is that we’re not trying to recognize or identify a face, we are measuring the baby’s muscle movement and how their creases and lines move, to determine if they are experiencing pain. We are comparing the nurses’ scores with those we get from the technology to determine how accurate our scores are. We want to replicate what these talented nurses do so the babies can be constantly monitored.”
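Validating the technology against the nurses’ judgments can be pictured as a simple agreement check. In the hypothetical Python sketch below, each assessment is binarized to pain / no pain; the labels and the use of Cohen’s kappa (a standard chance-corrected agreement statistic) are illustrative assumptions, not details reported here.

    from sklearn.metrics import accuracy_score, cohen_kappa_score

    # Hypothetical per-assessment labels: 1 = pain, 0 = no pain. The
    # cutoff used to binarize a NIPS score is an assumption here.
    nurse = [1, 0, 0, 1, 1, 0, 1, 0]    # nurse NIPS-based judgments
    system = [1, 0, 1, 1, 1, 0, 0, 0]   # automated multimodal judgments

    print("raw agreement:", accuracy_score(nurse, system))
    print("chance-corrected agreement:", cohen_kappa_score(nurse, system))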

USF engineering professor Dr. Rangachar Kasturi and USF doctoral student Ghada Zamzmi. Photo by Ryan Noone.

When it comes to gauging facial expressions, capturing their known meanings in babies can be difficult, said Ghada Zamzmi, a doctoral student in the USF Department of Computer Science and Engineering.

“There are common expressions such as happy, sad, and angry that we know in adults, but those cannot be applied to newborns,” Zamzmi said. “In this study, we are capturing the facial muscle movements in video, or optical flow, and classifying them as relating to pain or no pain. In addition to facial expression, we are automatically analyzing other signals such as sounds, body movement, and heart rate to increase the reliability of detecting pain in case of missing data. We believe developing an automated multimodal system can provide a continuous and quantitative assessment of infants’ pain and lead to improved outcomes.”
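The quote names optical flow as the underlying measurement but not the rest of the pipeline. As a minimal sketch, assuming a dense Farnebäck flow from OpenCV and a simple two-feature support-vector classifier from scikit-learn (both choices are assumptions, not the team’s published method), the facial-motion step might look like this:

    import cv2
    import numpy as np
    from sklearn.svm import SVC

    def flow_features(prev_gray: np.ndarray, gray: np.ndarray) -> np.ndarray:
        """Dense optical flow between consecutive grayscale face frames;
        mean and peak flow magnitude stand in for how much the facial
        muscles moved between frames."""
        flow = cv2.calcOpticalFlowFarneback(
            prev_gray, gray, None,
            pyr_scale=0.5, levels=3, winsize=15,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
        magnitude = np.linalg.norm(flow, axis=2)
        return np.array([magnitude.mean(), magnitude.max()])

    # Hypothetical training set: one feature row per video segment,
    # labeled 1 (pain) or 0 (no pain) from synchronized nurse assessments.
    X_train = np.array([[0.2, 1.1], [2.8, 9.4], [0.4, 1.9], [3.1, 8.7]])
    y_train = np.array([0, 1, 0, 1])

    classifier = SVC(kernel="rbf").fit(X_train, y_train)
    # At monitoring time, features from the live feed would be passed
    # to classifier.predict to label each segment as pain / no pain.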

This type of technology and assessment could be used beyond the NICU, including for any patient who is not able to communicate directly with their health care team about whether or not they’re experiencing pain, such as elderly patients with dementia, Dr. Ashmeade said.

NICU babies are some of the most vulnerable and require multiple medical procedures – even surgeries – that are painful, Dr. Ashmeade said.

Babies may require multiple medical procedures while in the neonatal intensive care unit.

“These newborns, many of them born prematurely, cannot communicate their feelings, which is why the nursing staff have become the go-to experts for gauging the babies’ needs,” she said. “While we have had many successes in neonatal care and improving survival of our babies, what we really want to focus on is a great outcome. Anything we can do to foster appropriate development, especially of the brain, is what we want for these babies.”

In addition to Dr. Ashmeade, Dr. Kasturi, and Zamzmi, researchers on the study included Chih-Yun Pai, Dr. Dmitry Goldgof, and Dr. Yu Sun. This preliminary research was supported, in part, by a 2016 USF Women’s Health Collaborative Seed Grant. The team has applied for further funding from the National Institutes of Health and expects to hear by next fall whether an expanded study is approved. In June, the research will be presented in Norway at the Scandinavian Conference on Image Analysis, which is sponsored by the International Association for Pattern Recognition.

Story by Sarah Worth, photos by Eric Younghans, USF Health Communications
