Laura Kelly-Hunter, May 1, 2023
Photo credit: James Green
Applying Artificial Intelligence to Infant Patient Monitoring
Since the development of artificial intelligence (AI) and machine learning (ML) technologies began years ago, the ways we live and work have evolved rapidly.
Whether it is incorporating new tools into the workday like ChatGPT, a text-generating chatbot that automatically drafts communications, or playing a motion-sensing video game console like the Nintendo Wii, which detects your movements as you interact with the game, AI and related technologies can enhance many areas of our lives.
Using similar technologies, Carleton University Systems and Computer Engineering researcher James Green and his students are applying machine learning to improve patient care in the currently overstretched health-care system.
Specifically, Green is developing patient monitoring algorithms to track patient vital signs, movements, and clinical interventions in real time. This system uses video-based computer vision to monitor individuals without physically touching them.
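Green's own algorithms are not described in detail here, but one common noncontact approach to vital signs, remote photoplethysmography, recovers heart rate from the subtle periodic brightness changes of skin in video. The sketch below illustrates the idea on a synthetic brightness signal; the function name, thresholds, and frequency band are illustrative assumptions, not the lab's actual method.

```python
import numpy as np

def estimate_heart_rate_bpm(brightness, fps, lo_bpm=90, hi_bpm=200):
    """Estimate heart rate from a per-frame mean-brightness signal.

    Each heartbeat causes tiny skin-color changes, which appear as a
    dominant frequency in the brightness of a skin region over time.
    lo_bpm/hi_bpm bound the search to a plausible infant heart-rate band
    (illustrative values, not clinical constants).
    """
    signal = np.asarray(brightness, dtype=float)
    signal -= signal.mean()                      # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= lo_bpm / 60.0) & (freqs <= hi_bpm / 60.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                      # Hz -> beats per minute

# Synthetic 10-second clip at 30 fps with a 120 bpm pulse plus noise.
fps = 30
t = np.arange(10 * fps) / fps
rng = np.random.default_rng(0)
brightness = (0.02 * np.sin(2 * np.pi * (120 / 60.0) * t)
              + 0.005 * rng.standard_normal(t.size))
print(round(estimate_heart_rate_bpm(brightness, fps)))  # -> 120
```

A real system would first locate a stable skin region (e.g. via face detection) and filter out motion artifacts before spectral analysis; this sketch assumes that signal is already available.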
Moving Toward Noncontact Patient Monitoring
Green was motivated to begin researching noncontact patient monitoring after two of his children were admitted to the Neonatal Intensive Care Unit (NICU). He realized that existing patient monitors generate a high number of false alarms, and that the wired sensors interfere with normal parental bonding. Furthermore, he observed that nurses spend a lot of time on onerous charting, where everything from a diaper change to feeding and dressing must be recorded in the patient's file.
“I recognized that the number of false alarms led to fatigue among clinical staff and parents, and that wired sensors prevent parents from holding their baby naturally,” said Green. “By moving towards non-contact patient monitoring technologies and real-time scene understanding, we hope to reduce false alarms, simplify parental bonding, and potentially streamline charting of clinical notes.”
“I also noticed that a typical infant patient’s monitoring was based on multiple wired sensors attached to various places on their body,” said Green, “which can be not only cumbersome but also irritating to their fragile skin, especially for those in a precarious condition.”
With wired sensors, alarms are easily triggered when a patient moves, and about 50% of all patient alarms turn out to be false. Due to this high volume of alarms, nurses can experience alarm fatigue, or desensitization, leading to delayed responses to care.
By using computer vision to enable noncontact monitoring, the NICU could eventually implement semi-automated reporting without wired sensors, avoiding both inaccurate retrospective charting and false sensor alarms.
Detecting Infant Patient Care Needs
In the beginning, Green’s team faced challenges starting research at the NICU. Seeking data from real infant patients admitted to the NICU meant data collection would need to be stretched over several years (2016-2020).
“It takes significant time to study patients since clinical staff can be limited and are very occupied with patients,” said Green, “which can make it difficult to implement testing.”
Given the complex research, Green was grateful to have excellent collaborators at CHEO including neonatologist Dr. JoAnn Harrold, Director of Clinical Engineering Kim Greenwood (a Carleton alumnus), and NICU nurse Cheryl Aubertin. Green’s research was funded by both IBM and the Natural Sciences and Engineering Research Council of Canada (NSERC).
Thanks to support from partners at the hospital, Green’s lab developed new ways to collect data without touching the patient including face detection, pose estimation (under the blankets), movement detection, recognition of feeding events, and estimation of vital signs including heart rate and respiration rate.
This data can quickly answer questions like: Is the infant in bed? Is someone with them? Are they moving, and if so, how? These are all questions that are difficult to answer with existing infant monitoring systems.
With computer vision, the computer can analyze video data to produce real-time notes, which can later be reviewed and added to by health-care staff, saving them time that is better spent interacting with patients.
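To make the "real-time notes" idea concrete, one simple design is to map timestamped detection events to draft chart lines that staff later review and amend. The event names and templates below are hypothetical illustrations of the concept, not the actual output of Green's system.

```python
from datetime import datetime, timedelta

def draft_chart_notes(events):
    """Turn timestamped detection events into draft chart lines
    for a nurse to review, edit, and sign off on.

    events: iterable of (datetime, event_kind) pairs.
    Unknown event kinds are skipped rather than guessed at.
    """
    templates = {
        "out_of_bed": "Infant removed from bed.",
        "in_bed": "Infant returned to bed.",
        "caregiver_present": "Caregiver present at bedside.",
        "feeding_start": "Feeding event started.",
        "feeding_end": "Feeding event ended.",
    }
    return [f"[{ts:%H:%M}] {templates[kind]}"
            for ts, kind in events if kind in templates]

t0 = datetime(2023, 5, 1, 9, 0)
events = [
    (t0, "caregiver_present"),
    (t0 + timedelta(minutes=2), "out_of_bed"),
    (t0 + timedelta(minutes=5), "feeding_start"),
]
for line in draft_chart_notes(events):
    print(line)
# [09:00] Caregiver present at bedside.
# [09:02] Infant removed from bed.
# [09:05] Feeding event started.
```

Keeping the notes as drafts rather than final records matches the "semi-automatic" framing in the article: the system saves charting time while clinicians retain final authority over the record.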
“This new technology, in tandem with existing technology, will enhance the accuracy and efficacy of patient monitoring,” said Green. “We are one step closer to a semi-automatic system that uses text generated by the technology to help doctors and nurses assess and diagnose an infant’s condition in less time, allowing for more accurate treatments.”
To learn more about James Green’s current research projects, visit his lab website.