It’s an unfortunate paradox: care that patients receive in the ICU often saves their lives, but spending time there can also hurt their health.
Sedation, prolonged periods in bed, and around-the-clock monitoring by loud machines can affect patients’ cognitive, psychological and physical functioning even after they’ve gone home, Francesca Rinaldo, MD, PhD, told me recently. The condition is called post-intensive care syndrome, and one way to avoid it is to get patients up and moving as soon as practical.
“Getting them moving can reduce the number of days they’re delirious,” Rinaldo, a Stanford Medicine surgery resident, said. “It can reduce muscle wasting and weakness. It can get them off the ventilator faster and reduce things like pressure ulcers.”
To study the relationship between mobility of ICU patients and their outcomes, Rinaldo has teamed with Serena Yeung, PhD, in a project through the Partnership in Artificial Intelligence-Assisted Care at Stanford’s Clinical Excellence Research Center.
Current methods for documenting patient movement are burdensome and ripe for human error, so this team is devising a new way that relies on computer vision technology similar to that in self-driving cars. Sensors in a hospital room capture patient motions as silhouette-like moving images, and a trained algorithm identifies the activity — whether a patient is being moved into or out of bed, for example, or into or out of a chair.
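To make the classification task concrete, here is a toy, rule-based sketch of the kind of decision the system automates (the team's actual approach is a neural network trained on annotated depth video, not rules). Every region, threshold, and frame in this example is a hypothetical illustration: each depth frame is reduced to silhouette bounding boxes, the patient's silhouette is checked against a bed region, and any other silhouettes are counted as assisting staff.

```python
# Toy sketch: rule-based activity detection on silhouette bounding boxes.
# All regions, coordinates, and data below are hypothetical illustrations;
# the published system uses a trained model on depth video, not these rules.

def classify_frame(silhouettes, bed_region):
    """Label one frame given silhouette boxes as (x, y, w, h) tuples.

    The first silhouette is assumed to be the patient; any remaining
    silhouettes are treated as assisting staff.
    """
    if not silhouettes:
        return {"activity": "empty_room", "assistants": 0}
    px, py, pw, ph = silhouettes[0]
    cx, cy = px + pw / 2, py + ph / 2          # patient centroid
    bx, by, bw, bh = bed_region
    in_bed = bx <= cx <= bx + bw and by <= cy <= by + bh
    return {
        "activity": "in_bed" if in_bed else "out_of_bed",
        "assistants": len(silhouettes) - 1,
    }

def detect_bed_exit(frames, bed_region):
    """Scan a frame sequence; return (frame_index, n_assistants)
    for the first in-bed -> out-of-bed transition, or None."""
    prev_in_bed = True
    for i, silhouettes in enumerate(frames):
        label = classify_frame(silhouettes, bed_region)
        if prev_in_bed and label["activity"] == "out_of_bed":
            return i, label["assistants"]
        prev_in_bed = label["activity"] == "in_bed"
    return None

# Hypothetical three-frame clip: patient in bed, a nurse arrives,
# then two staff help the patient out of bed.
BED = (0, 0, 100, 60)
clip = [
    [(40, 30, 20, 20)],                                         # in bed, alone
    [(40, 30, 20, 20), (120, 20, 15, 40)],                      # nurse enters
    [(130, 40, 20, 40), (120, 20, 15, 40), (150, 20, 15, 40)],  # bed exit
]
print(detect_bed_exit(clip, BED))  # -> (2, 2): exit at frame 2, two helpers
```

A real pipeline would replace these hand-set rules with a model learned from labeled frames, but the inputs (silhouette-like depth data) and outputs (activity, duration, number of assistants) match what the paper describes.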
“One of the really exciting things about computer vision is that it’s this powerful measuring tool,” said Yeung, who will be joining the faculty of Stanford’s department of biomedical data science this summer. “It can watch what’s happening in the hospital setting continuously, 24/7, and it never gets tired.”
The system, as envisioned by the researchers, will offer several potential benefits. It will document patients’ mobility activities, freeing nurses from a clerical duty, ensuring an accurate record and helping clinicians confirm they delivered the care they intended. It also will provide comprehensive data to support studies of clinical care.
Rinaldo and Yeung are well on their way to these goals. Working with Intermountain LDS Hospital in Salt Lake City, their team installed depth sensors in patient rooms in the adult ICU. There they collected more than 98,000 video frames, which were annotated to identify the activity and used to train the algorithm.
A paper recently published in npj Digital Medicine outlines the algorithm’s success at detecting patients’ movements in and out of bed or a chair, along with the duration of each movement and the number of people assisting.
With the algorithm ready, the next step will be to analyze patient outcomes in the context of their mobility, as tracked by the computer, Rinaldo said. Areas of interest include duration of ICU stay, number of days the patient had delirium (a state marked by confusion, drowsiness and short-term memory problems) and number of days on a ventilator, as well as functional status at discharge.
“There are a lot of barriers to implementing different types of mobility protocols because of perceptions that these patients — who are on ventilator support or getting IV medications — may be too sick to mobilize,” Rinaldo said. “But studies have shown that these protocols are very safe. And one thing I hope our work will do is to change the perceptions around mobility.”
The researchers also plan to fine-tune the algorithm so it can be used in other hospitals, and to train it to recognize a broader range of movements. Yeung told me:
Our ultimate goal is to be able to describe in very detailed fashion all of the care that is being delivered to a patient across all aspects of a hospital and even beyond. It’s really trying to quantify, what is the health care that every patient is receiving? And how can we use the data from computer vision to design the best care practices possible for patients?
Image and video courtesy of Francesca Rinaldo and Serena Yeung