Davoudi, Anis, Kumar Rohit Malhotra, Benjamin Shickel, Scott Siegel, Seth Williams, Matthew Ruppert, Emel Bihorac, Tezcan Ozrazgat-Baslanti, Patrick J. Tighe, Azra Bihorac, and Parisa Rashidi
Scientific Reports, 9, Article number: 8020 (2019).
Publication year: 2019

Currently, many critical care indices are not captured automatically at a granular level; instead, they are assessed repetitively by overburdened nurses. In this pilot study, we examined the feasibility of using pervasive sensing technology and artificial intelligence for autonomous and granular monitoring in the Intensive Care Unit (ICU). As an exemplary prevalent condition, we characterized delirious patients and their environment. We used wearable sensors, light and sound sensors, and a camera to collect data on patients and their environment. We analyzed the collected data to detect and recognize patients' faces, postures, facial action units and expressions, head pose variation, extremity movements, sound pressure levels, light intensity levels, and visitation frequency. We found that facial expressions, functional status (extremity movement and postures), and environmental factors, including visitation frequency and nighttime light and sound pressure levels, differed significantly between delirious and non-delirious patients. Our results show that granular and autonomous monitoring of critically ill patients and their environment is feasible with a noninvasive system, and we demonstrated its potential for characterizing critical care patients and environmental factors.
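As a minimal sketch of one step in such a camera-based monitoring pipeline, the snippet below detects faces in a single video frame using OpenCV's bundled Haar-cascade detector. The file names, detector choice, and thresholds are illustrative assumptions only, not the detection model or parameters used in the study.

```python
# Illustrative only: locate faces in one video frame with OpenCV's
# pre-trained frontal-face Haar cascade (not the study's actual model).
import cv2


def detect_faces(frame_bgr):
    """Return bounding boxes (x, y, w, h) for faces found in a BGR frame."""
    cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    detector = cv2.CascadeClassifier(cascade_path)
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # scaleFactor and minNeighbors are common defaults, not values from the paper.
    return detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)


if __name__ == "__main__":
    frame = cv2.imread("frame.jpg")  # hypothetical frame grabbed from the room camera
    for (x, y, w, h) in detect_faces(frame):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imwrite("frame_annotated.jpg", frame)
```

In practice, boxes like these would feed downstream steps such as facial action unit and head pose estimation; this sketch covers only the initial localization.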