From ASD to mobile health: new HCESC director Jim Rehg develops computational tools for health-related behaviors


Written by Michael O'Boyle

Jim Rehg, professor of computer science

In 2008, Jim Rehg was a computer science professor at the Georgia Institute of Technology and an expert in computer vision, the field in which computers are used to make sense of images and video. Around that time, he became close with a colleague who described the challenges of seeking diagnoses and assistance for their two children with autism.

Autism spectrum disorder, or ASD, is diagnosed behaviorally: a specialist clinician observes a patient at length and then monitors them regularly to assess treatments. Those seeking a diagnosis often wait for months and face further delays in receiving assistance and treatment.

“As my colleague was describing this, it struck me that this is something that computer vision can help with,” Rehg said. “Analyzing videos of people to identify different behaviors is a classic vision problem, so why not take advantage of it to develop tools that can help identify behaviors associated with ASD so people can get the assistance they need?”

Now, Rehg is a world leader in the application of computer vision and machine learning to developmental and social psychology, ASD research, and mobile health devices. He has led a National Science Foundation Expedition in Computing and several National Institutes of Health initiatives to study social behavior and communication using computational tools. He is currently the deputy director of the NIH-funded mDOT Center for developing data science tools for mobile health sensors and devices, and he has co-edited a book on mobile health.

As of August 2023, he is also the director of the Health Care Engineering Systems Center and a Founder Professor in computer science and industrial & enterprise systems engineering at The Grainger College of Engineering, University of Illinois Urbana-Champaign.

“Grainger Engineering is a powerhouse in all things computer science and engineering, and the U. of I. has world-class programs in psychology and behavioral science,” Rehg said. “And now that the university has invested in a medical school, this is an ideal environment for conducting research on health-related behaviors and interventions. We are uniquely poised to drive innovations in AI and sensor technology that will directly impact people’s lives.”

‘Seeing’ ASD

To develop software that can discern social behaviors from video, it is first necessary to assemble a vast data set that captures as many nuances of those behaviors as possible. This was Rehg’s goal for the first stage of his research into ASD recognition.

“We started out by focusing on children playing repetitive social games like peekaboo and seeing if we could tell whether an interaction was occurring and quantify the engagement,” he said. “That paved the way for the first large-scale data sets we collected in the NSF Expedition. But a large, multi-university collaboration was just the beginning.”

Before the NSF Expedition, the question of quantifying behaviors associated with autism from video and audio was largely unexplored. The team was able to get a handle on the scope of the problem and determine what questions should be asked. Rehg’s group went on to collect more significant data sets with additional funding from the Simons Foundation.

Rehg added, “I’m especially proud that our first NIH award in ASD was reviewed by a panel broadly focused on developmental disabilities, as it provides validation of the value of our technology by the core ASD research community.”

Recently, Rehg’s research group received a second NIH award focused on developing diagnostics and treatments for minimally verbal children with ASD.

“In research, you always have to pick your battles, and I’ve focused on ASD because it’s one of the most common developmental disabilities,” he said. “In particular, those in the minimally verbal subpopulation are at higher risk for poor outcomes and remain severely under-studied. If we could pinpoint those at risk for being minimally verbal, then we could mount more significant interventions earlier.”

Sensors and big data

Four wearable sensors connected to a smartphone that can provide sources of health data. Counterclockwise from top left: AutoSense, providing ECG, respiration, and acceleration data; EasySense, providing heart motion, lung motion, and lung fluid level data; iShadow Eyeglass, with an outward-facing video camera and an inward-facing infrared camera; and a smartwatch with accelerometers and gyroscopes. (doi: 10.1093/jamia/ocv056)

While Rehg was developing screening and outcome measures for ASD, he met researchers who were studying health-related behaviors using other wearable sensing technologies.

“Vision is a valuable tool for studying behavior in context, but there are other variables like heart rate, oxygen saturation and step counts that constitute an important dimension of health,” he said. “While commercial devices like the Fitbit provided activity tracking capabilities, the broader adoption of mobile sensing was still in its infancy. I was fortunate to be able to collaborate with mobile health researchers in developing new ways to incorporate wearable sensor data into healthcare.”

When the NIH called for research applying big data methods to health care, Rehg became the deputy director of the Center of Excellence for Mobile Sensor Data-to-Knowledge. Most recently, he has served as deputy director of an NIH Biomedical Technology Research Resource Center promoting the development of mobile sensing devices and their use across all fields of health.

Turning to HCESC, Rehg said, “We have an opportunity to directly engage with populations across the state that can directly benefit from the use of AI and sensor technology to manage their health and improve outcomes. We are fortunate to have excellent clinical partners such as OSF HealthCare as well as public health collaborators. Through these partnerships, the center can catalyze mobile health research in both academic and clinical settings.”

A methods lab first

Although Rehg is spearheading the application of computer vision and machine learning to health care, he considers himself a methodological researcher first.

“My lab can’t just develop health applications,” he said. “It has to have a strong methodological core. The engine of progress in addressing health and behavior questions is AI and the tools it provides for vision and sensing, and the best way to use those tools is to invent them.”

He maintains a strong research profile in computer vision, and he encourages students to work on fundamental methods in machine learning and computer vision while trying to increase their involvement in the health care and behavioral science research communities.

“That’s a big reason I jumped on the opportunity to come to Illinois,” Rehg said. “The students and researchers in my group will only get interdisciplinary exposure if they have the opportunity to interact with psychologists, behavioral scientists and other health researchers. U. of I. has the community I am looking for and, more importantly, a passion for and a desire to invest in health.”


This story was published February 20, 2024.