Emotions in Motion: How Movements Can Signal Mental Health Problems

Biomedical engineering senior Ian Y. Kim wears a motion capture suit during an experiment in the Neuromuscular and Musculoskeletal Biomechanics Lab at UT Dallas.

New University of Texas at Dallas research shows the potential to detect mental health disorders by analyzing the way people move.

Using 3D motion capture and machine learning models, researchers were able to identify symptoms of depression and high anxiety in subjects from the way they walked and rose from a chair. The findings, published online in the May issue of Gait & Posture, show the potential to develop wearable devices that could one day give users early warnings about their mental health.

“Our study shows that depression and anxiety can be identified from human movement,” said Dr. Gu Eon Kang, assistant professor of bioengineering at the Erik Jonsson School of Engineering and Computer Science. “Gait analysis may offer an objective method for evaluating mental health.”

The computer screen shows the trajectory of the marker during the motion capture experiment.

Using gait analysis to identify mental health problems and emotional states is an emerging area of research. Kang cautions, however, that wearable mental health screening technology would be an indicator, not a substitute, for a professional diagnosis.

“If we can detect potential problems, people can seek treatment early and the outcome can be better,” he said.

He said the research could have other applications, such as helping animators better represent emotions.

Researchers at Kang’s Neuromuscular and Musculoskeletal Biomechanics Laboratory recruited 30 young adults for the study and assessed their levels of depression and anxiety using a standardized questionnaire. Participants performed walking and sit-to-walk tasks while wearing a form-fitting black motion capture suit covered with 68 reflective markers. Their movements were recorded by a 16-camera motion capture system.

The researchers trained a machine learning model using data from the participants’ movements combined with information about whether they fell into higher or lower symptom groups for depression or anxiety, based on their questionnaires. Then, they asked the model to predict the mental state of other participants whose data the system hadn’t been trained on.

The model correctly classified participants using the walking task about 75% of the time and using the sit-to-walk task about 77% of the time.
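The evaluation described above, training a classifier on some participants' movement data and testing it on participants the model has never seen, can be sketched in a few lines. This is a hypothetical illustration only: the synthetic features, the nearest-centroid classifier, and the leave-one-out scheme are assumptions, not the study's actual pipeline.

```python
import random

random.seed(0)

# 30 hypothetical participants, each summarized by 4 made-up gait features;
# group 1 ("higher-symptom") gets a subtle shift in the first feature.
def make_participant(group):
    features = [random.gauss(1.0 if (group == 1 and i == 0) else 0.0, 1.0)
                for i in range(4)]
    return features, group

data = [make_participant(g) for g in [0, 1] * 15]

def centroid(rows):
    # Mean of each feature across a group's participants
    return [sum(col) / len(rows) for col in zip(*rows)]

def dist2(a, b):
    # Squared Euclidean distance between two feature vectors
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Leave-one-out evaluation: each participant is classified by a model
# built only from the other participants, mirroring the study's setup
# of predicting people whose data the system wasn't trained on.
correct = 0
for i, (x, label) in enumerate(data):
    rest = data[:i] + data[i + 1:]
    cents = {g: centroid([f for f, lab in rest if lab == g]) for g in (0, 1)}
    pred = min(cents, key=lambda g: dist2(x, cents[g]))
    correct += pred == label

accuracy = correct / len(data)
print(f"held-out accuracy: {accuracy:.2f}")
```

Because the separation between groups is deliberately subtle, the held-out accuracy lands well above chance but far from perfect, which is the regime the reported 75-77% figures occupy.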

Angeloh Stout BS’20, MS’23, a biomedical engineering doctoral student in Kang’s lab and first author of the study, said that as a former high school athlete, she became interested in the research after taking Kang’s biomechanics class. Stout, who helped set up Kang’s lab, said she was surprised by the changes in how people walk depending on their emotional state.

A researcher prepares a reflective marker that will be attached to a motion capture suit.

“You expect people to walk slowly when they are sad. But what is interesting to see is the different body responses that occur,” Stout said. “Students with higher depression and anxiety scores showed subtle but measurable differences from students with lower scores in how their joints moved, and greater hesitation during transitions such as standing up to walk.”

Stout and other researchers in Kang’s lab published another new study, in the February issue of Gait & Posture, that found a person’s gait can also provide insight into their emotional state.

In the study, participants were asked to recall memories of five emotions: anger, sadness, joy, fear and a neutral state. Then, researchers captured their gait patterns in the lab. The model was approximately 59% accurate in distinguishing between emotional states and detected sadness with an accuracy of 66%.

While additional research with more subjects is needed to refine the method, Kang said the team’s study demonstrates the possibility of objectively measuring human emotions.

“Believe it or not, gait may be the most reliable modality for detecting emotion,” he said.

Kang said he became interested in studying gait as a way to combine his longtime interests in psychology and engineering. His goal is to determine whether gait analysis can also provide early warning of other mental health problems, such as bipolar disorder, and neurodevelopmental conditions such as attention-deficit/hyperactivity disorder.

From left: Dr. Gu Eon Kang, Mehreen Dawood, Ashley T. Adams, Angeloh Stout BS’20, MS’23, Mrigank Maharana, Ian Y. Kim, Luke Fisanick and Marvin Alvarez BS’25.

Other authors of the study to detect depression and anxiety through gait analysis include several biomedical engineering students: graduate student Marvin Alvarez BS’25 and seniors Macie Kauffman, Mehreen Dawood, Ashley T. Adams, Ian Y. Kim, Mrigank Maharana and Lukas Fisanick, who conducted related research through the Hobson Wildenthal Honors College Research Apprentice Program. Kaye Mabbun MS’25, Dr. Yunhui Guo, assistant professor of computer science, and Dr. Chuan-Fa Tang, assistant professor of mathematics in the School of Natural Sciences and Mathematics, also contributed.

In addition to Kang, co-authors of the study gauging emotions through gait analysis include Justin MacNeal Cadenhead BS’23, MS’24; Maharana; Ashley Guzman BS’23, MS’24; Kauffman; and Katherine Brown PhD’21, assistant professor of bioengineering. The UT Dallas researchers collaborated with colleagues from the McGovern Medical School at UT Health Houston.

This research was funded by the National Science Foundation (grant 2513070).
