Language sentiment can predict future changes in depression symptoms, study finds
MedicalXpress Breaking News and Events, Sep 18, 2024
A person's choice of words can be predictive of worsening symptoms of major depressive disorder, a new Yale study finds. Using both human evaluators and the large language model ChatGPT, researchers demonstrated that written responses to open-ended questions could be used to predict who would experience worse symptoms of depression weeks later.
The findings, reported Sept. 16 in the Proceedings of the National Academy of Sciences, suggest that automated procedures for assessing language use might complement and enhance psychological evaluations.
A growing body of research has uncovered a link between depression and the language a person uses. People with depression use more negative emotional words on social media and in text messages, for instance. And word choice is associated with how well individuals respond to treatment.
For this study, Yale researchers wanted to explore whether language might also yield insight into someone's future symptoms. To better understand this, they asked 467 participants to complete nine open-ended, neutral short-answer questions and the Patient Health Questionnaire (PHQ-9), which assesses depression severity. Three weeks later, all participants completed the PHQ-9 questionnaire again.
Using a tool called Linguistic Inquiry and Word Count (LIWC)—which can calculate how many words fall into a particular category—the researchers identified how many words in the participants' written responses to the short-answer questions had a positive or negative emotional tone. While LIWC scores were associated with depression severity at the time participants answered the questions, they did not predict depression severity three weeks later, researchers found.
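LIWC itself is a proprietary tool, but the dictionary-counting idea it relies on can be illustrated with a minimal sketch. The word lists and scoring function below are hypothetical stand-ins for illustration only, not the LIWC lexicon or the study's actual scoring procedure.

```python
# Toy illustration of dictionary-based tone counting in the spirit of LIWC.
# The word lists below are hypothetical stand-ins, not the LIWC lexicon.
import re

POSITIVE_WORDS = {"happy", "good", "hope", "love", "enjoy"}      # hypothetical
NEGATIVE_WORDS = {"sad", "tired", "alone", "worry", "hopeless"}  # hypothetical

def tone_scores(text: str) -> dict:
    """Return the share of words falling into each emotion category."""
    words = re.findall(r"[a-z']+", text.lower())
    total = max(len(words), 1)
    return {
        "positive": sum(w in POSITIVE_WORDS for w in words) / total,
        "negative": sum(w in NEGATIVE_WORDS for w in words) / total,
    }

print(tone_scores("I feel tired and alone, but I hope things improve."))
```

Because a counter like this scores each word in isolation, it misses negation, word order, and context, which is the limitation the researchers describe below.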
Sentiment scores given by human raters, on the other hand, did predict future depression symptoms.
"This told us that human raters were picking up on something that just counting emotion words could not," said Robb Rutledge, an assistant professor of psychology in Yale's Faculty of Arts and Sciences and senior author of the study.
The researchers said that LIWC treats each word individually, which may be why it falls short in this particular application.
"We wanted to look at word order and the multidimensional aspect of language central to shaping emotional tone," said lead author Jihyun Hur, a Ph.D. student in Rutledge's lab and the lab of co-author Jutta Joormann, the Richard Ely Foundation Professor of Psychology. "That's when we got interested in ChatGPT."
ChatGPT is an artificial intelligence tool designed to mimic human conversational language. Because of this, it takes word order and the meaning within and between phrases into account in a way that standard language-analysis tools, like LIWC, do not.
When the researchers instructed ChatGPT versions 3.5 and 4.0 to rate the positive and negative tone of the participants' responses, the scores predicted future changes in depression severity much like the human raters' scores.
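The paper's exact prompts are not reproduced here, but the general procedure of asking a chat model to rate the tone of a written response can be sketched as follows. This assumes the OpenAI Python SDK with an API key in the environment; the prompt wording, the 1-to-9 rating scale, and the model name are illustrative assumptions, not the study's protocol.

```python
# Minimal sketch: asking a chat model to rate the emotional tone of a text.
# Prompt wording, rating scale, and model choice are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def rate_tone(response_text: str, model: str = "gpt-4") -> str:
    """Ask the model for positive and negative tone ratings of a response."""
    prompt = (
        "Rate the positive and the negative emotional tone of the following "
        "text on a scale from 1 (not at all) to 9 (extremely). "
        "Answer with two numbers, e.g. 'positive: 3, negative: 7'.\n\n"
        f"Text: {response_text}"
    )
    completion = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # keep ratings as consistent as possible
    )
    return completion.choices[0].message.content

print(rate_tone("Lately everything feels heavier than it used to."))
```

Unlike a word counter, a model prompted this way can weigh phrasing and context across the whole response when producing its ratings.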
Researchers say the finding is a starting point for additional research. Rutledge and his team, for example, are interested in how this approach might be applied to other psychiatric disorders and across longer time periods.
This line of work is part of the lab's ongoing research into the relationship between emotion and decision-making, which anyone can participate in by playing the games in the lab's free smartphone app Happiness Quest.
Rutledge said he can see this type of language assessment being a useful addition to the clinician's toolbox in the future.
"Analysis of the language people use offers extra information that clinicians currently don't have, and our approach could be one way clinicians evaluate their patients," said Rutledge. "You want a combination of tools that work across lots of people, which together can give you a snapshot of an individual. If some of those tools are automated like this, that frees up the clinician to spend more time trying to help the patient."
And ultimately, a better understanding of symptoms, and the ability to predict them, would benefit both patients and clinicians.
"Artificial intelligence tools like ChatGPT open up a new way to use the great deal of language data already available in the clinical setting to better understand mental health," said Hur.