Language models may miss signs of depression in Black people’s Facebook posts

People with depression tend to write and speak about how bad they feel, years of research has shown. But linguistic features linked to depression seem to be absent in Black people’s social media posts, researchers report in the April 2 Proceedings of the National Academy of Sciences.

“We now have over a decade of research [that] has shown how language can be a very powerful indicator of mental health and signs of depression. But one thing we hadn’t understood until this study was how demographic factors … impact that measurement,” says Munmun De Choudhury, a computer scientist at Georgia Tech in Atlanta who is an expert in using social media data to study mental health.

Researchers and public health officials have been testing machine learning programs that are designed to predict links between certain language markers and health outcomes. These programs could act as an early warning system by scouring social media posts to identify spikes in depression across a given population.

However, the new findings suggest such AI programs could miss depression in a big slice of the population. If that’s the case, De Choudhury says, “there are profound public health implications.”

Computer scientist Sunny Rai of the University of Pennsylvania and her team recruited 868 people in the United States for the new study. Half of the participants were Black and half were white, and the two groups were matched by age and gender. Participants completed a standardized online depression survey, the Patient Health Questionnaire 9, and they gave the team access to their Facebook posts. The researchers then fed those social media posts into a text analysis program.

Consistent with earlier work, the use of first-person singular pronouns — “I,” “me” and “my” — increased alongside depression scores for the entire cohort. Conversely, use of first-person plural pronouns, or “we,” “our” and “us,” was linked to lower depression scores. And as depression scores went up, so too did words reflecting negative emotions, such as those referring to feelings of emptiness and longing, disgust, despair, lack of belonging and self-criticism.

But that picture changed when the researchers broke down responses by race. The text analysis program did relatively well at predicting depression among white participants but poorly among Black participants. In fact, its ability to predict depression in Black participants was close to zero. Even when the team trained the program on Black participants' social media posts alone, it failed to identify any linguistic patterns.

“We re-ran the experiment so many times because we thought we were doing something wrong,” Rai says.

Why the program struggled to predict depression in Black people is unclear, Rai says. Maybe signs of depression in Black people aren't linked to written communication. Or maybe depression shows up in nonwritten forms of communication, such as changes in body language, speaking rate or tone. Or maybe the very public nature of social media discourages Black people from sharing too much about how they're feeling.

It’s even possible that depression lacks universal features, says Ryan Boyd, a psychologist and computational social scientist at Stony Brook University in New York. That would suggest there are flaws in these types of machine learning programs and the information used to train them. For instance, the assumption with this sort of research is that the standardized questionnaire used to measure depression works well, says Boyd, who was not involved in the new study. But mounting evidence suggests that may not be the case, especially with certain populations, such as Black men.

“The model is only as good as the measurement we are basing it on,” Boyd says.

Sorting out just what is going on requires first determining whether social media is a uniquely poor format for studying depression or whether Black people's depression does not manifest in speech even in other settings, such as private conversations with medical staff, Rai says. "We found this on Facebook, but of course this needs to be replicated … [outside] a public space."
