AI Model Predicts Alzheimer’s Risk with 78.5% Accuracy from Speech Patterns

A new artificial intelligence model developed by Boston University researchers could revolutionize how we predict and diagnose Alzheimer's disease. The model, which analyzes speech patterns, can predict with 78.5% accuracy whether someone with mild cognitive impairment will develop Alzheimer's-associated dementia within six years.

This breakthrough could allow for earlier diagnosis and intervention, potentially slowing the disease’s progression with new treatments. Moreover, it could make cognitive impairment screening more accessible by automating parts of the process, eliminating the need for expensive lab tests, imaging exams, or even office visits.

Machine Learning Meets Cognitive Health

The research team, led by Ioannis Paschalidis, director of BU’s Rafik B. Hariri Institute for Computing and Computational Science & Engineering, used data from the long-running Framingham Heart Study to train their model. They analyzed audio recordings of 166 initial interviews with people aged 63 to 97 who had been diagnosed with mild cognitive impairment.

“We wanted to predict what would happen in the next six years—and we found we can reasonably make that prediction with relatively good confidence and accuracy,” says Paschalidis. “It shows the power of AI.”

The model combines information extracted from audio recordings with basic demographics to generate a score indicating the likelihood of someone remaining stable or transitioning to dementia. Importantly, it relies on the content of the interview rather than acoustic features like enunciation or speed.
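The article does not publish the model's architecture, but the approach it describes can be illustrated in miniature: derive simple linguistic features from an interview transcript, append basic demographics, and map the combined feature vector through a logistic function to a 0-to-1 risk score. The vocabulary, weights, and bias below are entirely hypothetical placeholders, not values from the study.

```python
import math
import re

def transcript_features(text, vocab):
    # Bag-of-words frequencies over a small hypothetical vocabulary,
    # normalized by transcript length (a crude stand-in for the
    # content-based features the article describes).
    words = re.findall(r"[a-z']+", text.lower())
    total = max(len(words), 1)
    return [words.count(w) / total for w in vocab]

def risk_score(text, age, sex, weights, bias, vocab):
    # Combine linguistic features with basic demographics, then pass
    # the weighted sum through a logistic function to get a score
    # between 0 (stable) and 1 (likely transition to dementia).
    x = transcript_features(text, vocab) + [age / 100.0, float(sex)]
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative vocabulary and coefficients only; a real model would
# learn these from labeled interview data.
VOCAB = ["remember", "forget", "thing", "um"]
WEIGHTS = [-2.0, 3.0, 1.5, 2.5, 4.0, 0.3]  # one weight per feature
BIAS = -3.0

score = risk_score("I forget where I put things, um, all the time",
                   age=74, sex=1, weights=WEIGHTS, bias=BIAS, vocab=VOCAB)
print(score)
```

In practice the study's model would draw on far richer features extracted from the recorded interviews, but the shape of the output is the same: a single probability-like score per participant.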

Democratizing Dementia Diagnosis

This research could significantly impact how we approach Alzheimer’s screening and diagnosis. Rhoda Au, a coauthor on the paper, emphasizes the potential for creating “equal opportunity science and healthcare.”

“Technology can overcome the bias of work that can only be done by those with resources, or care that has relied on specialized expertise that is not available to everyone,” Au explains. The model could potentially be used to bring care to patients far from medical centers or provide routine monitoring through an at-home app.

Looking ahead, Paschalidis aims to explore using data from more natural, everyday conversations and potentially developing a smartphone app for dementia diagnosis. The team is also considering expanding their analysis beyond speech to include patient drawings and data on daily life patterns.

As Au puts it, “Digital is the new blood. You can collect it, analyze it for what is known today, store it, and reanalyze it for whatever new emerges tomorrow.”

This research, funded in part by the National Science Foundation, the National Institutes of Health, and the BU Rajen Kilachand Fund for Integrated Life Science and Engineering, represents a significant step forward in our ability to predict and potentially prevent Alzheimer’s disease progression.
