MIT researchers have found a new angle: applying AI to detect COVID-19 in an asymptomatic patient. From a cough. Recorded by a phone.
Earlier this year, researchers from MIT solicited tens of thousands of recorded “forced” (i.e. not naturally occurring) coughs from over 5,300 subjects – both COVID-positive and COVID-negative – in addition to data about physical symptoms and demographics. They then used that data to train a revised version of their Alzheimer’s model – the target of their earlier research with this technology – to distinguish between COVID-infected and COVID-uninfected coughers.
MIT researchers had been working on AI cough models to detect signs of Alzheimer’s, which causes neuromuscular degradation associated with weaker vocal cords. Using a framework of three neural networks, the researchers had carefully tuned the model to account for differences in emotional state, vocal cord strength, and respiratory health when processing coughs, arriving at an algorithm aimed at detecting the kind of degradation associated with Alzheimer’s.
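The article does not describe the model internals, but the three-network design can be sketched roughly as three biomarker branches (emotional state, vocal cord strength, respiratory health) whose outputs feed a final classifier. The branch functions, weights, and bias below are hypothetical placeholders for illustration, not the MIT implementation – in practice each branch would be a trained neural network over a spectrogram of the cough.

```python
import math
import random

random.seed(42)

def sentiment_branch(cough):
    # Placeholder for the emotional-state biomarker:
    # a real branch would be a trained network over audio features.
    return math.tanh(sum(cough) / len(cough))

def vocal_cord_branch(cough):
    # Placeholder for the vocal-cord-strength biomarker.
    mean = sum(cough) / len(cough)
    var = sum((x - mean) ** 2 for x in cough) / len(cough)
    return math.tanh(math.sqrt(var))

def respiratory_branch(cough):
    # Placeholder for the respiratory-health biomarker.
    diffs = [abs(b - a) for a, b in zip(cough, cough[1:])]
    return math.tanh(sum(diffs) / len(diffs))

def covid_score(cough, weights, bias):
    """Combine the three biomarker outputs with a logistic output layer,
    yielding a probability-like score in (0, 1)."""
    features = [sentiment_branch(cough), vocal_cord_branch(cough),
                respiratory_branch(cough)]
    z = sum(w * f for w, f in zip(weights, features)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Toy "recording": one second's worth of samples standing in for phone audio.
cough = [random.gauss(0.0, 1.0) for _ in range(16_000)]
score = covid_score(cough, weights=[0.5, 1.2, 0.8], bias=-1.0)
print(f"COVID score: {score:.3f}")
```

The design point is that the final layer sees only the three biomarker summaries, so the combined model inherits whatever each branch has learned to control for.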
“The sounds of talking and coughing are both influenced by the vocal cords and surrounding organs. This means that when you talk, part of your talking is like coughing, and vice versa,” said Brian Subirana, a research scientist in MIT’s Auto-ID Laboratory and one of the co-authors of the paper, in an interview with MIT’s Jennifer Chu. “It also means that things we easily derive from fluent speech, AI can pick up simply from coughs, including things like the person’s gender, mother tongue, or even emotional state. There’s in fact sentiment embedded in how you cough. So we thought, why don’t we try these Alzheimer’s biomarkers … for COVID.”
And, it turns out, the model could do just that. After training, it successfully identified 98.5% of COVID-positive coughs from new recordings, including – most impressively – 100% of asymptomatic COVID-positive coughs (the tool may actually perform better for asymptomatic coughers because a symptomatic cough may stem from a non-COVID ailment, such as the flu). While the coughs sounded identical to the human ear, the AI voice model picked up on telling discrepancies. “We think this shows that the way you produce sound changes when you have Covid, even if you’re asymptomatic,” Subirana said.
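The 98.5% figure is a sensitivity: the fraction of truly COVID-positive coughs the model flags. With hypothetical counts chosen only to reproduce that rate (the article does not report raw counts), the arithmetic is:

```python
# Sensitivity = true positives / (true positives + false negatives).
# These counts are hypothetical, picked solely to illustrate 98.5%.
true_positives = 985
false_negatives = 15

sensitivity = true_positives / (true_positives + false_negatives)
print(f"Sensitivity: {sensitivity:.1%}")  # prints "Sensitivity: 98.5%"
```

Note that sensitivity alone says nothing about false alarms; a screening tool would also be judged on its specificity, which the article does not report.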
Source: AI Enterprise News (https://www.enterpriseai.news/) and Datanami.
November 24, 2020 by Oliver Peckham