
AI Can Tell When You Have Coronavirus Just By Listening To Your Cough

Artificial intelligence is able to distinguish people who have the coronavirus from those who do not, simply by listening to their coughs.

The difference between a cough from someone with coronavirus and one from a healthy person is not discernible to the human ear, but it can be ‘heard’ by a machine learning algorithm.

Researchers from MIT took thousands of samples of coughs and spoken words to train the artificial intelligence, which is now able to detect those with Covid-19 with 98.5 per cent accuracy.

For those who tested positive for the coronavirus but showed no symptoms, the artificial intelligence detected them every time.

Before the pandemic, researchers were using similar technologies to detect signs of Alzheimer’s disease.

While Alzheimer’s is best known for harming memory, it also weakens the vocal cords.

Moreover, Alzheimer’s patients show emotions like frustration, or a flat affect (reduced emotional expressiveness), more often than those without the disease.

The neural network ResNet50 – an algorithm designed to work similarly to a human brain – was trained to discriminate between sounds with different degrees of vocal cord strength.
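
To give a rough sense of what such training might look like, the sketch below adapts a pretrained ResNet50 to classify cough spectrograms by vocal cord strength. The class labels, input size and frozen-feature setup are illustrative assumptions, not details published by the MIT team.

from tensorflow.keras.applications import ResNet50
from tensorflow.keras import layers, models

NUM_STRENGTH_CLASSES = 3        # hypothetical weak / normal / strong labels
INPUT_SHAPE = (224, 224, 3)     # spectrogram images resized to ResNet50's input size

# Reuse the pretrained convolutional features and train only a small classifier head.
base = ResNet50(weights="imagenet", include_top=False, input_shape=INPUT_SHAPE)
base.trainable = False

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_STRENGTH_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])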

Two more neural networks were trained to detect emotions in speech, such as frustration, happiness, and calmness, as well as to detect changes in lung and respiratory performance from coughs.

Combining all three of these models, as well as an algorithm to detect muscular degradation, gave the researchers an artificial intelligence model that could find Alzheimer’s samples – and one that could be adapted to diagnosing Covid-19.
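
A minimal sketch of how such a combination might work is shown below, assuming each biomarker model produces a probability for a single cough sample and that the muscular-degradation algorithm returns a score on the same scale; the equal weighting is a simplification for illustration, not the researchers’ published method.

import numpy as np

def combined_covid_score(sample, vocal_model, sentiment_model, lung_model,
                         muscular_degradation_fn):
    """Return an overall probability that a single cough sample is Covid-positive."""
    batch = sample[np.newaxis, ...]                   # models expect a batch dimension
    probs = [
        float(vocal_model.predict(batch)[0, 0]),      # vocal cord strength biomarker
        float(sentiment_model.predict(batch)[0, 0]),  # sentiment biomarker
        float(lung_model.predict(batch)[0, 0]),       # lung/respiratory biomarker
        float(muscular_degradation_fn(sample)),       # non-neural muscular biomarker
    ]
    return float(np.mean(probs))                      # simple average of the four scores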

“The sounds of talking and coughing are both influenced by the vocal cords and surrounding organs. This means that when you talk, part of your talking is like coughing, and vice versa. It also means that things we easily derive from fluent speech, AI can pick up simply from coughs, including things like the person’s gender, mother tongue, or even emotional state,” said co-author Brian Subirana, a research scientist in MIT’s Auto-ID Laboratory, who worked alongside Jordi Laguarta and Ferran Hueto.

“There’s in fact sentiment embedded in how you cough, so we thought, why don’t we try these Alzheimer’s biomarkers [to see if they’re relevant] for Covid.”

The researchers collected over 70,000 recordings, containing 200,000 coughs in total – “the largest research cough dataset that we know of,” Subirana said.

Approximately 2,500 samples were from confirmed coronavirus patients.

These samples, along with 1,500 others, were used to train the model. Another 1,000 were chosen to test the model for accuracy.
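
For illustration only, the sketch below mimics a split of that size – about 4,000 samples for training and 1,000 held out for accuracy testing – using random placeholder features and a stand-in classifier rather than the MIT pipeline.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
features = rng.normal(size=(5000, 64))      # placeholder audio features
labels = rng.integers(0, 2, size=5000)      # 1 = Covid-positive, 0 = other

# Hold out 1,000 samples for testing and train on the remaining ~4,000.
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=1000, stratify=labels, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.1%}")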

They found four biomarkers – vocal cord strength, sentiment, lung and respiratory performance, and muscular degradation – that are specific to Covid-19, which the researchers believe indicates that the coronavirus changes how people produce sound, even when they are asymptomatic.

The researchers are working on incorporating the findings into an app, which would need to be approved by the United States Food and Drug Administration before release.

If successful, users could cough into their phone and instantly get information about their potential infection.

“The effective implementation of this group diagnostic tool could diminish the spread of the pandemic if everyone uses it before going to a classroom, a factory, or a restaurant,” Subirana said.

The Independent

 
