Researchers at MIT have used machine learning to create software capable of detecting whether a person has caught COVID-19 by analyzing their cough, a development that could eventually result in an iPhone app for daily checks.
So far, the iPhone has helped users determine whether they have been in close proximity to someone carrying the coronavirus. If a new discovery by MIT researchers is developed further, the iPhone may be able to do even more to dampen the spread of the virus.
A paper from the team, published in the IEEE Open Journal of Engineering in Medicine and Biology, claims the researchers created an AI model that can distinguish asymptomatic COVID-19 carriers from healthy people by analyzing recordings of forced coughs. The model is claimed to correctly identify 98.5 percent of coughs from people confirmed to have had COVID-19, including 100 percent of coughs from carriers who were asymptomatic.
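The 98.5 percent and 100 percent figures are, in effect, sensitivity (recall): the share of genuinely positive coughs the model flags. A minimal illustration of the metric, using made-up toy numbers that are not from the paper:

```python
def recall(predictions, labels):
    """Fraction of truly positive samples the model catches.

    predictions and labels are parallel lists of 0/1 values;
    label 1 means the person actually had COVID-19, prediction 1
    means the model flagged the cough as COVID-positive.
    """
    flagged = [p for p, y in zip(predictions, labels) if y == 1]
    return sum(flagged) / len(flagged)

# Hypothetical toy data, purely for illustration:
labels      = [1, 1, 1, 1, 0, 0]
predictions = [1, 1, 1, 0, 0, 1]
print(recall(predictions, labels))  # 0.75: 3 of 4 positives caught
```

A claim of "100 percent on asymptomatic recordings" means this ratio equals 1.0 on that subset; it says nothing about false positives, which would be measured separately as specificity.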
The team collected more than 70,000 recordings via a website where members of the public could record a series of coughs on a smartphone or other device while filling out a survey covering their symptoms, whether they had a confirmed case of the virus, and other details. The recordings yielded around 200,000 forced-cough samples, including about 2,500 from people confirmed to have COVID-19, some of whom were asymptomatic.
The AI framework was based on one the team had previously developed for Alzheimer's research, and it was found to pick up four biomarkers specific to COVID-19, relating to vocal cord strength, sentiment, lung and respiratory performance, and muscular degradation. To train and then test the model, the researchers combined the 2,500 confirmed samples with another 2,500 randomly selected from the dataset. They say the results revealed "a striking similarity between Alzheimer's and COVID discrimination."
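The paper's training data was built by pairing the confirmed-positive samples with an equal number of randomly chosen samples from the rest of the pool, a standard way to avoid class imbalance. A minimal sketch of that balanced-sampling step, using string stand-ins rather than real cough recordings (none of these names come from the research), might look like:

```python
import random

def make_balanced_split(positives, negatives, seed=0):
    """Pair every positive sample with an equal number of randomly
    chosen negatives, mirroring the 2,500 + 2,500 setup described above."""
    rng = random.Random(seed)
    sampled_negatives = rng.sample(negatives, k=len(positives))
    dataset = [(x, 1) for x in positives] + [(x, 0) for x in sampled_negatives]
    rng.shuffle(dataset)  # mix classes so batches aren't ordered by label
    return dataset

# Toy stand-ins for cough recordings (the real model works on audio features):
covid_coughs = [f"covid_{i}" for i in range(5)]
healthy_coughs = [f"healthy_{i}" for i in range(100)]
balanced = make_balanced_split(covid_coughs, healthy_coughs)
print(len(balanced))  # 10 samples, half labeled 1 and half labeled 0
```

Without this step, a model trained on roughly 2,500 positives against nearly 200,000 negatives could score highly just by predicting "healthy" every time.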
The team is now working to turn the AI model into a pre-screening app, which it intends to distribute for free, and is also working with a number of hospitals to enlarge the pool of cough recordings for further training.
The team suggests that such cough analysis could be built into smart speakers and digital assistants to perform daily assessments. This would naturally depend on the devices having sufficiently good microphones, on the privacy issues being handled, and on companies like Apple and Amazon being willing to implement it, making it an unlikely, if altruistic, prospect.
Comments
I wonder if it can distinguish between a flu cough and a COVID cough. Hmmm…
So this would allow on demand, free, instantaneous testing for anyone with the right phone or microphone. This is actually pretty big news, especially if an app could be developed quickly.
Perhaps MIT could write an app, but it seems more appropriate for MIT to give the cough-detection code to Apple and Google (who could pay MIT) to enhance their COVID API. That way, governments deploying their COVID apps could access the feature, and users wouldn't need to install multiple apps. After all, this is a worldwide problem, and I don't think MIT wants to get involved in deploying apps for 200 different countries.
This is an utter joke. The CDC says most people who die have an average of 2.6 preexisting conditions; they don't die from SARS-CoV-2 alone. The cough could be from many other things.
Please put all efforts into simple, preventive, universal 1.5 m distance detection.
Use BT, UWB, LiDAR, NFC, ToF, AirDrop proximity detection and all other high-profile technologies to their extreme.
How hard can it be, now that the COVID API already uses a form of proximity detection and the Measure app has sub-millimeter precision?