Researchers at MIT have used machine learning to create software capable of detecting whether a person has caught COVID-19 by analyzing their cough, a development that could eventually result in an iPhone app for daily checks.
The UK's NHS COVID-19 app for iPhone uses the Apple-Google exposure notification API to help combat the coronavirus.
So far, the iPhone has helped users determine whether they may be at risk of the coronavirus after coming into close proximity to someone carrying the virus. If a new discovery by MIT researchers is developed further, the iPhone may end up doing even more to curb the virus's spread.
In a paper published in the IEEE Open Journal of Engineering in Medicine and Biology, the team describes an AI model that can distinguish people with asymptomatic COVID-19 from healthy individuals by analyzing recordings of forced coughs. The model is claimed to correctly identify 98.5% of coughs from people confirmed to have had COVID-19, and 100% of coughs from those who were asymptomatic.
The team collected more than 70,000 recordings through a website where members of the public could record a series of coughs on their smartphone or other device while filling out a survey about their symptoms, whether they had been confirmed to have the virus, and other details. The submissions yielded around 200,000 forced-cough samples, including roughly 2,500 from people confirmed to have COVID-19, some of whom were asymptomatic.
The researchers trained and then tested the AI model on a balanced set that combined those 2,500 confirmed samples with another 2,500 recordings randomly selected from the dataset. They say the results revealed "a striking similarity between Alzheimer's and COVID discrimination."
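To make that balanced-training step concrete, here is a minimal sketch, not the MIT team's actual code, of pairing the confirmed-positive coughs with an equal number of randomly chosen negatives, then measuring sensitivity on held-out samples. The feature arrays, the logistic-regression classifier, and all the numbers below are illustrative assumptions.

```python
# Minimal sketch (not the MIT team's code) of the balanced-training idea described above:
# pair the ~2,500 COVID-confirmed cough samples with an equal number of randomly drawn
# negatives, then train and evaluate a classifier on held-out data. Feature extraction
# and the model itself are placeholders for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)

# Hypothetical pre-computed audio features: one row per forced-cough sample.
covid_features = rng.normal(1.0, 1.0, size=(2500, 128))    # confirmed COVID-19 coughs
healthy_features = rng.normal(0.0, 1.0, size=(2500, 128))  # randomly selected negatives

X = np.vstack([covid_features, healthy_features])
y = np.concatenate([np.ones(2500), np.zeros(2500)])        # 1 = COVID-19, 0 = healthy

# Hold out part of the balanced set for testing, as in a standard train/test split.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Sensitivity (true-positive rate) corresponds to the accuracy-on-positive-recordings
# figures reported for confirmed and asymptomatic cases.
sensitivity = recall_score(y_test, model.predict(X_test))
print(f"sensitivity on held-out positives: {sensitivity:.3f}")
```

Balancing the set this way keeps a classifier from scoring well simply by predicting "healthy" all the time, which it otherwise could, given that confirmed cases make up only a small fraction of the roughly 200,000 samples collected.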
The AI framework was based on one that already existed for Alzheimer's research, and the team determined it could pick up four biomarkers specific to COVID-19: vocal cord strength, sentiment, lung and respiratory response, and muscular degradation.
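The article does not describe the model's internal structure, but the four-biomarker idea can be illustrated with a hedged sketch: a shared encoding of the cough feeds four parallel branches, one per biomarker, whose outputs are combined into a single COVID-19 score. Everything below, including the layer sizes, the branch design, and the CoughClassifier name, is an assumption for illustration, not the published architecture.

```python
# Illustrative sketch only: one way to structure a "four biomarker" cough model, with a
# shared audio encoding feeding four parallel branches (vocal cord strength, sentiment,
# lung/respiratory response, muscular degradation) whose outputs are pooled into a single
# COVID-19 score. All names and layer sizes are assumptions, not the MIT architecture.
import torch
import torch.nn as nn

class BiomarkerBranch(nn.Module):
    """Small sub-network intended to capture one biomarker from the shared encoding."""
    def __init__(self, in_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, 16))

    def forward(self, x):
        return self.net(x)

class CoughClassifier(nn.Module):
    def __init__(self, feature_dim: int = 128):
        super().__init__()
        # Shared encoder over pre-computed cough features (e.g. a spectrogram embedding).
        self.encoder = nn.Sequential(nn.Linear(feature_dim, 128), nn.ReLU())
        # One branch per biomarker named in the article.
        self.branches = nn.ModuleList([BiomarkerBranch(128) for _ in range(4)])
        # Final head combines the four biomarker representations into one score.
        self.head = nn.Linear(4 * 16, 1)

    def forward(self, x):
        shared = self.encoder(x)
        biomarkers = torch.cat([branch(shared) for branch in self.branches], dim=-1)
        return torch.sigmoid(self.head(biomarkers))  # probability the cough is COVID-like

model = CoughClassifier()
fake_batch = torch.randn(8, 128)   # stand-in for real cough features
print(model(fake_batch).shape)     # torch.Size([8, 1])
```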
The team is now working on a free pre-screening app based on the AI model, and is also working with a number of hospitals to enlarge the pool of cough recordings for further training.
The team suggests such cough analysis could be built into smart speakers and digital assistants to perform daily assessments. That would depend on the devices having sufficiently good microphones, on the necessary privacy issues being handled, and on companies like Apple and Amazon choosing to implement it, making it an unlikely, if altruistic, prospect.