With asymptomatic cases a major concern in the coronavirus pandemic, there may be a new way to tell whether someone has COVID-19. Researchers have developed an artificial intelligence (AI) tool that can detect indications of a COVID-19 infection in a person’s cough, before they seek medical help and further testing. Researchers at the Massachusetts Institute of Technology (MIT) say that asymptomatic people may differ from healthy individuals in the way they cough, and while the differences are not decipherable to the human ear, AI can detect them. The researchers are now working to make the tool available to users as an app, pending FDA approval.
In a paper published in the IEEE Open Journal of Engineering in Medicine and Biology, the researchers say that forced-cough recordings made on a smartphone, laptop or PC give the AI enough signal to work with. The AI tool accurately identified 98.5 percent of coughs from people who were confirmed to have COVID-19. It also identified 100 percent of coughs from asymptomatic people who reported no symptoms but later tested positive for COVID-19. The tool was trained on what the researchers say are tens of thousands of cough recording samples.
“The effective implementation of this group diagnostic tool could diminish the spread of the pandemic if everyone uses it before going to a classroom, a factory, or a restaurant,” says co-author Brian Subirana, a research scientist in MIT’s Auto-ID Laboratory. The study is co-authored by Jordi Laguarta and Ferran Hueto, also of MIT’s Auto-ID Laboratory. An app would let a user log in daily, record the sound of their cough on their phone, and get immediate guidance on whether they might be infected, in which case further testing would be required.
The model was first trained on a general machine-learning algorithm known as ResNet50 to distinguish sounds associated with different degrees of vocal cord strength, using more than 10,000 hours of speech to pick out specific words. The second stage of training used a neural network that distinguishes the emotional states that often show up in speech; the same network also detects the sentiments shown by patients with Alzheimer’s, for instance. The third neural network was trained on a database of cough recordings to distinguish different lung and respiratory responses. The final step is an algorithm that distinguishes between strong and weak coughing, the latter being a sign of muscle weakness.
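To make the layered design described above more concrete, the sketch below shows what a ResNet50-based, multi-branch cough classifier could look like. The MIT team's actual code, input features, and layer sizes are not given in the article, so everything here (mel-spectrogram inputs, three parallel branches, the small classifier head) is an illustrative assumption, not the published model.

```python
# Illustrative sketch only: the article describes the architecture at a high
# level, so the feature choices and dimensions below are assumptions.
import torch
import torch.nn as nn
import torchaudio
import torchvision


class CoughBiomarkerNet(nn.Module):
    """Three ResNet50 branches (vocal cord strength, sentiment, lung/respiratory
    response), whose pooled features are combined by a small classifier head,
    mirroring the layered design the article describes."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        # One branch per biomarker; in the described pipeline each branch is
        # first trained on its own task (speech, emotion, coughs) before the
        # outputs are combined.
        self.branches = nn.ModuleList([self._make_branch() for _ in range(3)])
        self.head = nn.Sequential(
            nn.Linear(3 * 2048, 256),
            nn.ReLU(),
            nn.Linear(256, num_classes),
        )

    @staticmethod
    def _make_branch() -> nn.Module:
        resnet = torchvision.models.resnet50(weights=None)
        # Spectrograms are single-channel, so swap the 3-channel input conv.
        resnet.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
        resnet.fc = nn.Identity()  # keep the 2048-d pooled features
        return resnet

    def forward(self, spectrogram: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([branch(spectrogram) for branch in self.branches], dim=1)
        return self.head(feats)


# Turn a forced-cough recording into a log-mel spectrogram and score it.
to_mel = torchaudio.transforms.MelSpectrogram(sample_rate=16000, n_mels=128)
waveform = torch.randn(1, 16000 * 3)           # placeholder for a 3-second cough clip
spec = to_mel(waveform).log1p().unsqueeze(0)    # shape: (batch=1, channel=1, mels, frames)
model = CoughBiomarkerNet()
logits = model(spec)
print(logits.softmax(dim=-1))                   # probabilities for [healthy, COVID-positive]
```

In practice, such an app would record the cough on the phone, compute the spectrogram on-device or on a server, and return the model's score as the "immediate guidance" the researchers describe; the placeholder waveform above stands in for that recording.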