How Artificial Intelligence Can Assess Your Health with Just Your Voice
The human voice sings, speaks, and expresses a wide range of emotions, from joy to sadness, boredom to excitement. You can easily hear whether a friend sounds a bit depressed or afraid, but you cannot hear whether they have a rising risk of heart disease or Alzheimer’s disease. Beyond expressing ourselves, our voices also carry subtle clues to our health that no ear can detect. Those hidden clues only become apparent through some remarkable applications of artificial intelligence (AI).
Imagine not having to make an appointment, take time off work, and go to the doctor for a medical test; instead, you simply read some text into your smartphone to get a diagnosis. The work of researchers at Mayo Clinic in Rochester, MN and an Israeli startup called Beyond Verbal Communication (now Vocalis Health after a merger) demonstrated that the human voice contains clues to heart health. In their paper titled “Voice Signal Characteristics Are Independently Associated with Coronary Artery Disease,” the authors describe a study of 138 subjects, 101 of whom had planned angiograms. (mayoclinicproceedings.org) An angiogram is a clinical test that uses X-rays and an injected contrast agent to image the blood vessels and detect atherosclerosis, a narrowing of the arteries. Each subject recorded their voice three times on a smartphone, reading texts describing an emotionally positive and an emotionally negative event. The researchers analyzed differences in voice features, such as tone, between people with and without coronary artery disease (CAD), and identified two vocal features independently associated with CAD.
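The researchers have not released their code, but the general recipe, extracting acoustic features from a recording and then comparing them across groups, can be sketched in a few lines. Everything below is a toy illustration under stated assumptions: the synthetic “voices,” the two features (intensity and zero-crossing rate), and the comparison are hypothetical stand-ins, not the Mayo/Beyond Verbal method.

```python
import math
import random

def rms_energy(samples):
    """Root-mean-square intensity of a waveform: a crude 'loudness' feature."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs that change sign: a rough pitch/noisiness cue."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    return crossings / (len(samples) - 1)

def synth_voice(freq_hz, jitter_hz, n=8000, rate=8000, seed=0):
    """Toy stand-in for a recorded voice: a sine tone with random frequency jitter."""
    rng = random.Random(seed)
    return [math.sin(2 * math.pi * (freq_hz + rng.uniform(-jitter_hz, jitter_hz)) * t / rate)
            for t in range(n)]

# A steady tone vs. a jittery one: the instability shows up in the features,
# which is the kind of measurable difference a classifier would be trained on.
steady = synth_voice(100, jitter_hz=0.0)
jittery = synth_voice(100, jitter_hz=5.0)
print("steady ZCR:", zero_crossing_rate(steady))
print("jittery ZCR:", zero_crossing_rate(jittery))
```

In a real study, features like these would be computed for every recording and fed, along with the angiogram results, into a statistical model that learns which features track the disease.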
Alzheimer’s disease, characterized by progressive degeneration of brain cells, leads to loss of brain functions such as memory and speech. As of 2019, Alzheimer’s affects an estimated 5.8 million Americans. (alz.org) Because the disease progresses over time and current therapies only slow that progression, early detection is crucial for the care of our population, yet it remains a challenge for clinicians. Canadian researchers Kathleen Fraser, Jed Meltzer, and Frank Rudzicz in Toronto used AI to analyze voices for evidence of Alzheimer’s disease. Their article in the Journal of Alzheimer’s Disease, titled “Linguistic Features Identify Alzheimer's Disease in Narrative Speech,” details an experiment in which people with possible Alzheimer’s disease and healthy controls were given a picture and asked to describe what they saw. The descriptions were recorded and analyzed for many factors, such as vocabulary complexity, use of specific nouns, and accuracy of the descriptions, as well as vocal tones and the time between words. With AI, the team attained an impressive 81% accuracy in detecting Alzheimer’s disease, isolating four main areas that distinguished healthy participants from Alzheimer’s patients: improper sentences, unusual vocal sounds, loss of memory, and slow recall of information. The researchers have since founded a company, Winterlight Labs, that offers a tablet-based system to assess cognitive health using AI and voice recordings.
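The full feature set in that study runs to hundreds of lexical and acoustic measures, but the lexical side of the idea can be sketched with a few toy measures computed from a transcript. The features below, and the assumption that a transcriber marks hesitations with “...”, are illustrative stand-ins, not the authors’ actual pipeline.

```python
import re

def linguistic_features(transcript):
    """A few crude lexical measures of the kind used in narrative-speech studies."""
    words = re.findall(r"[a-z']+", transcript.lower())
    return {
        # type-token ratio: vocabulary diversity (low values can signal word-finding trouble)
        "ttr": len(set(words)) / len(words),
        # mean word length: a rough proxy for vocabulary richness
        "mean_word_len": sum(map(len, words)) / len(words),
        # hesitation count, assuming the transcriber marked pauses with '...'
        "pauses": transcript.count("..."),
    }

# Two hypothetical descriptions of the same picture:
fluent = "A boy reaches for the cookie jar while the sink overflows behind his mother."
halting = "The boy ... the boy is ... is getting the ... the thing ... the thing there."
print(linguistic_features(fluent))
print(linguistic_features(halting))
```

The halting transcript scores lower on vocabulary diversity and higher on pauses; a classifier trained on many such transcripts, labeled by clinical diagnosis, learns which combinations of features best separate the two groups.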
People communicate with their voices all the time to share information, give directions, convey emotion, entertain, comfort, and more. Although we already perceive vast amounts of information about each other through the voice, AI has opened the door to even more subtle information our voices carry about our health. Using sophisticated computer programs to analyze vocal tones and the words chosen to describe an image, AI can find the patterns that separate the healthy from the unwell. Certain vocal sounds can now differentiate an individual with coronary artery disease from a healthy one. Additionally, the Toronto-based researchers demonstrated that their AI system could analyze hundreds of variables in the sound and content of a person describing a picture; factors such as slow recollection of details, unusual vocal sounds, and the timing between words provided a highly accurate assessment of the presence of Alzheimer’s disease. Such discoveries offer better, non-invasive detection of serious illnesses and the option of early intervention. Moreover, all of this can happen over the phone, making it easier to reach people in remote places or those who have difficulty traveling. Using AI, health professionals can detect disease earlier and more easily, and get people the help they need sooner.
Dr. Smith’s career in scientific and information research spans the areas of bioinformatics, artificial intelligence, toxicology, and chemistry. He has published a number of peer-reviewed scientific papers. He has worked over the past seventeen years developing advanced analytics, machine learning, and knowledge management tools to enable research and support high level decision making. Tim completed his Ph.D. in Toxicology at Cornell University and a Bachelor of Science in chemistry from the University of Washington.
You can buy his book on Amazon in paperback here and in Kindle format here.