What started as a pioneering method for tracking human emotions has become, according to one Israeli startup, a revolutionary way to diagnose the early stages of progressive diseases. Beyond Verbal, originally conceived to detect emotional patterns in people's speech, has just announced the preliminary results of an exploratory study with the Mayo Clinic that establishes a "strong correlation" between certain characteristics of individuals' voices and coronary artery disease (CAD), a form of heart disease.
“A patient’s voice is the most readily available, easy to capture, and rich output the body offers,” Beyond Verbal CEO Yuval Mor said in a statement. “We are very excited to be able to work with Mayo Clinic on such a breakthrough research, studying the potential of using the human voice in healthcare monitoring and specifically CAD.”
In September, Beyond Verbal raised $3 million to pursue its research, including this project with the Mayo Clinic. The study was conducted as a double-blind trial with 120 patients who had elected to undergo coronary angiography, a test that uses dye and X-rays to show how much plaque has built up in a patient's arteries (that blockage is the crux of CAD). During the experiment, patients were asked to read a prepared text aloud; the recordings were then analyzed across several metrics, including frequency and voice intensity.
Besides the prepared text, which usually produced a roughly 30-second recording, each patient was also asked to describe one positive and one negative experience from their life, presumably to capture emotional and tonal shifts that might enrich the data. Results showed that when certain voice characteristics appeared, the patient was 2.6 times as likely to have CAD.
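Beyond Verbal has not published which acoustic features its analysis uses, but the two metrics the article names, frequency and voice intensity, can be sketched with a toy example. Everything below (the function name, the zero-crossing pitch estimate, the synthetic "recording") is illustrative only and is not the company's actual method:

```python
import math

def voice_features(samples, sample_rate):
    """Crude estimates of the two metrics named in the study write-up:
    frequency (Hz) and intensity (dB). Illustrative only."""
    # Frequency: count positive-going zero crossings per second, a simple
    # pitch estimate that works reasonably on clean periodic signals.
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    duration = len(samples) / sample_rate
    frequency_hz = crossings / duration
    # Intensity: root-mean-square amplitude, expressed in decibels.
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    intensity_db = 20 * math.log10(rms + 1e-12)
    return frequency_hz, intensity_db

# Synthetic stand-in for a patient recording: one second of a 220 Hz tone.
sr = 8000
recording = [0.5 * math.sin(2 * math.pi * 220 * i / sr) for i in range(sr)]
freq, level = voice_features(recording, sr)
```

Real speech analysis would use more robust pitch trackers and frame-by-frame features, but the principle is the same: reduce a recording to a handful of numbers whose shifts can be compared across patients.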
Their recently announced Beyond mHealth Research Platform passively and continuously monitors voice patterns, whose changes are theorized to correlate with changes in health.
Their original emotional analytics technology includes an attitude score that gauges a speaker's excitability, positivity, and temperament, along with a range of other emotions such as joy and aggressiveness. An example can be seen here, from the now-distant Republican primary debates featuring Donald Trump's rivalry with Fox News anchor Megyn Kelly.
As media reports about such studies rarely stress, this is an exploratory, preliminary study whose results have yet to be replicated in follow-on experiments. Still, the results are promising for Beyond Verbal's premise and seem to justify further investment in applying their emotional analytics technology to medical diagnostics. The technology might also have applications in conditions where voice characteristics have been theorized to play a role, such as dyslexia and ADHD. Beyond Verbal says they have already applied their work to correlative studies of other conditions, including neurological disorders like Parkinson's disease and autism.
The company says proving this connection would improve continuous monitoring in the mHealth field and reduce the cost of remote healthcare.
They have previously targeted marketers and salespeople, offering emotional analytics that track 430 distinct emotions to better engage potential customers and business partners. They boast that their technology is built on more than 21 years of research, including 2.5 million "emotion-tagged" recordings in more than 40 languages.