Home devices that use AI may be able to pick up signs of distress and call for help.
Amazon Alexa and Google Home could quite literally be lifesavers. A team at the University of Washington (UW) has created a new tool that can monitor people for cardiac arrest through an at-home smart speaker.
When someone experiences cardiac arrest, they become unresponsive and may stop breathing or gasp for air. The tool can detect that gasp, known as agonal breathing, and then alert authorities for help.
About 475,000 Americans die from cardiac arrest each year.
“Just like smart speakers can listen to Alexa, what we are showing is that they can also passively listen to agonal breathing sounds and either raise an audible alarm or call emergency services when it detects one,” Shyam Gollakota, PhD, an associate professor in UW’s Paul G. Allen School of Computer Science & Engineering, told Healthline. Gollakota was one of the authors of a new study examining whether smart speakers such as Alexa can detect signs of cardiac arrest.
The new tool the UW researchers created is a “skill,” the voice-device equivalent of an app, for devices that incorporate artificial intelligence (AI). These devices include the Amazon Echo, whose virtual assistant is dubbed Alexa. Skills can be added to an existing Amazon Echo, Google Home, or smartphone.
Researchers developed and tested the tool using real recordings of agonal breathing captured from 911 calls. They collected 162 calls made between 2009 and 2017 and extracted 236 clips, which they re-recorded on an Alexa-enabled device, an iPhone 5s, and a Samsung Galaxy S4.
The clips were played at different distances to simulate where a person might be relative to the speaker. The researchers also layered in interfering sounds, such as a running air conditioner or a pet. After applying machine learning augmentation techniques, the team ended up with 7,316 positive clips.
They paired this with a negative dataset of 7,305 sound samples of noises people commonly make, such as snoring, that the tool should not mistake for agonal breathing.
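The augmentation step described above, replaying clips at varying distances and mixing in interference, can be sketched in a few lines. This is an illustrative approximation rather than the researchers' actual pipeline; the function names, the simple inverse-distance attenuation model, and the distance and SNR values are all assumptions:

```python
import math
import random

def attenuate_for_distance(clip, distance_m, ref_distance_m=1.0):
    # Inverse-distance gain: the farther the clip plays from the
    # microphone, the quieter it arrives (free-field assumption).
    gain = ref_distance_m / max(distance_m, ref_distance_m)
    return [s * gain for s in clip]

def mix_with_noise(clip, noise, snr_db):
    # Scale the interfering sound so it sits snr_db below the clip's
    # power, then add the two waveforms sample by sample.
    sig_power = sum(s * s for s in clip) / len(clip)
    noise = noise[: len(clip)]
    noise_power = sum(n * n for n in noise) / len(noise)
    scale = math.sqrt(sig_power / (noise_power * 10 ** (snr_db / 10)))
    return [s + scale * n for s, n in zip(clip, noise)]

rng = random.Random(0)
agonal_clip = [rng.gauss(0, 1) for _ in range(16000)]   # stand-in for a 1 s clip
ac_noise = [rng.gauss(0, 0.3) for _ in range(16000)]    # stand-in interference

augmented = [
    mix_with_noise(attenuate_for_distance(agonal_clip, d), ac_noise, snr)
    for d in (1.0, 3.0, 6.0)    # simulated playback distances (meters)
    for snr in (20, 10, 5)      # signal-to-noise ratios (dB)
]
print(len(augmented))   # 9 variants from a single positive clip
```

Applied across all 236 source clips, this kind of combinatorial augmentation is how a few hundred recordings can grow into thousands of training examples.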
The technology detected agonal breathing 97 percent of the time from up to 20 feet away. The findings were published in npj Digital Medicine.
“We envision a contactless system that works by continuously and passively monitoring the bedroom for an agonal breathing event, and alerts anyone nearby to come provide CPR. And then if there’s no response, the device can automatically call 911,” Gollakota said in a statement.
He told Healthline the technology is being licensed and could be commercially available in a year or so.
“We also envision that our system would give users a warning before contacting emergency medical services or other forms of support and provide them a chance to cancel any false alarms,” Justin Chan, a doctoral student and fellow researcher, who also worked on the study, told Healthline.
The researchers plan to commercialize this technology through Sound Life Sciences, Inc. They want to test it on more calls across the country and in other countries.
Chan noted that the technology preserves privacy. It only runs on a smart device and doesn’t send data to the cloud or a third party. Data is only stored locally for a few seconds as required for processing, and is then discarded, he said.
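The keep-a-few-seconds-then-discard behavior Chan describes can be illustrated with a fixed-size ring buffer. This is a hypothetical sketch; the class name, window length, and detector interface are invented for illustration and are not from the UW system:

```python
from collections import deque

class LocalAudioBuffer:
    """Keep only the last few seconds of audio in memory; nothing is
    written to disk or sent off-device. Illustrative sketch only."""

    def __init__(self, sample_rate=16000, window_seconds=3):
        # deque with maxlen discards the oldest samples automatically.
        self.buffer = deque(maxlen=sample_rate * window_seconds)

    def push(self, samples):
        # New audio pushes the oldest audio out of existence.
        self.buffer.extend(samples)

    def classify(self, detector):
        # Run an on-device detector over the current window only.
        return detector(list(self.buffer))

buf = LocalAudioBuffer(sample_rate=4, window_seconds=2)   # tiny demo sizes
buf.push([0.1] * 8)          # fills the 8-sample window
buf.push([0.9] * 4)          # newest audio evicts the four oldest samples
print(len(buf.buffer))       # 8
print(buf.classify(lambda w: max(w) > 0.5))   # True
```

The `maxlen` on the deque does the discarding for free: at no point does more audio than the processing window exist in memory, matching the "stored locally for a few seconds, then discarded" behavior described above.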
Potential privacy concerns aside, many AI experts see the cardiac arrest detection skill as a significant advance.
“What these researchers have done is brilliant and a glimpse of an important evolution for smart speakers and voice assistants,” Bradley Metrock, an AI and voice expert and CEO of tech-focused Score Publishing, told Healthline.
Medical experts say the findings are interesting, but by the time Alexa can detect something is wrong, it may be too late.
Dr. Robert Glatter, an emergency department physician at Lenox Hill Hospital in New York City, says agonal breathing may start only after brain damage is already occurring.
“Lack of adequate blood flow to the brain for greater than three to five minutes — the amount of time it takes to result in irreversible brain damage — will typically occur well before the onset of agonal breathing,” he explained.
He says further research may help uncover other biomarkers that could help detect warning signs far earlier.
“I think the concept of monitoring breathing in a contactless fashion to detect a heart attack is a work in progress,” Glatter said. “Since early intervention is paramount in trying to save lives, we may need to evaluate biomarkers other than agonal breathing, since its appearance is a sign that death is imminent.”
It may seem odd to rely on a robotic device to call for help, but AI is increasingly being used to improve people’s health and to detect when something is wrong.
The reason is that AI is particularly good at detecting patterns, explained Marius Kierski, a partner with Sigmoidal, a company that specializes in machine learning and AI.
“I believe that AI advancements have a longer way to go to make it into everyday devices, because of the regulatory requirements,” he told Healthline.
Alexa also has a home security skill that can listen for sounds like glass breaking when you are not home. You just put your Amazon Echo (or Echo Dot) into security mode and tell it you are leaving. It listens.
In addition, Alexa-enabled devices already have a host of healthcare-related skills. Express Scripts, for example, lets patients check the status of a home delivery and get notifications when orders are shipped. Atrium Health, a healthcare system in the South, allows people to find an urgent care location near them and schedule a same-day appointment.
“You don’t have privacy concerns then because you’re not home to be eavesdropped on,” said Freddie Feldman, president and CEO of VocoLabs. He is creating Alexa skills and other conversational interfaces that can help patients. “It’s a little different [than the cardiac arrest skill technology], but very much the same in that they’re using AI to detect a certain pattern of sounds and then act on it.”
“I think the advancement is really great and interesting,” he added. “Having a device in the home that is connected and ‘always listening’ is actually a benefit in a use case such as this.”
Henry O’Connell, president and CEO of Canary Speech, believes these applications will be used primarily in hospitals and in clinical trials rather than the home.
O’Connell’s company is creating technology that integrates AI and machine learning for health. They are working to devise disease classification tools for Parkinson’s and Alzheimer’s disease, as well as conditions such as anxiety and depression.
O’Connell said hospitals and clinical trials may be a better fit for AI voice applications, in part because in those settings physicians must be very clear about how patients’ data will be used in order to get informed consent. Only then can their voice data be used to evaluate them for diseases.
Other AI and health applications are in the pharmaceutical industry, which uses AI in the drug research process.
There’s even a chance AI may rival doctors at spotting some diseases. A 2018 study found AI may beat dermatologists at detecting some signs of skin cancer.