The same artificial intelligence that may soon drive your new car is being adapted to help drive interventional radiology care for patients.
Researchers at the University of California, Los Angeles (UCLA), have used a form of advanced artificial intelligence known as machine learning to create a “chatbot,” or Virtual Interventional Radiologist (VIR).
This device communicates automatically with a patient’s physicians and can quickly offer evidence-based answers to frequently asked questions.
The scientists will present their research today at the Society of Interventional Radiology’s 2017 annual scientific meeting in Washington, D.C.
This breakthrough will allow clinicians to give patients real-time information on interventional radiology procedures as well as plan the next step of their treatment.
Dr. Edward W. Lee, assistant professor of radiology at UCLA’s David Geffen School of Medicine, and one of the authors of the study, said he and his colleagues theorized they could use artificial intelligence in low-cost, automated ways to improve patient care.
“The fundamental technology that has made self-driving cars possible is deep learning, a type of artificial intelligence modeled after the connections in the human brain,” Dr. Kevin Seals, resident physician in diagnostic radiology at UCLA Health and a study co-author, said in a Healthline interview.
Seals, who programmed the VIR, said advanced computers and the human brain have a number of similarities.
“Using deep learning, computers are now essentially as good as humans at identifying particular objects, making it possible for self-driving cars to ‘see’ and appropriately navigate their environment,” he said.
“This same technology can allow computers to understand complex text inputs such as medical questions from healthcare professionals,” he added. “By implementing deep learning using the IBM Watson cognitive technology and Natural Language Processing, we are able to make our ‘virtual interventional radiologist’ smart enough to understand questions from physicians and respond in a smart, useful way.”
How does VIR work?
Think of it as an initial, superfast layer of information gathering that can be used prior to taking the time to contact an actual human diagnostic or interventional radiologist, Seals said.
“The user simply texts a question to the virtual radiologist, which in many cases provides an excellent, evidence-based response more or less instantaneously,” he said.
He noted that if the user doesn’t receive a helpful response, they are rapidly referred to a human radiologist.
“Tools such as our chatbot are particularly important in the current clinical environment, which focuses on quality metrics and follows evidence-based clinical guidelines that are proven to help patients,” he said.
Seals said a team of academic radiologists curated the information provided in the application from the radiology literature, and it is rigorously scientific and evidence-based.
“We hope that using the application will encourage cutting-edge patient management that results in improved patient care and significantly benefits our patients,” he added.
“It can be thought of as ‘texting’ with a virtual representation of a human radiologist that offers a significant chunk of the functionality of speaking with an actual human radiologist,” Seals said.
When the non-radiologist clinician texts a question to the VIR, deep learning is used to understand that message and respond in an intelligent manner.
“We get a lot of questions that are fairly readily automated,” Seals said. “Such as ‘I am worried that my patient has a blood clot in their lungs. What is the best type of imaging to perform to make the diagnosis?’ The chatbot can respond to questions like this in a supersmart, evidence-based way.”
Sample responses, he said, can include instructive images (for example, a flowchart that shows a clinical algorithm), response text messages, and subprograms within the application — such as a calculator to determine a patient’s Wells score, a metric doctors use to guide clinical management.
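To make the Wells score subprogram concrete, here is a minimal sketch of what such a calculator might look like. The criterion weights follow the published Wells criteria for pulmonary embolism; the function and field names are illustrative assumptions, not taken from the VIR application itself.

```python
# Hypothetical Wells score calculator for pulmonary embolism, sketched
# in the spirit of the chatbot's calculator subprogram. Weights follow
# the standard published Wells criteria.

WELLS_CRITERIA = {
    "clinical_signs_of_dvt": 3.0,
    "pe_most_likely_diagnosis": 3.0,
    "heart_rate_over_100": 1.5,
    "immobilization_or_recent_surgery": 1.5,
    "previous_dvt_or_pe": 1.5,
    "hemoptysis": 1.0,
    "active_malignancy": 1.0,
}

def wells_score(findings):
    """Sum the weights of the criteria the patient meets."""
    return sum(WELLS_CRITERIA[name] for name, present in findings.items() if present)

def risk_category(score):
    """Traditional three-tier interpretation of the Wells score."""
    if score > 6:
        return "high"
    if score >= 2:
        return "moderate"
    return "low"

# Example patient: PE is the most likely diagnosis, heart rate over 100.
patient = {
    "clinical_signs_of_dvt": False,
    "pe_most_likely_diagnosis": True,
    "heart_rate_over_100": True,
    "immobilization_or_recent_surgery": False,
    "previous_dvt_or_pe": False,
    "hemoptysis": False,
    "active_malignancy": False,
}
score = wells_score(patient)
print(score, risk_category(score))  # 4.5 moderate
```

A chatbot could collect these yes/no findings through a short text dialogue and return the score with its risk tier, much as the article describes.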
The VIR application resembles an online customer service chat.
To create a crucial foundation of knowledge, the researchers fed the app more than 2,000 data points that simulated the common inquiries interventional radiologists receive when they meet with patients.
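The matching loop described above — a curated knowledge base, a similarity match against the incoming question, and a fallback to a human radiologist when nothing matches well — can be sketched as follows. This is a toy illustration under stated assumptions: the real VIR uses IBM Watson’s natural language processing, not the bag-of-words matcher shown here, and all names, sample entries, and the threshold are hypothetical.

```python
# Illustrative sketch of the chatbot's question-answer loop: match the
# clinician's question against curated entries, and refer to a human
# when no entry scores above a confidence threshold.

import re

# Two hypothetical curated entries standing in for the ~2,000 real ones.
KNOWLEDGE_BASE = [
    ("What imaging is best to diagnose a pulmonary embolism?",
     "CT pulmonary angiography is the first-line study for suspected PE."),
    ("How should I work up a suspected blood clot in the lungs?",
     "Estimate pretest probability (e.g., Wells score), then image accordingly."),
]

HUMAN_FALLBACK = "No confident match; please contact the on-call interventional radiologist."

def tokens(text):
    """Lowercased word set for a crude bag-of-words comparison."""
    return set(re.findall(r"[a-z]+", text.lower()))

def answer(question, threshold=0.2):
    """Return the best curated answer, or refer to a human below threshold."""
    q = tokens(question)
    best_score, best_answer = 0.0, HUMAN_FALLBACK
    for known_question, known_answer in KNOWLEDGE_BASE:
        k = tokens(known_question)
        score = len(q & k) / len(q | k)  # Jaccard similarity of word sets
        if score > best_score:
            best_score, best_answer = score, known_answer
    return best_answer if best_score >= threshold else HUMAN_FALLBACK

print(answer("What imaging should I order for pulmonary embolism?"))
print(answer("completely unrelated question"))
```

The second call falls through to the human-referral message, mirroring the escalation behavior the researchers describe.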
VIR becomes smarter with each use
When a referring clinician asks a question, the extensive knowledge base of the app allows it to respond instantly with the best answer.
The various forms of responses can include websites, infographics, and custom programs.
If the VIR determines that an answer requires a human response, the program will provide contact information for a human interventional radiologist.
The app learns as clinicians use it, and each scenario teaches the VIR to become smarter and more powerful, Seals said.
The nature of chatbot communications should protect patient privacy.
Confidentiality is “critically important in the world of modern technology and something we take very seriously,” Seals said.
He added that the application was created and programmed by physicians with extensive HIPAA (Health Insurance Portability and Accountability Act of 1996) training.
“We are able to avoid these issues because users ask questions in a general and anonymous manner,” Seals said. “Protected health information is never needed to use the application, nor is it relevant to its function.”
All users — professional healthcare providers such as physicians and nurses — must agree to not include any specific protected patient information in their texts to the chatbot, he added.
None of the diverse functionality within the application requires specific patient information, Seals said.
Improving speed and efficiency of care
This new technology represents the fastest and easiest way for clinicians to get the information they need in the hospital, starting with radiology and eventually expanding to other specialties such as neurosurgery and cardiology, Seals said.
“Our technology can power any type of physician chatbot,” he explained. “Currently, there are information silos of sorts that exist between various specialists in the hospital, and there is no good tool for rapidly sharing information between these silos. It is often slow and difficult to get a busy radiologist on the phone, which inconveniences clinicians and delays patient care.”
Other clinicians at the UCLA David Geffen School of Medicine are testing the chatbot, and Seals and Lee say their technology is fully functional now.
“We are refining it and perfecting it so it can thrive in a wide release,” Seals said.
Seals’ engineering and software background allowed him to perform the necessary programming for the as-yet unfunded research project. He said he and his colleagues will seek funding as they expand.
This breakthrough technology will debut soon.
The VIR will be made available in about one month to all clinicians at the UCLA Ronald Reagan Medical Center. Further use at UCLA will help the team to refine the chatbot for wider release.
The VIR could also become a free app.
“We are exploring potential models for releasing the application,” Seals said. “It may very well be a free tool we release to assist our clinician colleagues, as we are academic radiologists focused on sharing knowledge and improving clinical medicine.”
The researchers described the importance of the VIR in a summary of their findings: “Improved artificial intelligence through deep learning has the potential to fundamentally transform our society, from automated image analysis to the creation of self-driving cars.”