ORLANDO—AI-enabled virtual care assistants are reshaping patient care and augmenting healthcare professionals’ capabilities, but providers need to know how to build trust as the technologies evolve, experts said during the HIMSS24 preconference Virtual Care Forum on Monday.
Matt Cybulsky, founder and consultant at Ionian Healthcare Consulting, led the discussion on building trust in virtual agents and chatbots. Andy Chu, senior vice president of product and technology incubation at Providence Health, and Kathleen Mazza, clinical informatics consultant at Northwell Health, joined the panel.
Mazza said Northwell has been using chatbots since 2018, a move she called “fortuitous” as it allowed the organization to be “ahead of the curve” when New York became the COVID-19 epicenter in 2020.
“We started with our Medicare population, trying to reduce those avoidable readmissions and using the chatbots for the first 30 days following discharge with very targeted chats developed for heart failure, COPD, stroke, the high-risk diagnoses that Medicare had identified,” she said.
Mazza said patients want to be connected to the health system even outside the walls of the hospital, and it is up to the organization to provide that connectivity.
Chu added that many patients now view technologies such as chatbots as akin to text messaging. The key is being proactive in determining why patients turn to these platforms.
“Close to 40% of the messages our patients ask the chatbot have nothing to do with clinical questions—they’re administrative questions: billing, appointment booking, or medication questions,” he said.
Providence recently released a feature where patients can ask a chatbot if they qualify for financial assistance.
“Those are the sorts of things we’re trying to do to be proactive as patients are trying to find care and as they are trying to navigate across the system,” Chu said.
Significant challenges remain in ensuring that the AI algorithms behind the interface operate correctly.
“We start seeing certain categories where we’re not meeting the mark, which we call drift,” Chu explained. “Then we go into the model and see what’s going on and how we must continue to evolve.”
Mazza said it’s crucial to ensure patients’ conversations with chatbots and virtual agents are meaningful.
“If the conversation turns into a 10-minute task, it becomes cumbersome for the patient,” she cautioned. “If you’re not going to act on the information, be cautious about just adding additional questions.”
Both Mazza and Chu agreed that building patient trust and confidence in the chatbot is essential and will take time. They noted that ensuring a quick and seamless connection to a human presence must always be a priority.
“That chatbot we use has a phone icon on the screen, and if the patient answers a series of questions that raises an alarm, a nurse will get an alert and ask to speak to the patient,” Mazza said. “You have to have a human at the end somewhere. You’re not selling shoes to people online. This is healthcare.”