A recent Gartner survey found that the majority of customers are “AI shy”: 64% say they would rather companies not incorporate AI into the customer experience. Customers were also concerned about AI delivering incorrect information (42%), data security (34%) and bias/inequality (25%).
Ethical AI can help organizations create innovative, trustworthy user experiences – protecting brands and allowing them to maintain a competitive edge and foster better customer relationships. And ethical AI is part of the story at WellPower.
THE PROBLEM
In the mental health field, there are not enough therapists to help everyone experiencing problems. Community mental health centers such as WellPower in Colorado serve some of the most vulnerable populations needing help.
Because of the complex needs of those being served, WellPower clinicians face more complex documentation rules than therapists in private practice. These additional rules create an administrative burden that takes time that may otherwise have been spent on clinical care.
WellPower had been looking at how technology might serve as a workforce multiplier for mental health.
The provider organization turned to Iliff Innovation Lab, which works with AI, to see how health IT might enable people to connect to their care more easily, such as through telehealth; how people might move through treatment more rapidly, by facilitating high-fidelity evidence-based practices and remote treatment monitoring; and how WellPower might reduce administrative burden, by helping therapists generate high-quality, accurate documentation while focusing more of their time on delivering care.
“When used correctly, clinical documentation is a particularly promising area for AI implementation, especially in behavioral health,” said Wes Williams, CIO and vice president of WellPower. “Large language models have proven especially adept at summarizing a lot of information.
“In a typical 45-minute psychotherapy session, there is a lot of information to summarize to document the service,” he continued. “Staff frequently spend 10 or more minutes completing the documentation for each service, adding up to hours that could otherwise be spent delivering clinical care.”
PROPOSAL
WellPower’s commitment to health equity drives how it approaches technology implementation, which made the partnership with Iliff essential to continuing that mission, Williams said.
“AI tools are often black boxes concealing how they make decisions, and they can perpetuate biases that have led to the healthcare disparities faced by the people we serve,” he explained. “This places us in a bind, since not using these emerging tools would deny their efficiencies to the people who need them most, but adopting them without evaluating for bias could increase disparity if an AI system had historical healthcare biases baked into it.
“We found a system that leveraged AI as a passive listening tool that could join therapy sessions (both telehealth and in-person) and serve as a sort of digital scribe, generating draft notes for our clinicians to review and approve,” he added. “We needed to ensure the digital scribe could be trusted, however, to generate summaries of the therapy sessions that were accurate, useful and unbiased.”
Behavioral health data is some of the most sensitive, from a privacy and security standpoint; these protections are needed to ensure people are comfortable seeking the help they need, he continued. Because of this, it is critical that WellPower thoroughly vets any new system, especially an AI-based one, he said.
RESULTS
To implement the AI digital scribe, WellPower needed to ensure it didn’t compromise the privacy or safety of the people it serves.
“Many therapists were initially hesitant to try the new system, citing these valid concerns,” said Alires Almon, director of innovation at WellPower. “We worked with the Iliff team to ensure the digital scribe had been ethically built with a privacy-first mindset.
“An example: The system does not make a recording of the therapy session, but rather codes the conversation on the fly,” she continued. “This means at the end of the session, the only thing that is stored is the metadata on what topics were covered during the session. With the insights from the team at Iliff, we were able to ensure the privacy of our patients while opening up more time for care.”
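The "code on the fly" design Almon describes can be illustrated with a minimal sketch: each utterance is mapped to coarse topic codes as it arrives, and the verbatim text is then discarded, so only metadata survives the session. This is a hypothetical illustration of the pattern, not Eleos's actual implementation; the topic keywords and function names here are invented for the example.

```python
# Hypothetical sketch of a privacy-first "code on the fly" pipeline.
# Raw transcript text is processed per utterance and discarded;
# only coarse topic metadata survives the session.
# (Illustrative only -- not how the Eleos system actually works.)

TOPIC_KEYWORDS = {
    "sleep": {"sleep", "insomnia", "tired"},
    "anxiety": {"anxious", "worry", "panic"},
    "medication": {"medication", "dose", "prescription"},
}

def code_utterance(text: str) -> set:
    """Map one utterance to topic codes; the text itself is not kept."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return {topic for topic, kws in TOPIC_KEYWORDS.items() if words & kws}

def run_session(utterances):
    """Accumulate only topic metadata; raw utterances go out of scope."""
    covered = set()
    for utterance in utterances:  # e.g. streamed from live transcription
        covered |= code_utterance(utterance)
        # the utterance string is discarded here; nothing verbatim is stored
    return sorted(covered)

session_topics = run_session([
    "I haven't been able to sleep this week.",
    "The worry gets worse at night.",
])
print(session_topics)  # -> ['anxiety', 'sleep']
```

The key property is that at session end, only the accumulated topic set remains; no recording or transcript exists to secure or breach.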
The application of an AI assistive platform to support transcription and develop draft progress notes has greatly improved the therapeutic experience for both staff and the people WellPower serves, she added.
“Since adopting the Eleos system, WellPower has seen a significant improvement in the staff’s ability to complete their progress notes,” Almon reported. “Three out of every four outpatient therapists are using the system.
“For this group, mean time to complete documentation has improved by 75%, and total documentation time is down 60% (reducing note-writing time from 10 to four minutes),” she said. “Our therapists have been excited to engage with Eleos to the point where some have stated they would think twice about leaving WellPower because of their experience with Eleos.”
ADVICE FOR OTHERS
Artificial intelligence is a new and exciting venture for health IT, but it comes with its own unique baggage, shaped by science fiction, media hype and the realities of its capabilities, Almon noted.
“It is important for your organization to educate and define AI for your staff,” she advised. “Explain how it will be used and the processes and policies that will be put in place to protect them and their clients. AI is not perfect and will continue to evolve.
“If possible, before you start to deploy AI-enabled tools, take a pulse check to assess your staff’s level of understanding of AI and how they feel about it,” she continued. “Partnering with a program like Iliff’s Trust AI framework not only helps you select ethical technology, but also communicates that your organization has reviewed the harms that can happen because of AI-enabled platforms.”
That communication is more important than the results themselves, she added.
“Finally, reassure your staff they cannot be replaced by artificial intelligence,” she concluded. “Human relationships are the most important relationships in the healing of individuals. Artificial intelligence is there to assist humans in their roles; it is an assistive technology. AI can support and assist, but it never replaces a therapeutic connection.”
Follow Bill’s HIT coverage on LinkedIn: Bill Siwicki
Healthcare IT News is a HIMSS Media publication.
Source: Healthcare IT News