Amanda DiTrolio and Dr. Andrew Lundquist

A Guide to AI for Clinical Documentation with Dr. Andrew Lundquist

October 11, 2023
Community Wisdom

TL;DR: We sat down for a recent Lunch & Learn session with Dr. Andrew Lundquist (CMO, Mankato Clinic) to discuss his recent articles evaluating the use of AI in healthcare documentation (here and here). We focused on his broader takeaways: how AI can be leveraged to alleviate the documentation burden on clinicians, and the key clinical considerations startups in the space should keep in mind when building such solutions.

Below is a summary of our conversation. 

– 

Table of Contents 

AI Documentation Tools Presentation

  • Current State of Affairs
  • AI Dictation Market Explosion 
  • Elements to Evaluate Vendor Solutions

Q&A 

  • Q1: Apart from the performance criteria you highlighted earlier - speed, cost-effectiveness, and accuracy - what other crucial functional factors should be considered when evaluating various vendors?
  • Q2: What are the substantial product differences (e.g. workflow, model, model training, regulatory compliance, etc.) between AI documentation tools for clinics compared to consumer or SaaS AI documentation tools (e.g. Otter.ai)? Do you see the consumer and healthcare worlds merging at all in that regard?
  • Q3: There are dozens of companies popping up monthly, and some of them are leapfrogging each other with how quickly they can get to market. It's exciting to try to test them out, but I'm always worried about sensitive data leaking out. And so far, all of them seem to be storing the transcriptions on their servers. How would you think about assessing whether an organization can safely handle data and security issues?
  • Q4: Have you ever experienced a situation where the technology fails and you come out of a 45-minute conversation with no notes of the discussion?
  • Q5: For the non-clinicians here, like myself, who haven't experienced a demo or seen this in a visit: is there a generalizable workflow across these companies? For example, are you clicking record on your computer before the visit as you're walking into the room, and then getting an email two days later with a Word doc of all the notes that you paste into the EMR?
  • Q6: As the CMO of a 180-doc practice thinking about implementing these tools, how much of the decision is about the technology itself versus gaining adoption within your provider group? I imagine there's a lot of change management that needs to happen among those 180 clinicians to get them to actually use the technology. How much of the conversation is about organizational readiness for that sort of change across the entire provider group?
  • Q7: Do these platforms come with end user agreements? Is there variation in risk across vendors?
  • Q8: Have you been able to quantify the ROI of these tools? (E.g. productivity improvement, reduction of time spent, appropriateness of billing, optimizing billing, etc.)

You can check out the full recording of the live discussion below.

Now, onto the highlights!

AI Documentation Tools Presentation

Current State of Affairs

See full recording snippet at 6:52 here.

  • According to a 2016 study, during the office day physicians spent 27% of their total time on direct clinical face time with patients and 49.2% of their time on EHR and desk work. While in the examination room with patients, physicians spent 52.9% of the time on direct clinical face time and 37% on EHR and desk work. The 21 physicians who completed after-hours diaries reported 1 to 2 hours of after-hours work each night, devoted mostly to EHR tasks.
  • Drew provides a personal anecdote speaking to the mental burden of patient care:

I often discuss the mental strain healthcare providers experience when caring for patients. For instance, if an average physician sees around 10 to 15 patients in a half-day session, either in the morning or afternoon, they typically have only 10 to 20 minutes per visit. During this short time, they must ensure thorough documentation of all relevant information, leaving them with minimal opportunity to catch up on their patient load. Consequently, they often find themselves mentally juggling multiple patients' details, like recalling issues they might have forgotten to note from earlier encounters. This constant mental juggling can prevent healthcare providers from being fully present with each patient, even as they strive to ensure comprehensive care and documentation.

AI Dictation Market Explosion

See full recording snippet at 8:30 here.

General Timeline of Events

  • October 2022: AI documentation is hard to explain and few companies exist
  • November 2022: ChatGPT launches
  • October 2023: Dozens of solutions for AI documentation exist on the market

Elements to Evaluate Vendor Solutions

See full recording snippet at 11:38 here.

Elements to consider:

  • Clunkiness kills: The reality is, most clinicians document the way they have historically been trained. So, if you offer them a tool that is in any way clunky, they likely won't use it. For instance, if you have to dictate on your phone but then slide the note over to the EHR on your computer, that causes friction for the provider.
  • Avoid redundancy of work: Don’t create a solution that requires duplicative processes. For example, if a clinician captures a note through dictation, but then has to copy and paste it into each portion of the form, it will take more time and also kill adoption of the tool. 
  • EHR integration: At this point, this is a ‘nice-to-have’ feature, but not a requirement. Providers are really looking for a tool that can efficiently and accurately document what is happening in the room. The EHR integration will come.

A Glimpse Into How Providers Think About Solution Priorities

See full recording snippet at 13:07 here.

In short, when clinicians evaluate the gamut of solutions on the market, the biggest priorities are:

  • Be quick
  • Be cheap 
  • Be exact

Drew provides an anecdote to how he evaluates vendors:

There's pricing, there's speed, and then there's work input by the clinician. Some of these solutions are really expensive, and I just talked about cost in medicine and how we don't have a lot of money to throw at things. If it's a really expensive tool and we have to put that cost on our clinicians, it's almost to the point where they have to go to their spouse and say, “Hey, we're actually making quite a bit less money because I need something to help me do my work.” It's that discussion. And it better be quick, it better get you home early, and it better require very little work input.

Now, if there's a tool that's a little less expensive, like maybe the cost of a cable bill, then you don't even have to have that discussion. And it can be a little slower, it can have a little more work input from the clinician. 

So these companies are coming in with different pricing models and different speeds of work. And I think one of the biggest things is that we just need to be okay with 70% of the work being done, which I think is the discussion within the room. A lot of the people we've had pilot these are okay doing some of the work; they just need all of that redundant work taken away from them.

Q&A

Q1: Apart from the performance criteria you highlighted earlier - speed, cost-effectiveness, and accuracy - what other crucial functional factors should be considered when evaluating various vendors?

See full recording snippet at 16:31 here.

Cost and reduction in amount/burden of work on clinicians are still the two most important criteria. Beyond that, other considerations / ‘nice-to-have’ things include:

  • Can it help us code more accurately and automatically?
  • Can we do RAF scoring? Can we get our HCC scores? Can we get recapture?
  • How good is the tool at writing the emotional / subjective, patient-reported elements of a note?

Q2: What are the substantial product differences (e.g. workflow, model, model training, regulatory compliance, etc.) between AI documentation tools for clinics compared to consumer or SaaS AI documentation tools (e.g. Otter.ai)? Do you see the consumer and healthcare worlds merging at all in that regard?

See full recording snippet at 22:22 here.

  • There may be some merging, but ultimately you do need solutions that are built with a more healthcare-specific context and filtering.
  • Then, behind the scenes, prompt engineering should also accommodate healthcare-focused needs. 

Drew provides a real world anecdote:

I think there needs to be that healthcare focus because I've tried to use my healthcare documentation tool for other conversations and it doesn't work very well. And I've tried to use other tools just to take notes in meetings and they don't do the same. So I think there needs to be that healthcare focus, and then more layers of the other technology that's coming along to help really narrow all that down to the right thing.

So I would say that, in my experience and with what I'm seeing in the market, you definitely need the healthcare focus. Especially if you go into risk adjustment factors and diagnoses.

Q3: There are dozens of companies popping up monthly, and some of them are leapfrogging each other with how quickly they can get to market. It's exciting to try to test them out, but I'm always worried about sensitive data leaking out. And so far, all of them seem to be storing the transcriptions on their servers. How would you think about assessing whether an organization can safely handle data and security issues?

See full recording snippet at 22:40 here.

  • At the end of the day, there will be some level of risk tolerance that exists. 
  • You should prioritize companies that have a history of demonstrating safe and best practices around storing patient information - this is something an IT team or HIPAA officer would be responsible for assessing. 
  • You can also evaluate the background of the founders and early team – a background in security, data privacy, and related areas can serve as a barometer too.
  • In the clinical setting, we also implement a disclosure layer prior to the appointment that clearly asks patients if they are okay with the provider using technology during the visit.

Q4: Have you ever experienced a situation where the technology fails and you come out of a 45-minute conversation with no notes of the discussion?

See full recording snippet at 27:48 here.

Drew provides an anecdotal response:

Yes, I forgot to turn it on or hit the record button once, and I was super relaxed, just listening in, typing in my orders, and then it's gone. And I think that can happen. Back in the day, there were times I was doing my voice-to-text and it didn't work.

One thing to consider is that there are certain vendors / technologies that will get these notes back to you the next day or the day after. My brain doesn't hold information as long as it used to, so if I think that it's all recorded and then two days later I have no recording, then I have very little recollection. So it kind of comes back to that speed, price, and input criteria framework. If it's coming back right away and I can assess it or redo it at that time, that is very valuable to me. But if I'm paying a lot of money for a tool that's going to get back to me two days later, and then when it fails it's going to take me 15 minutes to think things through again, that's difficult.

Q5: For the non-clinicians here, like myself, who haven't experienced a demo or seen this in a visit: is there a generalizable workflow across these companies? For example, are you clicking record on your computer before the visit as you're walking into the room, and then getting an email two days later with a Word doc of all the notes that you paste into the EMR?

See full recording snippet at 29:47 here.

The workflow across vendors is certainly varied. On one end, you hit ‘record’ and, within about 30 seconds of the end of your visit, you get a document that you can copy and paste over. On the other end, you hit ‘record’, the audio goes off to a central location, the platform puts your note into the chart via an API, and two days later it comes back as a note that you have to review and sign.

Q6: As the CMO of a 180-doc practice thinking about implementing these tools, how much of the decision is about the technology itself versus gaining adoption within your provider group? I imagine there's a lot of change management that needs to happen among those 180 clinicians to get them to actually use the technology. How much of the conversation is about organizational readiness for that sort of change across the entire provider group?

See full recording snippet at 32:00 here.

  • This is critically important. As with any organization, you will have providers who are about to retire and who will quit if you change their workflow. On the other hand, you have new providers just coming out of residency who may have already used these tools or are simply more willing to test them. Change management comes up again and again across organizations, and it takes a long time; people are slow to adopt new technologies when they are used to doing things the traditional way.
  • To help alleviate some of the change management issues, it can be beneficial to offer multiple options to your org’s providers. For example, they can use voice recognition, participate in pilot programs for AI, or continue to type notes. Finding what works for people is what will work in the long run.  

Q7: Do these platforms come with end user agreements? Is there variation in risk across vendors?

See full recording snippet at 35:30 here.

Yes, each vendor required signing a BAA.

Further, there are a variety of risk points when it comes to these tools, just as there is with traditional dictation / voice-to-text software. One lens to consider: is the transcript being retained by the vendor or not? If it’s not being retained, there is less risk associated with that. 

Q8: Have you been able to quantify the ROI of these tools? (E.g. productivity improvement, reduction of time spent, appropriateness of billing, optimizing billing, etc.)

See full recording snippet at 37:40 here.

In a recent pilot we conducted with one of the more expensive tools, we evaluated changes in the following: 

  • Improvements in provider coding levels
  • Patient satisfaction
  • Clinician satisfaction
  • Increase in # of visits
  • Etc.

Then we modeled out the question for providers: ‘At X amount of dollars to spend on the tool, you would then have to see Y more patients per week to make this work financially.’ In other words, getting to the break-even or net-positive point that justifies purchasing the tool.
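To make that framing concrete, here is a minimal sketch of that kind of break-even math. The figures (tool cost per clinician per month, net revenue per added visit) are purely hypothetical assumptions for illustration, not numbers from the pilot.

```python
# Hypothetical break-even math for an AI documentation tool.
# All dollar figures below are illustrative assumptions, not pilot data.

def breakeven_extra_visits_per_week(
    tool_cost_per_clinician_per_month: float,
    net_revenue_per_visit: float,
    weeks_per_month: float = 4.33,
) -> float:
    """Extra visits per clinician per week needed to cover the tool's cost."""
    extra_visits_per_month = tool_cost_per_clinician_per_month / net_revenue_per_visit
    return extra_visits_per_month / weeks_per_month


if __name__ == "__main__":
    # Assumption: the tool costs $400 per clinician per month and an added
    # visit nets roughly $100 after expenses.
    extra = breakeven_extra_visits_per_week(400.0, 100.0)
    print(f"Break-even: ~{extra:.1f} extra visits per clinician per week")
    # Under these assumptions, roughly one extra visit per week reaches
    # break-even; anything beyond that is the net-positive result.
```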

Conclusion

That’s all for now, folks! We had a great time chatting with Drew, and appreciate him taking the time to share his knowledge with our fellow HTNers.

If you made it through this and are finding that it sparked some additional questions for you, we’ve included Drew’s contact information in case you’d like to reach out.

Dr. Andrew Lundquist: lundquistdpm@gmail.com