Cedars-Sinai is the first health system to try out the Nurse Assistant app from tech startup Aiva Health, which transcribes nurses’ voice dictation of patient encounters and then uses that information to fill out fields in the Epic EHR. Becker’s caught up with Rachel Coren, vice president and associate CIO at Cedars-Sinai, about how the tool has been working so far.
Question: How does Cedars-Sinai measure the success of the pilot program, and what key performance indicators are being tracked?
Rachel Coren: Adoption and feedback from our clinical staff drove the development and will continue to guide us. Our nurses and nurse aides have offered extremely positive feedback about their experience with the new assistant, and we’re starting to get a lot of interest from other nursing units, departments and our affiliate hospitals.
We’re seeing an improvement in real-time documentation. And while we aren’t able to draw a direct correlation, we have seen a very positive increase in our patient experience outcomes on this pilot unit.
This reinforces our long-standing belief and commitment that when you care for your clinicians, who in turn care for your patients, you get better outcomes. We’re also seeing a modest reduction in overtime for some of the highest utilizers. This assistant has the potential to reduce that administrative burden and hopefully improve clinician well-being, letting them leave on time at the end of their shift once they’ve completed everything they need to do.
Q: How does this tool compare to other existing solutions on the market? How unique is it?
RC: We’re going to continue to see a rapid advancement in this type of technology in our industry. We’ve seen, and you’ve reported, a lot of organizations announcing partnerships. We’re excited we were able to accelerate our implementation and use this at the bedside.
You’ve also reported a lot about the industry piloting this type of technology to address physician burnout, which we’re also doing, but we’re committed to finding a solution for our nurses.
The way nurses document is very different from the way physicians do. The fact that this Aiva assistant integrates with our EHR and can capture a conversation and translate it into discrete documentation fields is, I think, unique at this time, and it’s really what’s adding value for our nurses and our clinical partners.
Q: Have patients given feedback on this AI tool? Or can patients not tell it’s being used?
RC: We’re getting very positive feedback from our patients, both in conversation on the units and from our patient experience scores and comments. We’ve had Alexa devices in many of these rooms for a long time, so the concept of having a voice assistant be part of engaging with your care team is not unfamiliar to our patients. But they’re excited for our nurses, and they’re realizing they’re getting more time with their nurse.
Q: Is there a financial ROI for this program? How is that being tracked?
RC: The focus is around clinician well-being, because we do believe there are outcomes tied to that. As mentioned, we are seeing some reduction in overtime hours for our highest utilizers, and they’re telling us this is attributable to the use of the assistant. So there could be some financial savings there.
We’re tracking a lot of different KPIs, and we’re looking to learn: How do you measure ROI on these kinds of tools? The vendors are trying to figure this out as well: how to price these products and how to demonstrate value.
Q: Are there any unintended consequences or ethical concerns with using AI in nursing?
RC: Cedars-Sinai takes a very thoughtful approach to all of these new capabilities. We have an AI council that includes an ethicist to help guide our governance of and approach to these tools. Our AI validation framework assesses for bias both at the outset and on an ongoing basis. We know these things can drift.
For this use case, the request came from our nurses. We did many rounds of comprehensive end-to-end testing with the vendor before releasing it to nursing. Then we started with a small-scale pilot to bring it into the field and learn. That was very successful, and we were able to quickly expand to a full unit.
We’re also tracking how many times a nurse rejects the assistant’s summary of the conversation, to make sure the tool is accurately capturing what is being shared. Thus far, rejection rates remain very low, so that’s positive for us, but it’s something we’re going to continue to track very closely.
Q: Does this platform require a lot of training?
RC: It is relatively intuitive, but we did design a training program for this pilot. We decided to start with in-person sessions during a nursing skills lab day and engaged our clinical partners at the change of their shift.
We had some presentations, a demo and the opportunity for everyone to ask questions. We then engaged our unit super users, and partnered with our informatics team and the vendor to provide go-live support.
We were hands-on in the beginning to make sure everything we tested was what we experienced in the field. As we continue through this pilot, we’re gathering that feedback and learning about adoption rates and how intuitive it is.