Los Angeles-based UCLA Health is going big on big data, using it not only to streamline operations and gauge the patient experience but also to predict disease progression and identify the rarest of conditions, said Chief Data Officer Albert Duntugan.
Becker's recently interviewed Mr. Duntugan, who has been with the health system since 2007 and became chief data officer in 2020, about UCLA Health's most effective uses of big data and AI and where he sees this technology going next.
Note: This conversation has been condensed and lightly edited for clarity.
Question: What specific initiatives have come out of your team's work?
Albert Duntugan: With AI, there's this magic of math, where a prediction can be made with inputs that are automated through computers rather than being dependent on just people.
We have a portfolio of about a dozen AI models that touch on all aspects of healthcare.
On the operations side, we're trying to streamline operations in our clinics and in our hospital.
We also have models that try to predict the onset of a disease. Is a patient going to develop a condition that may bring them into our hospitals soon, through our emergency department or elsewhere? Will a chronic kidney disease patient progress to the point where they'll need elevated monitoring from a nephrologist?
Finally, we also have a set of models looking at the patient experience: patients who've had a number of their appointments canceled or who have shown up late. Is there something going on where we need to make resources more available to them so they're happier with their experience in our health system?
Q: There are differing opinions on the usefulness of AI in healthcare. Where do you stand on that topic?
AD: We have a patient transfer center that receives requests from providers in our area to have their patients admitted to our hospitals.
There's a triage team that's evaluating the requests. If it's a patient with acute needs, then maybe they need to be directed to our Ronald Reagan facility. Or if it's not that acute, they can go to our Santa Monica hospital, which is more of a community-based facility. So one of our algorithms has been predicting whether patients are going to be at that acute level or not.
Another operational example is what we call the "rising risk model." Is the patient likely to end up in one of our emergency rooms within the next 30 days? Do we need to flag that patient for an intervention?
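To make the idea concrete, a rising risk model of this kind is essentially a binary classifier over utilization and comorbidity features. The sketch below is purely illustrative: it trains on synthetic data with hypothetical features (age, prior ED visits, chronic condition count, missed appointments) and is not UCLA Health's actual model.

```python
# Illustrative sketch only: a generic 30-day ED-visit risk classifier on
# synthetic data, NOT UCLA Health's actual "rising risk" model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical features a model like this might use.
age = rng.integers(18, 90, n)
prior_ed_visits = rng.poisson(0.5, n)
chronic_conditions = rng.poisson(1.2, n)
missed_appointments = rng.poisson(0.3, n)
X = np.column_stack([age, prior_ed_visits, chronic_conditions, missed_appointments])

# Synthetic label: the chance of an ED visit in the next 30 days rises
# with utilization history and comorbidity burden.
logit = -3.0 + 0.6 * prior_ed_visits + 0.4 * chronic_conditions + 0.3 * missed_appointments
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Score patients and flag the highest-risk ones for outreach.
risk = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, risk), 3))
flagged = np.argsort(risk)[::-1][:50]  # top 50 patients by predicted risk
```

In practice, the output of a model like this would feed a worklist for a care team rather than trigger anything automatically, which is consistent with the human-in-the-loop triage workflow described above.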
We also have a partnership with the Regeneron Genetics Center to do whole-exome sequencing on a subset of our patients, about 40,000 of them. We do a very thorough analysis of their genomes to see if we can find rare diseases that we can potentially act on.
So in these cases, there's a lot of promise, but there's also a lot of education that needs to happen with the primary care physician population. And infrastructure needs to be built up. For example, we're building up our teams of genetic counselors, who will be looking at the data that's coming back from Regeneron.
Q: Have you gotten any positive results from these use cases?
AD: The sentiment from the stakeholders has been positive. Those triage teams feel their work is a lot easier, that they're surfacing patients they may not have paid attention to, and that it's relieving some of the cognitive overload they've been facing with COVID and all the other stressors.
The screening algorithms, by their nature, are targeting a small subset of the population. For example, in that genomics work it's anticipated that maybe 1-3 percent of patients would have a clinically actionable finding over the next three years. So, although that may not sound like a lot, it's going to be very important for that small population that is pinged, and it'll allow us to build up our capability, where we're learning a lot from this small cohort and we can extend it to many others down the line.
Q: How does your work intertwine with UCLA Health's digital health and virtual care initiatives?
AD: After surgery, when patients are discharged and have to manage their wounds at home, they are surveyed on an app: How are you feeling today?
This way, we get their psychosocial disposition, in addition to getting a fact-based picture where they're taking photos of their wounds, whether it's through their own device or through an iPad we give them.
So when their treating physician gets the data, it has both the images of their wounds to see how the wound is healing over time, and also how the patient is doing — how's their diet? How are they eating? This is a set of self-reported data we may not get in the traditional EMR sense where a doctor or nurse is doing all the data collection.
A decade or so ago, when meaningful use and the EHR incentive program were big things, just knowing how to get data out of an EHR was a major accomplishment. But today, after all of that investment, it's nice to see that we've moved along to handling these far more complex, so-called big data sets: genomics, imaging, waveforms that come from electrocardiograms.
Q: Is data processed quickly enough to gather actionable insights from it?
AD: What makes healthcare unique is that our data is sparse. When something bad happens — when a patient gets admitted to the hospital and procedures are done to them there — that's when you see that high spike in data collection. But we want to try to anticipate those important clinical events where we can do an early intervention.
So does it make sense to increase the frequency of collection, especially now that Apple Watches and these devices are becoming more ubiquitous? Would patients be willing to share more data with us so there's more of a chance we can catch something? These are the social challenges we need to address.
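As a rough illustration of what "sparse" means here, the sketch below puts a handful of episodic EHR encounters and a daily wearable signal on the same patient timeline. The data and field names are synthetic and hypothetical; this is only a sketch of the contrast, not a UCLA Health pipeline.

```python
# Illustrative sketch: contrasting sparse, episodic EHR events with a
# continuous wearable stream on one synthetic patient timeline.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Sparse clinical events: a handful of encounters over a year.
ehr_events = pd.DataFrame(
    {"event": ["office_visit", "ed_visit", "admission"]},
    index=pd.to_datetime(["2024-02-10", "2024-06-03", "2024-06-04"]),
)

# Dense wearable signal: one resting-heart-rate reading per day.
days = pd.date_range("2024-01-01", "2024-12-31", freq="D")
wearable = pd.DataFrame(
    {"resting_hr": 62 + rng.normal(0, 3, len(days)).cumsum() * 0.05},
    index=days,
)

# A weekly summary of the wearable stream, with the sparse events joined on,
# shows how much of the year falls between clinical touchpoints.
weekly = wearable.resample("W").mean()
weekly["events_that_week"] = (
    ehr_events.resample("W").size().reindex(weekly.index, fill_value=0)
)
print(weekly.head())
```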
Q: Do you have any digital health or Big Tech collaborations that advance your data work?
AD: The infrastructure that's required to power this big data is very sophisticated, especially when you're an academic health system that's doing translational work between the research side and the clinical side.
Microsoft has been a key partner for us in the cloud. In order to make this big data happen, we have to spin up very powerful infrastructure that's difficult to do within our traditional data centers. Being able to do this in the cloud with Microsoft Azure has been very helpful.
This was important during COVID. We were fortunate enough to receive an award from Microsoft to do really groundbreaking research collecting data on COVID patients from across the University of California system. That infrastructure was in the Microsoft Azure cloud. And that same infrastructure is being leveraged on the genomics side.
Q: Where do you see this work going in the future?
AD: We have about a dozen AI models in our portfolio, and that number is growing. Does success in the future mean we have five dozen models in five years? Or, when we talk about all of our care providers, does it mean that AI permeates all of their work?
Or does it mean we'll continue to be hyper-specialized? We've seen other organizations where success means not having a dozen models but having three, year over year.
In addition to having these hyper-specialized data scientists, it's important to be side by side with our doctors and nurses, looking at what they're doing on the front line and really understanding what the patient journey looks like. Then, as data is generated from those experiences, we can appreciate how that data can be used for AI.