Here, Todd Crosslin, Global Head of Healthcare and Life Sciences at Snowflake, discusses the current components and challenges of data in precision healthcare.
Editor's note: Responses have been edited for clarity and length.
Be sure to check out the Becker's Healthcare & hc1 Precision Health Virtual Summit to hear exclusive interviews and sessions with industry experts as they discuss the challenges and opportunities around value-based care and precision health in today's healthcare landscape. Register today here.
Question: What role does data have to play in the delivery of precision healthcare?
Todd Crosslin: Two extremely valuable, but challenging, data components in precision healthcare are DNA and real-world data/evidence. Being able to join these at scale generates precision insight. Imagine a virtual, decentralized clinical trial in which the pharmaceutical sponsor, CRO, healthcare provider(s), and FDA could collaborate in real time on a single copy of a combined dataset that includes the participants' diagnostics, longitudinal Rx data, claims data and EMR history, VCF (genomic variant call format) files, and DICOM data/images. This is possible today through advanced cloud data platform methods whereby the patient and healthcare provider generate and access PHI, while all other parties see an anonymized view.
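The idea of one dataset with role-dependent views can be illustrated with a minimal sketch. All record fields, role names, and the masking rule below are hypothetical; platforms like Snowflake enforce this kind of policy in the database layer (e.g. via secure views and masking policies), not in application code.

```python
# Minimal sketch: role-based views over a shared clinical record.
# Field names, roles, and the pseudonymization scheme are invented
# for illustration only.

PHI_FIELDS = {"name", "date_of_birth"}       # visible only to privileged roles
PRIVILEGED_ROLES = {"patient", "provider"}

def role_view(record, role):
    """Return the full record for patient/provider roles; otherwise
    return a copy with PHI fields removed and the patient ID
    replaced by a pseudonym."""
    if role in PRIVILEGED_ROLES:
        return dict(record)
    anon = {k: v for k, v in record.items() if k not in PHI_FIELDS}
    # Replace the real patient ID with a stable-looking pseudonym
    anon["patient_id"] = "SUBJ-" + str(abs(hash(record["patient_id"])) % 10_000)
    return anon

record = {
    "patient_id": "P001",
    "name": "Jane Doe",
    "date_of_birth": "1970-01-01",
    "variant": "BRCA1 c.68_69delAG",   # e.g. drawn from a VCF file
    "rx_code": "RX100",                # e.g. longitudinal Rx data
}

provider_view = role_view(record, "provider")  # full record, including PHI
sponsor_view = role_view(record, "sponsor")    # anonymized view, no PHI
```

Every party queries the same single copy of the data; only the view differs by role, which is the collaboration model the interview describes.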
Q: What challenges do healthcare systems face to be able to use their data for precision healthcare and how can we overcome those challenges?
TC: You would be hard-pressed to find a data scientist who says, "The less data you provide me, the more precise my model will be." Unfortunately, that is the reality for most healthcare systems today. Data is locked in silos with little to no ability to scale to the needs of the system, and IT is paralyzed by the fear of the next ransomware attack. Replacing on-premises data stores with a modern cloud data platform provides unlimited capacity and scalability with unmatched security and governance.
Q: What advice do you have for healthcare systems that want to ensure they are making the best use of their data for efficient operations and improved patient outcomes?
TC: Iterate and collaborate. All too often I see disjointed efforts in which an IT group spends months "waterfalling" all of an organization's data from on-premises stores to a data lake in a cloud provider. What insight or value did operations, financial, or clinical teams gain from that? When IT collaborates with an operations team in a weeks-long, not months-long, effort, each iteration can deliver defined analytical value. For example, the rapid rollout of virtual care tools and processes generates a tremendous amount of raw data. Cloud data platforms ingest this new data without concern for capacity or compute scalability, as both are available on demand. Operational analysts paired with data engineers can then join virtual care and in-person care datasets, deriving analytical value with speed and agility and making a positive impact on patient experience and outcomes.
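The join of virtual and in-person encounter data that analysts and data engineers perform can be sketched in miniature. All field names and values here are invented for illustration; in practice this join would run at scale inside the cloud data platform rather than in application code.

```python
# Minimal sketch: grouping virtual and in-person encounters under a
# shared patient ID so both care modalities can be analyzed together.
# Field names and sample values are hypothetical.

virtual_visits = [
    {"patient_id": "P001", "visit_date": "2021-03-01", "modality": "video"},
    {"patient_id": "P002", "visit_date": "2021-03-02", "modality": "phone"},
]
in_person_visits = [
    {"patient_id": "P001", "visit_date": "2021-04-10", "department": "cardiology"},
]

def join_encounters(virtual, in_person):
    """Collect both encounter types per patient ID, so an analyst can
    compare virtual and in-person utilization side by side."""
    combined = {}
    for v in virtual:
        combined.setdefault(v["patient_id"],
                            {"virtual": [], "in_person": []})["virtual"].append(v)
    for p in in_person:
        combined.setdefault(p["patient_id"],
                            {"virtual": [], "in_person": []})["in_person"].append(p)
    return combined

by_patient = join_encounters(virtual_visits, in_person_visits)
```

Because the platform scales compute on demand, the same join pattern works unchanged whether the inputs are two sample rows or the full history of a health system.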