Americans are increasingly concerned about the state of cybersecurity.
A majority of Americans, 64 percent, have experienced a major data breach, according to a January 2017 Pew Research Center report. Pew Research Center, which surveyed 1,040 adults across the United States, suggests these breaches have eroded Americans' trust in institutions' ability to protect their personal information.
Roughly half of Americans don't trust modern institutions, such as credit card providers and other businesses, to protect their data from unauthorized users, according to the report. A similar proportion say they do not believe the federal government can keep their private information secure.
In the healthcare sector, American consumers aren't wrong to be worried. Last year, the healthcare industry reported 12 million compromised patient records, according to an IBM X-Force report. In 2015, nearly 100 million healthcare records were compromised. The frequency of data compromise has inevitably led some patients to ask: How secure are we?
Sumit Nagpal, co-founder and CEO of health IT company LumiraDx, spoke with Becker's Hospital Review about how healthcare stakeholders can make patients feel secure without downplaying the harsh realities of today's cybersecurity landscape.
Note: Responses have been lightly edited for length and clarity.
Question: About half of adults in the U.S. feel their personal information is less secure than it was five years ago, according to the Pew Research Center report. In what ways is this true, or not true, for the healthcare industry?
Sumit Nagpal: Over the past two years, we've witnessed a palpable increase in reported breaches of personal information, including health information, whether caused by preventable mistakes, inadequate safeguards or targeted malicious attacks. Reports of ransomware attacks are also on the rise.
While some of this increase can be attributed to improved reporting, the tremendous financial value of personal health information, the increasing sophistication of malicious technologies and a potentially more permissive regulatory environment are certainly driving more personal information to become accessible without our consent. We have serious reasons to be concerned.
Q: If patients are worried about the safety of their protected health information, how can health systems assuage their worries and give them peace of mind?
SN: I don't think it's a stretch to say that most health systems have strict policies and procedures, combined with substantial staff and infrastructure targeted at protecting personal health information. They take this responsibility very seriously, and invest more than most industries to secure such data.
Health systems often choose to defer the benefits of innovations (cloud, Internet of Things, bring-your-own-device) that could improve access to data and dramatically reduce operational cost, because they must weigh privacy considerations against timely, actionable information. They have to balance those same privacy considerations against adopting innovations that improve convenience and outcomes while saving cost; against tools that improve patient engagement; and against using health information for research that can save lives, improve outcomes, reduce costs and streamline operations.
The answer lies in continuing to invest in what these organizations are already doing with cybersecurity, combined with more transparency. Health systems can implement immediately accessible audit logs for patients, which can show how [their] data is being used, along with consent mechanisms that allow patients to choose — anytime, anywhere — whether that is OK. These common-sense measures are the key to building more trust and collaboration.
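To make the idea concrete, here is a minimal, hypothetical sketch in Python of the two mechanisms Nagpal describes: an append-only audit log a patient can review at any time, and a consent check performed before each use of a record. The names (ConsentStore, AuditLog, access_record) and structure are illustrative assumptions, not any vendor's or health system's actual implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AccessEvent:
    """One entry in the patient-visible audit trail."""
    patient_id: str
    accessed_by: str   # clinician, researcher, billing system, etc.
    purpose: str       # e.g. "treatment", "research", "operations"
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class ConsentStore:
    """Granular, patient-managed consents keyed by (patient, purpose)."""

    def __init__(self) -> None:
        self._consents: dict[tuple[str, str], bool] = {}

    def set_consent(self, patient_id: str, purpose: str, allowed: bool) -> None:
        # Patients can flip this anytime, anywhere, e.g. from a portal or app.
        self._consents[(patient_id, purpose)] = allowed

    def is_allowed(self, patient_id: str, purpose: str) -> bool:
        # Default to "not allowed" so any new use requires an explicit opt-in.
        return self._consents.get((patient_id, purpose), False)


class AuditLog:
    """Append-only log that is immediately accessible to the patient."""

    def __init__(self) -> None:
        self._events: list[AccessEvent] = []

    def record(self, event: AccessEvent) -> None:
        self._events.append(event)

    def for_patient(self, patient_id: str) -> list[AccessEvent]:
        return [e for e in self._events if e.patient_id == patient_id]


def access_record(patient_id: str, accessed_by: str, purpose: str,
                  consents: ConsentStore, log: AuditLog) -> bool:
    """Check consent first, then log the attempt whether or not it succeeds."""
    allowed = consents.is_allowed(patient_id, purpose)
    log.record(AccessEvent(patient_id, accessed_by, purpose))
    return allowed
```

In practice such a log would need to be tamper-evident and surfaced through an existing patient portal, but the shape of the idea is the same: every use leaves a record the patient can see, and no use proceeds without a stored "yes."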
Q: Healthcare breaches are becoming more common. Organizations, especially those that have been hit by cyberattacks, are increasingly acknowledging that breaches are a reality in the industry. Can health systems ethically reassure patients of their privacy?
SN: This challenge requires health systems to continue to invest in cybersecurity; to require their vendors and service providers to adopt higher standards; and to communicate both the risks and benefits more effectively to patients. Health systems should combine these initiatives with tools that provide transparency, such as audit logs, and choice, such as granular, self-service consents. An ethical path forward requires such commitment and authenticity.
Q: Outside of breaches, another patient privacy concern deals with growing interest in large-scale data analysis (for example, the recent controversy with Google DeepMind and Royal Free London NHS' use of patient data). As data usage becomes more common, how do we make sure patients feel secure?
SN: Here, again, transparency and authenticity, combined with personal choice, are key. As an example, Apple's ResearchKit has redefined how huge quantities of personal data can be mobilized, relatively easily, for research and analytics, as long as we explain the "why" and ask individuals for their permission for each use of their anonymized data.
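As an illustration of that pattern (illustrative only, not Apple's actual ResearchKit API), a per-use permission and de-identification flow might look like the following sketch; the names and fields are assumptions made for the example.

```python
from dataclasses import dataclass


@dataclass
class HealthRecord:
    """Raw, identified data as it lives on the participant's device."""
    name: str
    date_of_birth: str
    heart_rate_samples: list[int]
    step_counts: list[int]


def request_permission(participant: str, study: str, why: str) -> bool:
    """Stand-in for a real consent prompt; the 'why' is explained every time."""
    answer = input(f"{participant}: share data with '{study}'? Purpose: {why} [y/n] ")
    return answer.strip().lower() == "y"


def anonymize(record: HealthRecord) -> dict:
    """Keep only the measurements the study needs; drop direct identifiers."""
    return {
        "heart_rate_samples": record.heart_rate_samples,
        "step_counts": record.step_counts,
    }


def contribute(record: HealthRecord, participant: str, study: str, why: str) -> dict | None:
    """One consent decision per use: nothing leaves without an explicit 'yes.'"""
    if request_permission(participant, study, why):
        return anonymize(record)
    return None
```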
This approach creates a collaboration — a partnership — with individuals who are curious about research findings and supportive of the effort, rather than forever suspicious about intent and resentful of their personal information being used for an unknown entity's financial gain. This suspicion, as we have seen in the example you mentioned and several preceding it, undermines not only research benefits from secondary use, but, more importantly, safe, timely and cost-effective use of personal data for direct care that can save our lives.
Q: The healthcare industry is becoming increasingly excited about new technology — for example, the push to go paperless with EHRs. How can systems stay up-to-date with the rapidly changing IT landscape, while also ensuring their IT capabilities don't get ahead of their IT security?
SN: We've become accustomed to using service and technology providers that keep our personal data — bank records, fitness data, shopping habits, entertainment choices, travel bookings and so much more — in scalable cloud-based infrastructures, connected with our mobile devices and increasingly pervasive gadgets that know more and more about us. We let them capture our faces, our voices, our fingerprints.
We've become used to such innovators constantly using that data to enchant us with their ideas, hoping to retain and increase our engagement — and our purchases — with personalized insights and recommendations, all the time with more ubiquity and convenience. Given these benefits, we choose to "opt in" with little anxiety.
Healthcare data adds significant complexity, since the consequences of data access without our consent can be life changing. This means healthcare organizations are more conservative when adopting technologies that would obviously improve convenience, outcomes and cost. Autonomous driving poses a similar challenge: the benefits are obvious, but the impact of unauthorized access can be devastating. And yet we will be driven around by our cars (some of us already are) in ways that are safer, more relaxing, and more accessible to people with disabilities and to elderly and frail individuals.
What healthcare and such applications have in common is the responsibility of care: to design systems that are secure, test them with an understanding of the consequences, adopt them when they are safe, and do all of this in an increasingly responsive, agile fashion.
Healthcare organizations are increasingly encouraging their suppliers to move forward at pace, enabling new technologies to be adopted more rapidly, but in a manner that ensures, and perhaps increases, safety and security. This push and pull between healthcare organizations and their suppliers will determine the pace at which we all get to benefit from such technologies.