The July theft of several laptops from the offices of a large health system in Illinois represents the second-largest breach ever of electronic protected health information. The names, addresses, Social Security numbers and a host of medical and insurance information belonging to more than 4 million patients were reportedly compromised. Prior to this breach, the total number of affected patient records reported to HHS stood at just over 20 million, so this one breach alone represents a roughly 20 percent increase in that total.
The target, Downers Grove, Ill.-based Advocate Health Care, says the workstations were not encrypted; post-breach remediation measures now include encryption and 24/7 physical security of the areas from which the computers were stolen.
There's only one word to describe the response: Dumb! Encryption, more training, more policy manuals and security cameras are not the answer. Given sufficient time and resources, even encrypted laptops and hard drives can eventually be compromised.
The ePHI shouldn't be on those machines in the first place; it should live in a secure data center or be hosted by a medical-grade cloud services provider.
Why was ePHI stored on those machines? Two very simple reasons:
1. Because users can. The IT system design allows it.
2. Because users feel they need to in order to get their jobs done.
The typical hospital IT system is frequently slow to access and hard to use, so once users log in and get access to the data, they feel a need to download it so they can work on it with less overhead. (Sometimes they even download it to a portable USB drive, or email files to themselves so they can work on them at home!)
If you really analyze this situation to its root cause, you will find that no amount of employee training or policy memos will solve this problem. (This is probably why so many of these entities that report breaches turn out to be repeat offenders.) Only when IT architecture is properly designed will this problem begin to disappear.
There are two primary changes that are needed.
First, the IT systems should not allow data to be stored on local devices like workstations and laptops in the first place. The downside of the PC (as in personal computer, which also includes laptops) is that users don't think twice about downloading massive amounts of ePHI to run reports or do query analyses. Common user-based tools such as Excel, Access and Crystal Reports let users slice and dice healthcare data to their hearts' content. Properly designed IT infrastructure that virtualizes desktops and uses thin- or zero-client devices at users' desks is the only way to prevent users from storing data locally. In this environment, the user's "desktop" (including the C: drive) actually lives on a secure server, not on the local device. If a thin- or zero-client device is stolen, it contains no local data. And the user's software also runs on a server, where more horsepower is usually available to improve performance.
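To make this concrete, here is a minimal sketch of one small endpoint control that supports the approach described above: disabling the Windows USB mass-storage driver so data can't walk out on a thumb drive. It assumes a Windows workstation and administrator rights, uses the standard USBSTOR service start-type registry setting, and is only one building block of an enforcement strategy, not the full virtual-desktop design.

```python
"""
Illustrative sketch only: disables the Windows USB mass-storage driver so
users cannot copy data to thumb drives on a managed endpoint. This is one
small endpoint control, not the full thin-/zero-client architecture
described above. Assumes Windows and administrator rights.
"""
import winreg

USBSTOR_KEY = r"SYSTEM\CurrentControlSet\Services\USBSTOR"
DISABLED = 4   # service start type 4 = disabled
ON_DEMAND = 3  # service start type 3 = load on demand (the default)

def set_usb_storage(enabled: bool) -> None:
    """Flip the USB mass-storage service start type in the registry."""
    with winreg.OpenKey(
        winreg.HKEY_LOCAL_MACHINE, USBSTOR_KEY, 0, winreg.KEY_SET_VALUE
    ) as key:
        winreg.SetValueEx(
            key, "Start", 0, winreg.REG_DWORD,
            ON_DEMAND if enabled else DISABLED,
        )

if __name__ == "__main__":
    set_usb_storage(enabled=False)  # block thumb drives on this workstation
    print("USB mass storage disabled; replug or reboot for it to take effect.")
```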
The other required component is to make the IT systems easy and fast to use, eliminating the slowness associated with logging in, logging out and running programs. Unfortunately, the usual response to these types of breaches is to tighten user security even further (e.g., "Memo to all employees: User passwords will now be 14 alpha/numeric/symbolic/Greek characters, they must be changed every day, and auto-logoff times will be decreased to 2 minutes."). Single sign-on solutions and improved two-factor authentication systems based on physical hospital-issued ID badges are readily available today; they let users log in and out of all their clinical and business applications in a matter of seconds, with all their work saved on a secure server.
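For illustration only, here is a rough sketch of the badge-plus-PIN idea: the badge tap identifies the user, a short PIN supplies the second factor, and the session lives on the server so work follows the user from device to device. Every name and data structure below is hypothetical; this is not any particular vendor's API, and a real deployment would sit on top of a directory service and a proper credential store rather than a plain hash table.

```python
"""
Hypothetical sketch of badge-tap sign-on with a PIN as the second factor.
Sessions are held server-side, so nothing lands on the endpoint.
"""
import hashlib
import secrets
from typing import Optional

# Hypothetical directory mapping badge serial numbers to users and hashed PINs.
BADGE_DIRECTORY = {
    "0451-7782": {"user": "rjones_rn",
                  "pin_hash": hashlib.sha256(b"8321").hexdigest()},
}

# Server-side sessions: token -> username.
ACTIVE_SESSIONS = {}

def badge_sign_on(badge_serial: str, pin: str) -> Optional[str]:
    """Return a server-side session token if the badge and PIN check out."""
    record = BADGE_DIRECTORY.get(badge_serial)
    if record is None:
        return None  # unknown badge
    if hashlib.sha256(pin.encode()).hexdigest() != record["pin_hash"]:
        return None  # wrong second factor
    token = secrets.token_hex(16)
    ACTIVE_SESSIONS[token] = record["user"]
    return token

def badge_sign_off(token: str) -> None:
    """Tap-out: the session (and the user's work) stays on the server."""
    ACTIVE_SESSIONS.pop(token, None)

if __name__ == "__main__":
    token = badge_sign_on("0451-7782", "8321")
    print("signed on" if token else "denied")
    if token:
        badge_sign_off(token)
```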
It is clear from the reported breaches that the old-fashioned client/server model, with traditional PCs and laptops employing common user-driven security, just doesn't cut it anymore. No amount of training, policy memos, encryption, security cameras and more onerous user/password schemes will do the trick. It requires a fundamental shift in the way healthcare IT services are designed and delivered.
All the technology tools are readily available, but IT departments and finance managers are afraid of the perceived cost. The irony is that with shorter login times, less expensive user devices and other cost savings, these systems actually pay for themselves in a year or so. And they make clinicians and other users happy and able to work more efficiently.
The cost of continuing the status quo? At an estimated $200-$1,000 per patient record in a breach situation, this organization is facing an $800 million to $4 billion problem, not to mention the loss of productivity and morale among its employees, plus a significantly tarnished public image.
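For the record, that range is simply the per-record estimate multiplied across roughly 4 million records:

```python
# Back-of-the-envelope check of the exposure figures quoted above.
records = 4_000_000               # approximate number of patient records breached
cost_low, cost_high = 200, 1_000  # estimated cost per record in a breach

print(f"Low estimate:  ${records * cost_low:,}")   # $800,000,000
print(f"High estimate: ${records * cost_high:,}")  # $4,000,000,000
```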
Dr. Marion K. Jenkins is executive vice president-healthcare at 3t Systems and an adjunct faculty member at the University of Denver.