Five Ways Healthcare Technology Vendors Put Their Customers’ PHI at Risk

Author: Chris Bowen
Chief Privacy and Security Officer and Founder, CISSP, CIPP/US, CIPT
ClearDATA

A warning to technology vendors that serve the healthcare industry: nearly half of serious data breaches occur in the healthcare sector, and the majority are caused by a third party.

There are five common ways technology vendors set themselves up—and their healthcare customers—for a data breach that could be catastrophic to patients’ privacy and the vendor’s reputation.

#1: Failure to Assess Risk

The HIPAA Security Rule requires certain organizations, known as covered entities and business associates, to perform risk assessments regularly. Yet 33 percent of them never have, a gap that contributes directly to the rate of healthcare data breaches.

Businesses that skip this assessment typically struggle to find enough staff time to take it on. And no doubt, it’s intensive work. But not having enough staff on hand to perform it won’t spare your organization from litigation, fines, remediation and restitution, which can reach into the millions of dollars if a data breach is traced back to you.

Action: Implement and stick to a risk assessment policy that includes a periodic review of data inventories and critical assets; administrative, physical and technical safeguards; and regular re-evaluations of risk to protected health information.

#2: Unaware of System Activity

Given that many breaches aren’t discovered until months later, too many organizations are in the dark about threat attempts. In one of the most notorious examples, while the Anthem breach of 80 million records wasn’t announced to the public until February 2015, subsequent forensics traced the beginning of the breach to April 2014. That’s 11 months of covert activity. Such delays are the norm rather than the exception, with research showing that only 5 percent of breaches are discovered within three months of entry.

Action: Enable continuous logging; keep these logs protected; and perform regular system activity reviews—an essential component of risk management.
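One way to keep logs "protected" in the sense above is to make them tamper-evident. The sketch below, a minimal illustration in Python, chains each log entry to the hash of the previous one so any after-the-fact edit breaks the chain; the entry fields (actor, action, resource) are illustrative assumptions, not a mandated audit schema.

```python
import hashlib
import json
import time

def append_event(log, actor, action, resource):
    """Append a system-activity event, chained to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {
        "ts": time.time(),
        "actor": actor,
        "action": action,
        "resource": resource,
        "prev": prev_hash,
    }
    # Hash the entry body (everything except the hash itself).
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    log.append(entry)
    return entry

def verify_chain(log):
    """Recompute every hash; return False if any entry was altered."""
    prev = "0" * 64
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["prev"] != prev:
            return False
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```

In practice the chain would be shipped to append-only storage off the monitored host, but even this in-memory version makes a regular system activity review verifiable rather than taken on faith.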

#3: Patching Fail

Failure to keep up to date with patches and firmware has led to previous breaches, including one at Anchorage Community Mental Health Services (ACMHS) that resulted in a $150,000 fine. Security patches should be applied as soon as they are released, but frequently they aren’t. Of course, patches are sometimes faulty: while ACMHS was found negligent, its fine was issued on the heels of a year of patching woes for most Microsoft customers. It’s a balancing act between dealing with the fallout of a botched patch and waiting for one that’s error-free.

Action: Document your decisions of what and when to patch, but don’t stop progress for what could be a long wait for perfection.
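The "document your decisions" step can be as lightweight as a dated registry of what was patched, deferred, and why. A minimal sketch, with illustrative field names rather than any prescribed format:

```python
import datetime

def record_patch_decision(registry, system, patch_id, decision, rationale):
    """Append one dated, auditable patch decision to the registry."""
    entry = {
        "date": datetime.date.today().isoformat(),
        "system": system,
        "patch": patch_id,
        "decision": decision,      # e.g. "apply-now" or "defer"
        "rationale": rationale,
    }
    registry.append(entry)
    return entry

def open_deferrals(registry):
    """List deferred patches still awaiting a follow-up review, so a
    wait for an error-free patch never becomes indefinite."""
    return [e for e in registry if e["decision"] == "defer"]
```

Reviewing the open deferrals on a fixed schedule is what keeps "waiting for a fixed patch" from quietly becoming "never patched."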

#4: Lack of Proper Training

General security awareness training is a HIPAA requirement. But what about training in secure development practices? Or training on your software development lifecycle? Healthcare technology vendors should make both part of mandatory training. As web-based solutions increasingly perform or support hospital workflows, it is essential that vendors understand the potential security risks of the applications they build. Vulnerabilities abound: unvalidated parameters, broken access controls, cross-site scripting flaws, insecure use of cryptography and more.

Action: Check out the Open Web Application Security Project (OWASP) at www.owasp.org. This is a great resource for third parties that support or deploy software to learn about the top security issues in web apps and services.
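Two of the vulnerability classes named above, unvalidated parameters and cross-site scripting, have well-known defenses: allow-list validation of inputs and escaping of user-supplied text before rendering. A minimal Python sketch using only the standard library (the parameter names and expected ID format are illustrative assumptions):

```python
import html
import re

# Allow-list: a patient ID in this hypothetical app is 1-10 digits, nothing else.
PATIENT_ID = re.compile(r"[0-9]{1,10}")

def validate_patient_id(raw):
    """Reject any request parameter that does not match the expected shape."""
    if not PATIENT_ID.fullmatch(raw):
        raise ValueError("invalid patient id")
    return raw

def render_comment(user_text):
    """HTML-escape user input so injected script tags render as inert text."""
    return "<p>" + html.escape(user_text) + "</p>"
```

Validating against what input *should* look like, rather than filtering known-bad strings, is the pattern OWASP recommends, because a deny-list is only as good as the attacks you thought of.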

#5: Failure to Manage Change

Change control is the process of managing changes to an organization’s environment and assessing their potential impact on the business. Yet only a little over half of organizations apply it to their information technology assets and business processes. Now consider that 80 percent of unplanned outages are caused by ill-planned changes made by operations staff or developers, and that the average cost of downtime is around $8,000 per minute.

Action: Study the IT Process Institute’s Visible Ops Handbook. It gives vendors insight into several change-related security measures: restricting access to systems that can be changed; keeping detailed information about all IT assets and building a RACI (responsible, accountable, consulted, informed) matrix; creating a repeatable build library; and making continual improvement part of the vendor’s IT culture.
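At its simplest, change control means no change ships without a recorded owner and an approval. A minimal sketch of a change record with a RACI assignment, assuming a toy in-memory workflow (all field names are illustrative):

```python
def open_change(changes, summary, raci):
    """Register a proposed change; it cannot ship until approved."""
    required = {"responsible", "accountable", "consulted", "informed"}
    if set(raci) != required:
        raise ValueError("every RACI role must be assigned")
    change = {
        "id": len(changes) + 1,
        "summary": summary,
        "raci": raci,
        "status": "proposed",
    }
    changes.append(change)
    return change

def approve(change, approver):
    """Only the accountable party may approve, per the RACI matrix."""
    if approver != change["raci"]["accountable"]:
        raise PermissionError("only the accountable owner can approve")
    change["status"] = "approved"
    return change
```

Even this toy version enforces the two properties that prevent most ill-planned changes: every change has a named accountable owner, and nothing moves to production by default.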

Circling back to that risk assessment: once it’s performed, any vulnerabilities it reveals should be fixed quickly, before the bad guys exploit them. Yet only 5 percent of organizations affected by a breach resolve the underlying security issue within a month. For the vast majority, resolution takes months or more than a year, or never happens at all.

Third-Party Attacks on the Rise

42% of serious data breaches in 2014 were in the healthcare sector

  • January 2014 – Blue Cross Blue Shield of New Jersey: Loss of data affecting 839,711 individuals. A laptop was stolen – there was no encryption.
  • May 2014 – Sutherland Healthcare Solutions: Thieves stole eight computers from Sutherland’s Torrance, Calif., office, getting away with the medical records of 342,197 individuals. There was no encryption.
  • August 2014 – Community Health Pro-Services Corporation: Unauthorized access. In a legal dispute with Texas HHS, Xerox removed patient records from servers and hard drives and permitted other parties to view the records of 2,000,000 individuals.
  • February 2015 – Anthem: 80,000,000 records stolen via hacking. Attackers registered a bogus domain name, “we11point.com,” and used malware mimicking Citrix VPN software to harvest user credentials.

Originally published by Health IT News on May 5th, 2015.