Final HIPAA Security Regulations and EHRs

Note: this article was originally published in Maryland Physician Magazine in its May/June 2013 issue.

The HiTech Act in 2009 set in motion a series of changes to the HIPAA rules that govern the use, disclosure and protection of protected health information (“PHI”).  The Department of Health and Human Services (“HHS”) subsequently issued interim regulations in response to these changes in the law, and this year issued a final regulation, effective March 26, 2013, that requires compliance by covered entities and business associates within 180 days.  These final HIPAA security regulations make a number of important changes which may impact your relationship with vendors that provide you with electronic health record (“EHR”) licensing and support.

First, prior to HiTech, business associates of covered entities were not required to comply with the security rules and standards set forth in the HIPAA security regulations.  HiTech changed the applicability of the security regulations to include business associates.  The final regulation from HHS implements this provision of the HiTech Act, but with a twist: subcontractors to business associates are also defined as business associates within the final regulation.  As a result, EHR vendors and their subcontractors must fully comply with the HIPAA security rules, not just with “reasonable” security measures.

Second, prior to HiTech, there was no federal requirement that a covered entity or business associate report a security breach that resulted in the disclosure of PHI.  HHS subsequently issued interim regulations to implement these notification requirements, and as of March 26, 2013, HHS issued final regulations that alter the assumptions and exceptions to what constitutes a “breach” under HIPAA.  In addition, business associates and subcontractors are now obligated to report security breaches to covered entities.

Providers at the beginning of their search for an EHR vendor should have an attorney review any proposed contract between their organization and the vendor to ensure that the business associate provisions comply with the final regulations.  If you already have an existing relationship, work with your attorney to ensure that the contract in place complies with the final regulatory requirements.  All business associate agreements must come into compliance with the final regulations by September 2014.

In recent years, some EHR vendors have moved to “cloud”-based data storage and access solutions for their clients.  These cloud systems are designed so that provider data collected by the EHR is stored at a remote data center, and made available to the provider over an internet connection.  Some EHR vendors subcontract with a third party to provide the cloud data storage.  More likely than not, that subcontractor is now a business associate under the final regulations and takes on the same obligations as the EHR vendor with regard to your data.  The final regulations require that a covered entity’s contract with its business associate require subcontractor compliance with the final security regulations.

Beyond compliance issues, providers will want to evaluate whether an EHR vendor that hosts your data in the “cloud” has really made sufficient provisions for security.  Such an evaluation makes good business sense because of the incredibly negative consequences of any security breach that results in a loss of PHI for a health care provider.  For example, does the vendor comply with a recognized, national security standard (like NIST)?  Is the EHR vendor, or the data center it uses for storing your data, audited against a SAS standard like SAS-70?  What are the security practices and security devices in place at the EHR vendor to protect your data?  If the vendor will host your data, what are its disaster recovery and data backup procedures?  Are those procedures regularly tested?

Providers and their counsel should also evaluate what, if any, additional provisions should be negotiated into any final agreement with the EHR vendor concerning the vendor’s compliance with a security standard, commitment to security procedures, and related obligations (such as maintaining appropriate network perimeter security and/or appropriate encryption for data during its transmission).

The changes in HIPAA compliance mean that providers cannot simply treat EHR vendors as a “black box” into which providers place PHI, and rely on the EHR vendor’s representations that they know best regarding security.  In addition, because HIPAA’s scope now extends beyond covered entities and business associates to most subcontractors of business associates that handle PHI, more entities are at risk for substantial fines for failing to comply with the applicable security standards.  All providers should work with their counsel to analyze and address compliance with the final regulations.

Reported PHI Breaches

The Department of Health and Human Services (“HHS”) maintains an online list of covered entities and business associates that have experienced PHI breaches where more than 500 individual patient records were involved.  As of the writing of this post, a total of 572 reported breaches are listed on this website.  What can we learn from this information?

First, the dataset covers breaches reported from September 2009 through February 2013.  A total of more than 21 million patient records are listed on this report (though it is likely there is some duplication of patient records between the data breaches reported here).  Together, these incidents involve fewer records than the single data loss reported by the Department of Veterans Affairs in 2006, when a laptop containing in excess of 26 million records was stolen from an employee’s home.  Nonetheless, a significant amount of PHI has been lost or stolen and reported to HHS over the last three and a half years.

Second, the most common scenario for PHI breaches is the loss of backup tapes, followed by their theft.  Almost 6 million patient records were affected by this kind of data loss.  The theft or loss of a laptop came in fourth, affecting about 2.3 million patient records.  Theft generally accounted for more than one third of all records compromised, followed by loss (which probably includes scenarios like “we accidentally put the backup tapes in the dumpster” or “the tape fell out of my bag between the office and my car”), also accounting for about one third of all records compromised.  Hacking appears farther down the list, affecting a total of 1.3 million patient records.

Third, in terms of patient records breached, a little more than half of the data breaches appear to involve a business associate of a covered entity.  However, only 92 of the 572 data breaches note a business associate’s involvement, which tends to suggest that when a business associate is involved, more records on average are affected by the data breach.  This is consistent with the expectation that technology vendors, like those that implement and/or host electronic health records, often do so for many clients and so present a bigger target for data theft, hacking, and computer viruses.
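The back-of-the-envelope arithmetic behind that observation can be sketched as follows. Note that the post does not break out the exact record totals by category, so the even split of records between business-associate and other breaches is an illustrative assumption:

```python
# Rough, illustrative arithmetic from the figures cited above: 572 reported
# breaches, ~21 million records, and 92 breaches noting a business associate.
# The even record split below is an assumption for illustration only
# ("a little more than half" of records involved a business associate).

total_breaches = 572
total_records = 21_000_000
ba_breaches = 92
ba_records = total_records // 2                 # assumed 50/50 split

non_ba_breaches = total_breaches - ba_breaches  # 480 breaches
non_ba_records = total_records - ba_records

avg_ba = ba_records / ba_breaches               # records per BA breach
avg_non_ba = non_ba_records / non_ba_breaches   # records per non-BA breach

print(f"Average records per breach involving a BA: {avg_ba:,.0f}")
print(f"Average records per breach without a BA:  {avg_non_ba:,.0f}")
```

Even under this rough assumption, the average business-associate breach involves several times more records than the average breach without one, which is the point the paragraph above draws from the HHS data.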

With the change in breach notification in the final HIPAA regulations recently issued by HHS, it will be interesting to see whether more breach notifications are published to HHS’s website.

Changes in HIPAA Breach Notification Rule

HHS recently released the final regulations that revise certain provisions of HIPAA, including the HIPAA breach notification rule.  Congress, in enacting the HiTech Act in 2009, included a statutory requirement that covered entities report breaches that involved the unauthorized access or loss of protected health information (“PHI”).  HHS then promulgated an interim rule to implement this statutory provision.  That interim rule required reporting of the breach under the “significant risk of financial, reputational or other harm” standard.  Criticism was subsequently leveled at this standard as being too subjective.  HHS just recently issued its final rule (effective on March 26, 2013) that changes the breach reporting rule in two ways.

First, if there is a breach that involves PHI, and the breach does not fall within a regulatory exception, the regulation presumes that the breach must be reported.  This means that a party that experiences a loss of PHI cannot avoid notification simply by asserting that the loss was unlikely to cause significant harm to the patients.

Second, the final regulation replaces the interim rule’s standard with a requirement that the party who experienced the loss must demonstrate that there is a low probability that the PHI has been compromised.  In order to qualify under this new standard, the party must perform a risk assessment, taking into account at least the four factors outlined in the regulation.  These factors are found in § 164.402(2):

(i) The nature and extent of the protected health information involved, including the types of identifiers and the likelihood of re-identification;

(ii) The unauthorized person who used the protected health information or to whom the disclosure was made;

(iii) Whether the protected health information was actually acquired or viewed; and

(iv) The extent to which the risk to the protected health information has been mitigated.
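The regulation requires a documented risk assessment of at least these four factors, but prescribes no particular scoring method. Purely as an illustrative sketch, a covered entity's assessment worksheet might be structured like this (the factor names track § 164.402(2); the 0 to 3 scoring scale and the "low probability" threshold are hypothetical conventions, not part of the rule):

```python
# Hypothetical worksheet for the four-factor breach risk assessment under
# 45 CFR § 164.402(2). The 0-3 scale and the threshold below are
# illustrative conventions only; the regulation requires a documented
# assessment, not any particular arithmetic.

FACTORS = [
    "nature_and_extent_of_phi",      # (i) identifiers, re-identification risk
    "unauthorized_recipient",        # (ii) who used or received the PHI
    "actually_acquired_or_viewed",   # (iii) was the PHI actually accessed
    "extent_of_mitigation",          # (iv) encryption, assurances, recovery
]

def assess(scores: dict) -> str:
    """Each factor scored 0 (no risk) to 3 (high risk)."""
    missing = [f for f in FACTORS if f not in scores]
    if missing:
        raise ValueError(f"All four factors must be assessed: {missing}")
    # Presumption of breach: notify unless the assessment demonstrates a
    # low probability that the PHI has been compromised.
    if all(scores[f] <= 1 for f in FACTORS):
        return "document low probability; notification not required"
    return "presumed breach; notification required"

# A lost, unencrypted backup tape of a full patient database:
print(assess({
    "nature_and_extent_of_phi": 3,    # full database with identifiers
    "unauthorized_recipient": 2,      # unknown who, if anyone, has the tape
    "actually_acquired_or_viewed": 1, # no evidence of actual access
    "extent_of_mitigation": 3,        # no encryption, no locked container
}))
```

Whatever form the worksheet takes, the structural point holds: the burden is on the covered entity or business associate to document the analysis, and an incomplete assessment defaults to notification.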

So, let’s evaluate some typical hypothetical scenarios that involve the loss of PHI.  The most common reported PHI breach involves data backup tapes that are lost.  By design, a data backup tape usually contains the entire database of patient records, because this entire dataset would normally be required to restore the data from the backup.

Under the first factor, such a loss would militate towards breach notification, because the dataset would almost certainly include patient identifiers and, if the backup was of an electronic health record, extensive health information on each patient.  Under the second factor, if the tape was merely lost, there is no determination of who might have had unauthorized access to the PHI.  If, for example, the backup tape was just simply lost by a contractor that stores the backup tapes in a vault for retrieval on demand, this factor might lean towards not making a notification.  On the other hand, if the tape was in the trunk of the network administrator’s car, and the car was stolen, this factor might lean towards making a notification.

As to the third factor, a lost data tape alone, without more information, would not inform us whether the data was actually acquired by anyone, or viewed by someone.  There is certainly the potential that a lost tape could be viewed, assuming that the person that obtained it had access to a compatible tape drive.  But based on what we know, this factor is probably neutral.

As to the fourth factor, the question here is whether the backup tape itself was encrypted, or was stored in a locked storage box.  A tape that is encrypted is much harder to access, even if the tape was intentionally stolen to obtain unauthorized access to PHI.  A tape in a locked storage box that was merely lost may be less likely to be accessed by an unauthorized user.  So this factor may swing either way based on what, if any, mitigations were in place to protect the data on the backup tape.

If we assumed that no mitigations were in place, the overall analysis would lean towards breach notification under the new rule.  As you can see, however, the facts and circumstances matter greatly in evaluating whether a breach has occurred that requires notification.

Changes in HIPAA Compliance

The HiTech Act set in motion a series of changes to Health Insurance Portability and Accountability Act (“HIPAA”) compliance for covered entities and business associates in 2009, which were followed by interim regulations issued by the Department of Health and Human Services (“HHS”).  HHS has issued a final regulation that goes into effect on March 26, 2013, and requires compliance within 180 days by all covered entities and business associates.

The HiTech Act made a number of important changes to the law governing the security and disclosure of protected health information.  First, prior to HiTech, business associates of covered entities were not required to comply with the security rules and standards set forth in the HIPAA security regulations.  HiTech changed the applicability of the security regulations to include business associates.  The final regulation from HHS implements this provision of the HiTech Act.

Second, prior to HiTech, there was no federal requirement that a covered entity or business associate report a security breach that resulted in the disclosure of protected health information (“PHI”).  HHS subsequently issued interim regulations to implement these notification requirements, and as of March 26, 2013, HHS issued final regulations that alter the assumptions and exceptions to what constitutes a “breach” under HIPAA.

Business Associates Are Covered Entities When It Comes to PHI

HiTech initially changed the law governing PHI by requiring that business associates comply with the same security regulations that govern covered entities.  The final regulations from HHS clarify which security rules also apply to business associates under sections 164.104 and 164.106, including those applicable rules found in Parts 160 and 162.  However, HHS also expanded the definition of “business associate” to include subcontractors of business associates that handle PHI on behalf of the business associate for the covered entity.  The regulation does provide certain narrow exceptions to who is now covered by the definition of a “business associate,” including an exception for “conduits” of PHI that may, on a transitory basis, transmit PHI but would not access the PHI except on a random or infrequent basis.  But the regulation appears generally to further expand the legal responsibilities, and potential liability, of members of the industry that work even indirectly for covered entities.

For existing health care providers, now might be the time to revisit your business associate agreement with your business associates, such as your EHR vendors.  Section 164.314 establishes certain requirements for these agreements, including provisions that all business associates comply with the full security rule, that subcontractors to business associates also comply with the full security rule, and that business associates provide the covered entity with security incident reporting in the event of a breach at the business associate’s or subcontractor’s facility or systems.

Changes in Security Breach and Notification

HiTech also introduced a breach notification provision which was intended to require covered entities to report to HHS, and where appropriate, to patients affected by a security breach involving their PHI.  The final regulations have modified the definition of a “breach” by establishing the assumption that an unauthorized access of PHI is a breach unless it can be demonstrated by the covered entity or business associate that there is a low probability that the PHI has been compromised.

Such a demonstration requires that the covered entity or business associate conduct a risk assessment and evaluate at a minimum the four factors described in the regulation: “(i) the nature and extent of the protected health information involved, including the types of identifiers and the likelihood of re-identification, (ii) the unauthorized person who used the protected health information or to whom the disclosure was made, (iii) whether the protected health information was actually acquired or viewed, and (iv) the extent to which the risk to the protected health information has been mitigated.”

Altering the burden and requiring a covered entity or business associate to engage in this risk assessment is likely to increase the number of breach notifications required under the final regulation.

The final regulation includes a variety of other changes in requirements for covered entities and business associates not discussed in this article, such as sale and marketing of PHI, use of genetic information for insurance underwriting, notices to patients of privacy practices, and disclosure of PHI to friends and families of decedents.  Providers should promptly examine their privacy and security policies to ensure compliance with the final regulations.

Data Breach: No Joke

As recently noted by the New York Times in this article, health data for nearly 11 million people has been inadvertently disclosed in violation of patient privacy.  Electronic health record systems alone are not to blame, as readers will note that the improper disposal of paper medical records in dumpsters has happened more than once (from 2009-2010, the HHS website notes 23 reports of data breaches that in one way or another exposed 500 or more paper patient records).  However, computer databases make it easier to disclose larger amounts of health data than in the paper records days of yore.  As a part of the American Recovery and Reinvestment Act of 2009, Congress enacted federal reporting requirements in the event of a data breach by a covered entity.  For the entire law, click here: ARRA Enrolled Bill.

Section 13402 provides the statutory basis for requiring a covered entity to report to the Secretary of Health and Human Services when the security of protected health information is breached.  Both individual notice to the persons affected by the data breach and public notification via the local media are required when more than 500 individuals’ information has been lost due to a breach.  In addition, the covered entity is required to advise the Secretary in the event of a breach affecting more than 500 individuals (if fewer than that, the entity can keep a log and submit it at the end of the year).
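The tiered notice scheme described above can be summarized as a small decision function. This is a sketch of the statutory structure, not legal advice, and the function and string labels are illustrative conventions rather than anything drawn from the statute's text:

```python
# Illustrative summary of the tiered breach notice scheme under Section
# 13402 of the HiTech Act. Names and phrasing are hypothetical conventions.

def notice_obligations(individuals_affected: int) -> list:
    """Return the notice obligations triggered by a breach of a given size."""
    # Individual notice to each affected person is always required.
    obligations = ["individual notice to each affected person"]
    if individuals_affected > 500:
        # Large breaches also trigger media notice and immediate notice
        # to the Secretary of Health and Human Services.
        obligations.append("notice to prominent local media")
        obligations.append("immediate notice to the HHS Secretary")
    else:
        # Smaller breaches may be logged and reported annually.
        obligations.append("log the breach; submit annual log to the Secretary")
    return obligations

print(notice_obligations(1200))
print(notice_obligations(37))
```

The 500-individual threshold is also what determines whether a breach appears on the public HHS list discussed earlier in this article.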

Patients may suffer identity theft and public embarrassment when their health information is lost by a covered entity.  And, if the breach is substantial enough, the covered entity may lose patients and clinical revenue as a result.  Health care providers can reduce the possibility of such data losses by having strong policies and internal database controls that limit access and portability of data by its employees and contractors.  Unfortunately, the problem of data loss (whether by accident or because of hacking) appears not to be improving, in spite of a number of sentinel events in the last few years, including the loss of a laptop with health data on over 20 million veterans served by the Veterans Administration.

Preparing for Disasters – Practical Preparedness

Disasters happen in the world, and some may directly affect your organization.  Preparing for disasters, whether they be hurricanes, tornadoes, terrorists, hackers, power outages, fires, or earthquakes, means thinking about (a) how your business operates today and (b) how your business would likely operate in the event of a disaster, and then (c) developing some kind of testable plan for recovering from a variety of disasters that is practical but well-designed.  Preparedness is also a commitment to ongoing planning and the investment of a certain amount of resources each budget period, because your plan will evolve with the extent and scope of your business as it changes over time.

In Maryland, there are no specific ethics rules that require lawyers to prepare for disasters, though common sense would tell an attorney that missing a deadline because of a disaster is still a missed deadline, and the loss or inadvertent disclosure of confidential client information is still a loss whether caused by a natural disaster or simple human error.  Both circumstances can lead to an ethics complaint from a disgruntled client.  For attorneys, there are a number of resources available from the ABA to help firms do a better job of preparing for a disaster.

Doctors’ offices that are joining the electronic health record revolution because of the incentives under ARRA will also need to have a plan for disaster recovery.  The HIPAA security regulations include standards for disaster preparation and recovery (45 CFR § 164.308(a)(7) is addressed specifically to contingency planning for covered entities and business associates).  The security regulations are cloaked in terms of “reasonableness,” which means that a covered entity’s disaster recovery planning efforts should be commensurate with the amount of data and resources it has.  So, a practice of two physicians that sees 8,000 patient visits a year is not expected to have its data available in three DR hot sites.  But, if you are a major insurance carrier, three DR hot sites might not be enough for your operation.  In neither case, however, is no plan an acceptable answer.  Nor is a plan that has never been tested.

Risk Assessment

So where do you start?  The logical starting point is a risk assessment of your existing systems and infrastructure (also required of covered entities under the HIPAA security rules in section 164.308(a)(1)).  A risk assessment will guide you through gathering an inventory of your existing systems, and help to identify known and potentially unknown risks, along with the likelihood that such a risk will be realized and what you are doing now (if anything) to mitigate that risk.  The risk assessment will also help you to categorize how critical a system is to your operations, and will also identify severe risks that remain unmitigated.  This resulting list helps you to come up with a starting place for the next step: doing something about it.
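One lightweight way to capture the output of such an assessment is a simple risk register. The structure below is an illustrative sketch: the field names and the likelihood/impact scales are assumed conventions, not anything mandated by the security rule, and the sample entries are hypothetical:

```python
# Illustrative risk register for a small practice's risk assessment under
# 45 CFR § 164.308(a)(1). Scales and field names are assumed conventions.

from dataclasses import dataclass

@dataclass
class Risk:
    system: str        # inventoried system the risk applies to
    threat: str        # the identified risk
    likelihood: int    # 1 (rare) to 5 (frequent)
    impact: int        # 1 (minor) to 5 (severe)
    mitigation: str    # what is being done now, if anything
    critical: bool     # is the system critical to operations?

    @property
    def score(self) -> int:
        # Simple likelihood-times-impact ranking.
        return self.likelihood * self.impact

register = [
    Risk("EHR server", "virus outbreak", 3, 4, "anti-virus software", True),
    Risk("EHR server", "database crash", 3, 4, "weekly tape backup", True),
    Risk("billing PC", "hardware failure", 2, 2, "none", False),
]

# Severe risks that remain unmitigated rise to the top of the to-do list.
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    flag = " <-- unmitigated" if risk.mitigation == "none" else ""
    print(f"{risk.score:>2}  {risk.system}: {risk.threat}{flag}")
```

Sorting the register by score produces exactly the "starting place" described above: a ranked list of what to address first.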

The Disaster Plan

In parallel, you can also use the inventory of your existing systems and risks to develop a disaster recovery plan.  First, you now have a list of your critical systems which are your highest priority to recover in the event of a failure.  Second, you also have a list of likely risks to those systems with the likelihood based in part on your past experience with a particular disaster.  These lists help you to identify what you need to protect and what you need to protect from.  The other two questions you need to ask for each system are: (a) how much data can I stand to lose in the event of a disaster? and (b) how long can I wait to have my system restored to normal operations?

This analysis of your existing systems, risks, and business requirements will help lead the practice to a plan that includes procedures for how to function when systems are unavailable, and how to go about restoring an unavailable system within the business requirements of the practice.  Once you have your plan, and have implemented the systems or policies required by the plan, your next step is to test the plan.  Table top exercises allow you, in a conference room, to walk through the staffing, procedures, and possible issues that may arise as a result of a particular disaster scenario.  Technical testing permits your IT staff to make sure that a disaster recovery system works according to the expected technical outcomes.  Full-blown testing actually simulates a disaster, perhaps during non-business hours, running through the disaster plan’s procedures for operations and IT.

Hypothetical

As an example, suppose that you have an electronic health record system.  This is a critical system based on the risk assessment.  In the last five years, you have had a virus that partially disabled your records system causing an outage for two business days, and you have had your database crash, causing you to lose a week’s worth of data.  You have implemented two mitigations.  The first is anti-virus software that regularly updates for definitions and regularly scans the system for viruses and removes them.  The second is a backup system that makes a backup of your system’s data on a weekly basis and stores the data in a separate storage system.

Based on interviews with the practice staff and owner, the records system is used as a part of patient care.  During normal business hours, an outage of the system can result in patients being re-scheduled, and also creates double work to document kept visits on paper and again in the record system when it becomes available.  The practice has indicated that the most it can be without the system is a single business day, and the most data that it can lose from this system is the most recent 4 hours of data entry (which can be reconstructed by the clinical staff that day).

You then evaluate the mitigations in place today that allow for a system recovery in the event of a likely disaster (virus or database crash based on the past experience of the practice).  The backup system today only runs once per week, which means that a crash or virus that occurred later in the week would result in more than 4 hours of lost data.  Recovery from the backup device to a new server also appears to require more than a business day, because the practice has no spare server equipment available.  So you would have to start over with the existing server (installing the operating system, database software, and then restoring the data from the backup), or purchase a new server and have it delivered to complete the restore.

The conclusion here is that while there is an existing mitigation for recovery from a likely disaster, the mitigation does not meet the business requirements of the practice.
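In common disaster-recovery terms, the two business requirements above are a recovery point objective (RPO) of 4 hours and a recovery time objective (RTO) of one business day. The hypothetical's weekly backup can be checked against those objectives with a quick sketch (the numbers come from the scenario above; the helper function and the 16-hour restore estimate are illustrative assumptions):

```python
# Check the hypothetical practice's mitigations against its stated
# requirements: at most 4 hours of lost data (RPO) and restoration within
# one business day, taken here as ~8 working hours (RTO). The restore
# estimate is an assumption standing in for "more than a business day."

def meets_objectives(backup_interval_hours: float, restore_hours: float,
                     rpo_hours: float, rto_hours: float) -> dict:
    return {
        # Worst case, a failure just before the next backup loses a full
        # interval's worth of data.
        "rpo_met": backup_interval_hours <= rpo_hours,
        "rto_met": restore_hours <= rto_hours,
    }

# Weekly backup (168 hours between backups) and a rebuild-from-scratch
# restore, slow because no spare server is on hand.
print(meets_objectives(backup_interval_hours=168, restore_hours=16,
                       rpo_hours=4, rto_hours=8))
```

Both objectives fail under these assumptions, which restates the conclusion above: the existing mitigation does not meet the practice's business requirements.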

Budget for Sufficient New Mitigations

Once you have your list of unmitigated or insufficiently mitigated risks, the next step is to look for mitigations that you could implement on your network.  A mitigation might be a disaster recovery system or service, or it might be some other service or product that can be purchased (like anti-virus software, a hardware warranty, a staff person, etc.).  At this point, the help of a technical consultant may be required if you don’t have your own IT department.  The consultant’s role here is to advise you about what you can do and what the likely costs are to purchase and implement the solution which will meet your business requirements based on your likely risks for disasters.

Once sufficient solutions have been identified, the next step is to purchase a solution and implement it.  From there, testing is key as noted above.  An untested plan is not much of a plan.

Disaster Recovery and the Japanese Tsunami

The art of disaster recovery is to plan for what may be the unthinkable while balancing mitigations that are both feasible and reasonable for your organization’s resources and circumstances.  On March 11, Japan was struck by a massive earthquake and tsunami that caused enormous destruction, estimated at a total loss of $310 billion.  Over the last several weeks, one of the major failures has been at the nuclear power complex in Fukushima, home to six nuclear power plants.  As of the writing of this post, this disaster continues: at least two of the plants remain in a critical state because of a failure of the complex’s power and backup power systems that helped to control the temperature of the nuclear fuel rods used to generate power at the plants.

As an unfortunate consequence, many people have been exposed to more radiation than normal, food grown in the area of the plant has shown higher levels of radioactive materials than normal, radioactive isotopes in higher-than-normal concentrations have been detected in the ocean near the plants, and numerous nuclear technicians have been exposed to significant radiation, resulting in injuries and hospitalizations.  As far as disasters go, the loss of life and resources has been severe.  And like other major environmental and natural disasters, the effects of the earthquake and tsunami will be felt for years by many people.

Natural disasters like this one cannot be prevented.  We lack the technology today to effectively predict or control for these kinds of events.  And while these larger scale disasters are relatively rare, planners still need to assess the relative likelihood of such events, and develop reasonable mitigation plans to help an entity recover should such a disaster occur.  Computerized health records present an opportunity to permit recovery in that the data housed by these systems can be cost-effectively backed up and retained at other secure locations, permitting system recovery and the ability to continue operations.  In contrast to digital files, paper records are far less likely to be recovered were a tsunami or other similar natural disaster to occur and wash the records away.

Even the best recovery plan, however, will be severely tested should a major disaster be realized.  Japan was hardly unprepared for a major earthquake, and still is struggling to bring its nuclear facilities under control nearly three weeks later.  However, having a plan and testing it regularly will increase the odds of recovery.  My thoughts are with the Japanese during these difficult times.