Final HIPAA Security Regulations and EHRs

Note: this article was originally published in Maryland Physician Magazine in its May/June 2013 issue.

The HiTech Act in 2009 set in motion a series of changes to the HIPAA rules that govern the use, disclosure and protection of protected health information (“PHI”).  The Department of Health and Human Services (“HHS”) subsequently issued interim regulations in response to these changes in the law, and this year issued a final regulation, effective March 26, 2013, that requires compliance by covered entities and business associates within 180 days.  These final HIPAA security regulations make a number of important changes that may affect your relationship with vendors that provide electronic health record (“EHR”) licensing and support.

First, prior to HiTech, business associates of covered entities were not required to comply with the security rules and standards set forth in the HIPAA security regulations.  HiTech changed the applicability of the security regulations to include business associates.  The final regulation from HHS implements this provision of the HiTech Act, but with a twist: subcontractors to business associates are also defined as business associates within the final regulation.  What this means is that EHR vendors and their subcontractors must fully comply with the HIPAA security rules, not just with “reasonable” security measures.

Second, prior to HiTech, there was no federal requirement that a covered entity or business associate report a security breach that resulted in the disclosure of PHI.  HHS subsequently issued interim regulations to implement these notification requirements, and on March 26, 2013, HHS’s final regulations took effect, altering the assumptions and exceptions that define a “breach” under HIPAA.  In addition, business associates and subcontractors are now obligated to report security breaches to covered entities.

For providers at the beginning of their search for an EHR vendor, have an attorney review any proposed contract between your organization and the vendor to ensure that the business associate provisions comply with the final regulations.  If you already have an existing relationship, work with your attorney to ensure that the contract in place complies with the final regulatory requirements.  All business associate agreements must come into compliance with the final regulations by September 2014.

In recent years, some EHR vendors have moved to “cloud”-based data storage and access solutions for their clients.  These cloud systems are designed so that provider data collected by the EHR is stored at a remote data center and made available to the provider over an internet connection.  Some EHR vendors subcontract with a third party to provide the cloud data storage.  More likely than not, that subcontractor is now a business associate under the final regulations and takes on the same obligations as the EHR vendor with regard to your data.  The final regulations require that a covered entity’s contract with its business associate require subcontractor compliance with the final security regulations.

Beyond compliance issues, providers will want to evaluate whether an EHR vendor that hosts your data in the “cloud” has really made sufficient provisions for security.  Such an evaluation makes good business sense because of the incredibly negative consequences of any security breach that results in a loss of PHI for a health care provider.  For example, does the vendor comply with a recognized, national security standard (like NIST)?  Is the EHR vendor, or the data center it uses for storing your data, audited against a SAS standard like SAS-70?  What are the security practices and security devices in place at the EHR vendor to protect your data?  If the vendor will host your data, what are its disaster recovery and data backup procedures?  Are those procedures regularly tested?

Providers and their counsel should also evaluate what, if any, additional provisions should be negotiated into any final agreement with the EHR vendor concerning the vendor’s compliance with a security standard, commitment to security procedures, and related obligations (such as maintaining appropriate border security and/or appropriate encryption for data during its transmission).

The changes in HIPAA compliance mean that providers cannot simply treat EHR vendors as a “black box” into which providers place PHI, relying on the EHR vendor’s representations that it knows best regarding security.  In addition, because the scope of HIPAA now extends beyond covered entities and business associates to most subcontractors of business associates that handle PHI, more entities are at risk for substantial fines for failing to comply with the applicable security standards.  All providers should work with their counsel to analyze and address compliance with the final regulations.

Reported PHI Breaches

The Department of Health and Human Services (“HHS”) maintains an online list of covered entities and business associates that have experienced PHI breaches where more than 500 individual patient records were involved.  As of the writing of this post, a total of 572 reported breaches are listed on this website.  What can we learn from this information?

First, the dataset covers breaches reported from September 2009 through February 2013.  More than 21 million patient records are listed in this report (though it is likely that some patient records are duplicated across the data breaches reported here).  Together, these incidents involve fewer records than the single data loss reported by the Department of Veterans Affairs in 2006, when a laptop containing in excess of 26 million records was stolen from an employee’s home.  Nonetheless, a significant amount of PHI has been lost or stolen and reported to HHS over the last three and a half years.

Second, the most common scenario for PHI breaches is the loss of backup tapes, followed by theft.  Almost 6 million patient records were affected by lost tapes alone.  The theft or loss of a laptop came in fourth, affecting about 2.3 million patient records.  Theft overall accounted for more than one third of all records compromised, followed by loss (which probably includes scenarios like “we accidentally put the backup tapes in the dumpster” or “the tape fell out of my bag between the office and my car”), which accounted for about another third of all records compromised.  Hacking appears farther down the list, affecting a total of 1.3 million patient records.

Third, a little more than half of the patient records breached appear to involve a business associate of a covered entity.  However, only 92 of the 572 data breaches note a business associate’s involvement, which suggests that when a business associate is involved, more records on average are affected by the data breach.  This is consistent with the expectation that technology vendors, like those that implement and/or host electronic health records, often do so for many clients and are therefore a bigger target for data theft, hacking and computer viruses.
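The figures above support a rough back-of-the-envelope comparison.  The sketch below is illustrative only: the totals (572 breaches, roughly 21 million records, 92 breaches noting a business associate) come from the HHS data summarized above, and the “little more than half” share of records is an assumption pegged at 11 million.

```python
# Back-of-the-envelope comparison of average breach size when a
# business associate is involved versus when one is not.
# Figures approximate the HHS breach report summary above; the 11M
# split is an assumed reading of "a little more than half."

TOTAL_BREACHES = 572
TOTAL_RECORDS = 21_000_000
BA_BREACHES = 92                 # breaches noting a business associate
BA_RECORDS = 11_000_000          # assumed: "a little more than half"

other_breaches = TOTAL_BREACHES - BA_BREACHES
other_records = TOTAL_RECORDS - BA_RECORDS

avg_ba = BA_RECORDS / BA_BREACHES
avg_other = other_records / other_breaches

print(f"Average records per BA breach:     {avg_ba:,.0f}")
print(f"Average records per non-BA breach: {avg_other:,.0f}")
```

On these assumptions, a breach involving a business associate averages roughly 120,000 records, versus roughly 21,000 otherwise, which is consistent with the observation that vendor breaches tend to be larger.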

With the change in breach notification in the final HIPAA regulations recently issued by HHS, it will be interesting to see whether more breach notifications are published to HHS’s website.

Changes in HIPAA Breach Notification Rule

HHS recently released the final regulations that revise certain provisions of HIPAA, including the HIPAA breach notification rule.  Congress, in enacting the HiTech Act in 2009, included a statutory requirement that covered entities report breaches that involved the unauthorized access or loss of protected health information (“PHI”).  HHS then promulgated an interim rule to implement this statutory provision.  That interim rule required reporting of the breach under the “significant risk of financial, reputational or other harm” standard.  Criticism was subsequently leveled at this standard as being too subjective.  HHS just recently issued its final rule (effective on March 26, 2013) that changes the breach reporting rule in two ways.

First, if there is a breach that involves PHI, and the breach does not fall within a regulatory exception, the presumption of the regulation is that the breach must be reported.  This means that a party that experiences a loss of PHI cannot assume, on the grounds that the loss was unlikely to cause significant harm to the patients, that notification of the breach is not required.

Second, the final regulation replaces the interim rule’s standard with a requirement that the party who experienced the loss must demonstrate that there is a low probability that the PHI has been compromised.  In order to qualify under this new standard, the party must perform a risk assessment, taking into account at least the four factors outlined in the regulation.  These factors are found in § 164.402(2):

(i) The nature and extent of the protected health information involved, including the types of identifiers and the likelihood of re-identification;

(ii) The unauthorized person who used the protected health information or to whom the disclosure was made;

(iii) Whether the protected health information was actually acquired or viewed; and

(iv) The extent to which the risk to the protected health information has been mitigated.

So, let’s evaluate some typical hypothetical scenarios that involve the loss of PHI.  The most common reported PHI breach involves lost data backup tapes.  By design, a backup tape usually contains the entire database of patient records, because the entire dataset would normally be required to restore the data from the backup.

Under the first factor, such a loss would militate towards breach notification, because the dataset would almost certainly include patient identifiers and, if the backup was of an electronic health record, extensive health information on each patient.  Under the second factor, if the tape was merely lost, there is no determination of who might have had unauthorized access to the PHI.  If, for example, the backup tape was just simply lost by a contractor that stores the backup tapes in a vault for retrieval on demand, this factor might lean towards not making a notification.  On the other hand, if the tape was in the trunk of the network administrator’s car, and the car was stolen, this factor might lean towards making a notification.

As to the third factor, a lost data tape alone, without more information, would not inform us whether the data was actually acquired by anyone, or viewed by someone.  There is certainly the potential that a lost tape could be viewed, assuming that the person that obtained it had access to a compatible tape drive.  But based on what we know, this factor is probably neutral.

As to the fourth factor, the question here is whether the backup tape itself was encrypted, or was stored in a locked storage box.  A tape that is encrypted is much harder to access, even if the tape was intentionally stolen to obtain unauthorized access to PHI.  A tape in a locked storage box that was merely lost may be less likely to be accessed by an unauthorized user.  So this factor may swing either way based on what, if any, mitigations were in place to protect the data on the backup tape.

If we assumed that no mitigations were in place, the overall analysis would lean towards breach notification under the new rule.  As you can see, however, the facts and circumstances matter greatly in evaluating whether a breach has occurred that requires notification.
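The backup-tape walkthrough above can be expressed as a simple decision sketch.  This is not a substitute for a legal risk assessment: the factor names track § 164.402(2), but the low/neutral/high ratings and the decision rule (notify unless every factor supports a low probability of compromise) are simplifying assumptions for illustration.

```python
# Illustrative sketch of the four-factor breach risk assessment under
# the final rule.  The regulation presumes a reportable breach; the
# covered entity must demonstrate a low probability that the PHI was
# compromised.  Ratings and the decision rule here are simplifications.

FACTORS = (
    "nature_and_extent",    # (i) identifiers involved, re-identification risk
    "unauthorized_person",  # (ii) who used or received the PHI
    "acquired_or_viewed",   # (iii) was the PHI actually acquired or viewed
    "mitigation",           # (iv) encryption, locked storage, other safeguards
)

def must_notify(ratings: dict) -> bool:
    """Presume notification is required unless every factor supports a
    low probability that the PHI has been compromised."""
    return not all(ratings.get(f) == "low" for f in FACTORS)

# The lost, unencrypted backup tape from the discussion above:
lost_tape = {
    "nature_and_extent": "high",      # full patient database with identifiers
    "unauthorized_person": "unknown",
    "acquired_or_viewed": "unknown",  # neutral: no evidence either way
    "mitigation": "high",             # no encryption, no locked storage
}

print(must_notify(lost_tape))  # the presumption of breach is not rebutted
```

As the text notes, the outcome turns entirely on the facts: change the mitigation rating (say, an encrypted tape in a locked box) and the analysis may come out the other way.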

Changes in HIPAA Compliance

The HiTech Act set in motion a series of changes to Health Insurance Portability and Accountability Act (“HIPAA”) compliance for covered entities and business associates in 2009, which were followed by interim regulations issued by the Department of Health and Human Services (“HHS”).  HHS has issued a final regulation that goes into effect on March 26, 2013, and requires compliance within 180 days by all covered entities and business associates.

The HiTech Act made a number of important changes to the law governing the security and disclosure of protected health information.  First, prior to HiTech, business associates of covered entities were not required to comply with the security rules and standards set forth in the HIPAA security regulations.  HiTech changed the applicability of the security regulations to include business associates.  The final regulation from HHS implements this provision of the HiTech Act.

Second, prior to HiTech, there was no federal requirement that a covered entity or business associate report a security breach that resulted in the disclosure of protected health information (“PHI”).  HHS subsequently issued interim regulations to implement these notification requirements, and as of March 26, 2013, HHS issued final regulations that alter the assumptions and exceptions to what constitutes a “breach” under HIPAA.

Business Associates are Covered Entities when it comes to PHI

HiTech initially changed the law governing PHI by requiring that business associates comply with the same security regulations that govern covered entities.  The final regulations from HHS clarify which security rules also apply to business associates under sections 164.104 and 164.106, including the applicable rules found in Parts 160 and 162.  However, HHS also expanded the definition of “business associate” to include subcontractors of business associates that handle PHI on behalf of the business associate for the covered entity.  The regulation does provide certain narrow exceptions to who is now covered by the definition of a “business associate,” including an exception for “conduits” of PHI that may, on a transitory basis, transmit PHI but would not access the PHI except on a random or infrequent basis.  But the regulation appears to generally extend the legal responsibilities, and potential liability, of members of the industry that work even indirectly for covered entities.

For existing health care providers, now might be the time to revisit your business associate agreement with your business associates, such as your EHR vendors.  Section 164.314 establishes certain requirements for these agreements, including provisions that all business associates comply with the full security rule, that subcontractors to business associates also comply with the full security rule, and that business associates provide the covered entity with security incident reporting in the event of a breach at the business associate’s or subcontractor’s facility or systems.

Changes in Security Breach and Notification

HiTech also introduced a breach notification provision intended to require covered entities to report to HHS and, where appropriate, to patients affected by a security breach involving their PHI.  The final regulations have modified the definition of a “breach” by establishing the presumption that an unauthorized access of PHI is a breach unless the covered entity or business associate can demonstrate that there is a low probability that the PHI has been compromised.

Such a demonstration requires that the covered entity or business associate conduct a risk assessment and evaluate at a minimum the four factors described in the regulation: “(i) the nature and extent of the protected health information involved, including the types of identifiers and the likelihood of re-identification, (ii) the unauthorized person who used the protected health information or to whom the disclosure was made, (iii) whether the protected health information was actually acquired or viewed, and (iv) the extent to which the risk to the protected health information has been mitigated.”

Altering the burden and requiring a covered entity or business associate to engage in this risk assessment is likely to increase the number of breach notifications required under the final regulation.

The final regulation includes a variety of other changes in requirements for covered entities and business associates not discussed in this article, such as sale and marketing of PHI, use of genetic information for insurance underwriting, notices to patients of privacy practices, and disclosure of PHI to friends and families of decedents.  Providers should promptly examine their privacy and security policies to ensure compliance with the final regulations.

Meaningful Use Stage 2 Regulations Released

The Meaningful Use Stage 2 proposed rule was released earlier this week.  You can download a copy of the full 455-page regulation here: MU Stage 2 Proposed Rule.  For those keeping score at home, there are three stages of “meaningful use” as that term is defined in section 495.6 of the regulations.  Stage 1 set certain Core (required) and Menu (pick from the list to implement) Criteria, and established minimum compliance metrics for an “eligible professional” to qualify for the federal incentives.  The original regulations that defined “meaningful use” indicated that there would be future changes to the definition in two more stages.  We initially expected Stage 2 to be defined for compliance in 2013.  However, the regulations have pushed out compliance for Stage 2 to 2014.  This article will take a look at what’s been proposed for Stage 2.

First off, there are more “Core” (required) Criteria in Stage 2.  Stage 1 had a total of 15 Core Criteria, some of which any certified electronic health record would have to meet (such as collecting certain demographic and vital signs data for patients seen in the office).  In addition, there were several Core Criteria for which, when originally published, no one had yet defined how you might actually comply.  For example, one Core Criterion in Stage 1 required providers to submit certain quality data to either CMS or their State Medicaid program.  But when the regulations were published, no one had indicated what data, exactly, was required or how it was to be provided.  The metric in Stage 1 was merely the ability to submit a test file.

Stage 2 has 17 total Core Criteria.  In several cases, CMS has proposed to terminate a prior Stage 1 Core item entirely in Stage 2.  And in a number of cases, Criteria that were previously on the “Menu” in Stage 1 are now incorporated as Stage 2 Core Criteria.  For example, structured lab data, patient lists by specific condition for use in a quality improvement initiative, patient reminders, patient access to electronic health information, patient education resources, medication reconciliation for transition of care, care summary for patients transitioned to another provider, and data submission to an immunization registry were all Menu Criteria in Stage 1 and are now Core Criteria in Stage 2.

Also, where a Stage 1 Criterion was kept, the minimum compliance percentage has increased, in some cases substantially, in Stage 2.  For example, where a 50% compliance rate was sufficient in Stage 1 for collecting patient smoking status, in Stage 2 the minimum compliance rate is 80%.  In Stage 1, a single decision support rule needed to be implemented for compliance.  In Stage 2, five such rules must be implemented.

As for the Menu Criteria, Stage 1 required that an eligible provider implement 5 of the 10 on the list.  In total, therefore, a provider had 20 Criteria to meet to achieve meaningful use.  In Stage 2, there are only 5 Menu Criteria, and the provider must meet at least three.  So the total number of required criteria is no different, but providers have fewer Menu Criteria to choose from.  In addition, the Menu Criteria in Stage 2 include three interfaces with specific state or public health registries; the remaining two involve access to imaging results in the EHR and storing family health history in a structured data format.  You may be able to waive out of some of these if there isn’t a way in your state to submit surveillance or other registry data electronically.  However, if you elect to implement one of these interfaces, the compliance requirement under Stage 2 is a full year of data submission to the registry (not just submitting a test file).  If you plan on doing one of these, start early to make sure you can hit the compliance target by 2014.
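The criteria arithmetic above can be checked in a few lines.  The counts come from the discussion in this article (15 Core plus 5 of 10 Menu in Stage 1; 17 Core plus 3 of 5 Menu in Stage 2); everything else is simple addition.

```python
# Total criteria an eligible provider must meet under each stage,
# using the counts discussed above.

stage1 = {"core": 15, "menu_required": 5, "menu_available": 10}
stage2 = {"core": 17, "menu_required": 3, "menu_available": 5}

for name, stage in (("Stage 1", stage1), ("Stage 2", stage2)):
    total = stage["core"] + stage["menu_required"]
    print(f"{name}: {stage['core']} Core + {stage['menu_required']} of "
          f"{stage['menu_available']} Menu = {total} total")

# Both stages require 20 criteria in total, but Stage 2 shifts the mix
# toward required Core Criteria and leaves fewer Menu choices (5 vs. 10).
```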

Overall, Stage 2 appears to “up the game” for providers who wish to continue to receive incentive payments in the out years of the program.  The Stage 2 rules that were published this week are proposed rules.  The public has 60 days to submit comments.  After that, CMS will publish a final rule, taking into account comments made during the comment period.  While it is possible that CMS may back down on some of these measures, providers should plan to comply with much of this Rule.  Talk with your EHR vendor, consultant, MSO or other service providers to analyze and plan for compliance.

Living in the Cloud(s)

I wrote about cloud computing in an earlier post and discussed some of the general pros and cons involved with the idea.  For attorneys, doctors and other professionals that are regulated, cloud computing creates some new wrinkles.  For attorneys, protecting the confidences of clients is an ethical obligation.  The unauthorized disclosure of client secrets can lead an attorney to disciplinary action and disbarment.  For physicians and other health care providers, federal laws on the privacy of patient information put providers at risk for substantial fines for inappropriately disclosing patient health information (or otherwise not complying with HIPAA’s privacy and security rules).  Using the cloud for applications that might have such confidential information adds a layer of uncertainty for the practitioner.

On the other hand, cloud computing is coming to a practice near you whether you like it or not.  For example, an increasing number of attorney practice management systems are cloud-based, such as Clio.  Legal research tools like FastCase, LexisNexis, Westlaw and Google Scholar are all cloud-based systems (in the sense that the information being searched is not stored on your local network but in internet-based database repositories that you access through your web browser).  And a growing number of email providers, including Google Apps for Business, Mailstreet.com, and others have been providing cloud-based email solutions for custom domain names.

State bar ethics groups and the ABA have been working on ethics opinions about these cloud-based systems.  North Carolina’s Bar had initially proposed a restrictive rule on the use of cloud computing systems by attorneys in the state.  The NC Bar had suggested that the use of web-based systems like directlaw.com (which allows clients to complete a questionnaire online for specific legal documents which are reviewed by an attorney before becoming final) represented a violation of the state’s ethics rules.  However, the NC Bar later revised its opinion and indicated that cloud computing solutions can be acceptable, so long as the attorney takes reasonable steps to minimize the inadvertent disclosure of confidential information.  “Reasonable,” a favorite word of attorneys for generations, has the virtue and vice of being subject to interpretation.  However, given the pace of change of technology, a bright line rule that favors one system over another faces prompt obsolescence.

In the context of the NC Bar 2011 Formal Opinion 6, for software-as-a-service (“SAAS”) providers, ethics considerations include: (a) what’s in the contract between the vendor and the lawyer as to confidentiality, (b) how the attorney will be able to retrieve data from the provider should it go out of business or the parties terminate the SAAS contract, (c) an understanding of the security policy and practices of the vendor, (d) the steps the vendor takes to protect its network, such as firewalls, antivirus software, encryption and intrusion detection, and (e) the SAAS vendor’s backup and recovery plan.

Can you penetrate past the marketing of a vendor to truly understand its security practices?  For example, Google does not even disclose the total number of physical servers it uses to provide you those instant search results (though you can learn where its data centers are – there is even one in Finland as of the writing of this article – here).  And, in spite of Google’s security vigilance, Google and the applications it provides have suffered periodic outages and attacks, such as the Aurora attack on Gmail that became known in 2010.  Other data centers and service providers may be less transparent about these security issues.  In some cases, the opacity is a security strategy.  Just as the garrison of a castle wouldn’t advertise its weak spots, cloud providers aren’t likely to admit to security problems until either the breach is plugged or the breach is irreparable.

What’s your alternative?  For you Luddites, perhaps paper and pencil can’t be hacked, but good luck if you have a fire, or a disgruntled employee dumps your files in a local dumpster for all to see one weekend.  For those of you who want computer systems in your practice, can you maintain those systems in-house in a cost-effective manner?  Do you have the resources to keep up with the software and hardware upgrades, service contracts, backup and recovery tests, and security features needed to reasonably protect your data?  How does that stack up against professional-grade data centers?  Are you SAS 70 or SSAE 16 compliant?  Do you know how the data you access is encrypted?  In functional terms, do you really exercise more effective control over your security risks with IT employees than with a data center under a reasonable commercial contract?

There are a lot of considerations.  And the best part?  They keep changing!

Data Breach: No Joke

As recently noted by the New York Times in this article, health data for nearly 11 million people has been inadvertently disclosed in violation of patient privacy.  Electronic health records systems alone are not to blame: the improper disposal of paper medical records in dumpsters has happened more than once (the HHS website notes 23 reports from 2009-2010 of data breaches that exposed 500 or more paper patient records in one way or another).  However, computer databases make it easier to disclose larger amounts of health data than in the paper records days of yore.  As part of the American Recovery and Reinvestment Act of 2009, Congress enacted federal reporting requirements in the event of a data breach by a covered entity.  For the entire law, click here: ARRA Enrolled Bill.

Section 13402 provides the statutory basis for requiring a covered entity to report to the Secretary of Health and Human Services when the security of protected health information is breached.  Both individual notice to the persons affected by the data breach and public notification via the local media are required when more than 500 individuals’ information has been lost in a breach.  In addition, the covered entity is required to advise the Secretary in the event of a breach affecting more than 500 individuals (if fewer than that, the entity can keep a log and submit it at the end of the year).
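The reporting thresholds described above can be summarized as a small rule sketch.  This is a simplification for illustration: the 500-individual threshold and the notice channels follow the statute as described above, but timing details and media-market rules are omitted.

```python
# Simplified sketch of Section 13402's breach notification channels,
# keyed to the number of individuals affected.  Timing rules and
# media-market details are omitted.

def required_notices(individuals_affected: int) -> list:
    notices = ["individual notice to each affected person"]
    if individuals_affected > 500:
        notices.append("public notification via local media")
        notices.append("prompt notice to the HHS Secretary")
    else:
        notices.append("log the breach and submit the log annually to HHS")
    return notices

print(required_notices(1200))
print(required_notices(45))
```

A breach affecting 1,200 individuals triggers individual, media and immediate Secretary notice; a 45-person breach requires individual notice plus an entry in the annual log.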

Patients may suffer identity theft and public embarrassment when their health information is lost by a covered entity.  And, if the breach is substantial enough, the covered entity may lose patients and clinical revenue as a result.  Health care providers can reduce the possibility of such data losses by having strong policies and internal database controls that limit access to, and the portability of, data by their employees and contractors.  Unfortunately, the problem of data loss (whether by accident or because of hacking) appears not to be improving, in spite of a number of sentinel events in the last few years, including the loss of a laptop with health data on over 20 million veterans served by the Veterans Administration.