OMB outlines approach for cloud computing security

In the latest follow-up to the “cloud first” policy advocated by former federal CIO Vivek Kundra and the Federal Cloud Computing Strategy issued last February, OMB released new federal policy guidance directing agencies to use the requirements, security assessment procedures, and cloud service authorization packages developed under the Federal Risk and Authorization Management Program (FedRAMP). In a memo to agency CIOs entitled “Security Authorization of Information Systems in Cloud Computing Environments,” current federal CIO Steven VanRoekel (who replaced Kundra in the position in August) outlined key aspects and anticipated benefits of the FedRAMP program, detailed expectations for executive agencies, and explained the role of the FedRAMP Joint Authorization Board (led collectively by the Department of Homeland Security, the Department of Defense, and the General Services Administration).

FedRAMP is a multi-agency collaborative effort managed by GSA that provides a standard process for assessing cloud service providers against FISMA requirements and the security control framework specified in Special Publication 800-53. Under this process, cloud service providers seeking to do business with government agencies hire approved third-party assessment organizations to conduct independent reviews of their security and to produce system security plans, security assessment reports, and other Risk Management Framework documentation that the FedRAMP Joint Authorization Board can use to decide whether to authorize the cloud service providers for use by government agencies. This approach essentially establishes pre-authorized cloud computing providers, so that individual agencies can avoid incurring the time and resource costs ordinarily required to perform an agency-specific assessment.

FedRAMP represents something of a departure from standard federal acquisition practices, as cloud service providers will apply directly to the FedRAMP program when seeking authorization, potentially allowing them to first complete authorization and then compete for government agency contracts for cloud services. In contrast, when GSA awarded the first federal cloud computing contracts in 2010 (to 11 companies to provide infrastructure as a service offered through Apps.gov), all of the service providers receiving prime contracts still needed to demonstrate FISMA compliance and achieve authorization to operate.

TRICARE data breach shows (again) why encryption of removable media is essential

The Department of Defense’s TRICARE program disclosed last week that backup tapes containing medical records on nearly 5 million active-duty and retired military personnel and their dependents were stolen from the car of a contractor who was transporting the tapes. According to spokesmen for TRICARE and the contractor (SAIC) quoted in the media, only some of the personal information on the tapes had been encrypted prior to backup, and that encryption apparently did not satisfy government standards for the strength of cryptographic modules. More surprising is a statement attributed to a TRICARE spokesman that the military healthcare provider does not have a policy on encryption of backup tapes. The TRICARE Management Activity (TMA) provides a link on its website to a June 2009 memo from DoD Senior Privacy Official Michael Rhodes that issues department-wide policies on “Safeguarding Against and Responding to the Breach of Personally Identifiable Information (PII).” This memo, among other provisions, refers to the DoD’s statutory obligations for protecting PII, specifically citing government-wide guidance from OMB in Memorandum M-07-16. This OMB memo, and M-06-16 that preceded it, require all federal agencies to encrypt agency data stored on portable devices, using encryption that complies with the FIPS 140-2 standard. The language in M-06-16 is even more explicit, directing agencies transporting or storing PII offsite to use encryption during transport and for storage at a remote site. The DoD also has policies in place requiring that all electronic records containing personally identifiable information be categorized at either moderate or high impact levels, and mandating encryption at rest (including storage on removable media) for all data categorized as high impact.
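To make the requirement concrete, here is a minimal sketch of what encrypting backup data before it reaches removable media might look like. It uses Python’s cryptography library with AES-256-GCM; the file name, key handling, and data are illustrative assumptions, not TRICARE’s or SAIC’s actual process, and meeting FIPS 140-2 is a property of the validated module performing the encryption, not of the algorithm choice alone.

```python
# Illustrative sketch only -- not TRICARE's or SAIC's actual backup process.
# Encrypt backup data with AES-256-GCM before writing it to removable media,
# so that a stolen tape or disk yields only ciphertext.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_backup(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt backup data with AES-256-GCM; returns nonce + ciphertext."""
    aesgcm = AESGCM(key)      # key must be 32 bytes for AES-256
    nonce = os.urandom(12)    # 96-bit nonce, unique per encryption
    return nonce + aesgcm.encrypt(nonce, plaintext, None)

# In practice the key would come from a managed key store rather than being
# generated inline; key management is usually the hard part.
key = AESGCM.generate_key(bit_length=256)
record_data = b"name: J. Smith; diagnosis: ..."  # stand-in for backup contents
with open("backup.enc", "wb") as f:
    f.write(encrypt_backup(record_data, key))
```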

Reports of this latest breach often note that the number of individuals potentially affected makes it the largest breach of protected health information since the federal health data breach notification and disclosure rules went into effect in September 2009. Those rules provide an exception for lost, stolen, or otherwise compromised health data that is encrypted, giving healthcare organizations a strong incentive to implement encryption even where it is not required (under the HIPAA Security Rule, encryption of electronic PHI in transit and at rest is “addressable” rather than required). For government entities, however, there seems to be little basis on which to argue that encryption is optional: even where PHI-specific policies allow for discretionary use of encryption as a security control, agency-level and federal policies on the protection of all personally identifiable information obligate agencies to use encryption for data in transport.

Supreme Court will hear case on GPS tracking, warrants, and the 4th Amendment

The U.S. Supreme Court has scheduled oral arguments for November 8, 2011 in United States v. Jones, an appeal by the government of an August 2010 D.C. Circuit Court ruling that continuous monitoring of a GPS tracking device placed on a suspected drug trafficker’s vehicle without a warrant violated the suspect’s 4th Amendment rights. The diversity of opinions from courts at multiple levels over the past couple of years helped increase the probability that the Supreme Court would take up the issue, as the cases brought before the courts address classes of technology and tracking capabilities that go far beyond what was envisioned when the current laws were enacted or when major precedent cases like United States v. Knotts were decided. A recent New York Times article calls United States v. Jones “the most important 4th Amendment case in a decade” and compares the government’s efforts to use comprehensive surveillance technologies to the “Big Brother” state described in George Orwell’s 1984. Perhaps more notable is the potential for the Supreme Court’s attention to these issues to prompt a more comprehensive review of the outdated laws and regulatory practices that are so often unsuccessfully applied to modern communications technologies. Some members of Congress have repeatedly tried to get traction on overhauling the Electronic Communications Privacy Act (ECPA), enacted in 1986, to bring its provisions in line with current technology and possibly with revised social norms about what constitutes reasonable expectations of privacy. These efforts often focus on geolocation data which, while certainly not the only product of new technology poorly addressed by current laws, seems to raise the widest range of perspectives and open questions. Beyond the disposition of the current case, it will be interesting to see whether judicial action prompts any legislative response.

VA decision to allow iPad use without FIPS certification provides good example of risk-based decision making

The decision by Department of Veterans Affairs CIO Roger Baker to allow users to connect mobile devices such as the Apple iPad and iPhone to the agency’s computing network provides a good example of the trade-off many organizations face among security, user desires, and practical business considerations. It also illustrates the subjectivity inherent in security management decisions and the authority delegated to federal agency executives to apply their own risk tolerance to those decisions. Baker was quoted in a Nextgov article in July acknowledging that the security software is not FIPS certified, but indicating that he is willing to accept the risk of allowing the devices to be used anyway, on the assumption that even without FIPS certification the encryption technology is sufficient to provide the needed protection. While the VA prepares for broader support for mobile devices this fall, it is operating a pilot program with Apple devices. Baker is participating in the pilot, and according to FederalTimes.com has traded his own laptop for an iPad.

From a security standpoint, the VA’s plan to allow agency-issued and personal mobile devices to access Departmental networks is most noteworthy because the devices in question do not yet satisfy federal standards for encryption. This is a particularly sensitive issue for the VA, which has a checkered history when it comes to data breaches, including the well-publicized 2006 theft of a VA laptop containing unencrypted records on some 26.5 million veterans. To be fair, Apple devices do offer encryption capabilities, but the software used to do so is not certified compliant with Federal Information Processing Standard (FIPS) 140-2, and so fails to satisfy federal security requirements for cryptographic modules. Apple is currently in the process of validating its cryptographic modules for both the iPhone and iPad through the National Institute of Standards and Technology’s Cryptographic Module Validation Program. According to NIST’s “Modules in Process” list, both Apple modules are in the first phase of the process, called “implementation under test,” meaning Apple has a testing contract in place with a cryptographic security and testing lab and has provided the module and all required documentation to the lab. Although validation is still in its early stages, this progress may give the VA and other agencies some degree of confidence that FIPS certification is pending, making the risk associated with running uncertified security a temporary issue.

The fact that the VA can independently decide to essentially waive a federal technology standard reflects the authority that most federal agencies have under current law and policy. The majority of agencies are self-accrediting when it comes to determining the appropriate security measures to put in place to adequately protect enterprise information and other assets. Federal agencies are expected to apply risk-based decision making to security management practices, and since the authority rests with each agency, decision makers weigh the risk to the organization from using a given system or technology against the benefits offered and the cost of implementing security safeguards. Different organizations (and different decision makers within those organizations) have different appetites for risk, so what may be acceptable to one agency might be unacceptable to another. In the VA’s case, it seems likely that Baker is not demonstrating an especially high risk tolerance, but rather that the perceived risk of using encryption that has not yet achieved FIPS certification is not high enough to preclude the use of mobile computing devices in health care delivery settings.

One potential downside of using encryption that lacks FIPS 140-2 certification lies in the area of breach notification. The new federal health data breach notification and disclosure requirements, which went into effect in September 2009 under the authority of an interim final rule, exempt organizations from having to disclose data breaches if the data is “unusable, unreadable, or otherwise indecipherable to unauthorized individuals,” which HHS declared to mean that the data has either been encrypted or destroyed. The FIPS 140-2 requirements apply whenever government regulations call for the use of security based on cryptographic modules, so the practical interpretation of HHS’ breach disclosure exemption for encrypted data is that such encryption must use FIPS 140-2 certified cryptography. In theory this means that if the VA loses one of its newly network-connected iPads with protected health information on it, it would have to report the breach even if the device had encryption enabled. Practically speaking, the VA already reports many types of data breaches to Congress and to the public to comply with requirements of the Veterans Benefits, Health Care, and Information Technology Act of 2006 (Pub. L. No. 109-461), so the new health data breach rules stemming from the HITECH Act are in many ways redundant to existing practice.
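The reporting logic described above reduces to a simple rule, restated in the minimal sketch below. The function name and inputs are illustrative assumptions, not any official HHS or VA decision tool.

```python
# Illustrative restatement of the disclosure logic described above -- not an
# official HHS or VA decision tool.
def must_report_breach(device_encrypted: bool, fips_140_2_validated: bool) -> bool:
    """A lost device holding PHI triggers notification unless the data was
    encrypted using a FIPS 140-2 validated cryptographic module."""
    safe_harbor = device_encrypted and fips_140_2_validated
    return not safe_harbor

# A lost iPad with encryption enabled but no FIPS validation: still reportable.
print(must_report_breach(device_encrypted=True, fips_140_2_validated=False))  # True
```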

HIPAA “access report” potentially much simpler to implement, more valuable than accounting of disclosures

Among the provisions of the Health Information Technology for Economic and Clinical Health (HITECH) Act garnering significant attention are the changes to existing HIPAA requirements for covered entities to produce an accounting of disclosures of protected health information, and a new proposed requirement that entities and business associates also maintain (and furnish upon request) a record of accesses to individuals’ electronic health records. Both of these measures are addressed in a notice of proposed rulemaking (NPRM) published by HHS in the Federal Register in late May, the comment period for which closed August 1. Public objections to the proposed rules emphasize the administrative burden on health care organizations of collecting and storing the information required for accountings of disclosures and access histories, and the apparent lack of interest among members of the public in requesting this information from health care entities. While the two provisions share obvious functional elements, there are significant differences in both technical feasibility and practical relevance that justify separate consideration of the proposed rules, and in particular suggest that the new access record provision may be more difficult to dismiss using the arguments put forth to date.

Many industry watchers are more familiar with the proposed changes to the accounting of disclosures requirements, since those changes are spelled out in the HITECH Act (at §13405(c)); most notably, they shorten the time period covered from six years to three and remove the exception under current HIPAA provisions for disclosures for the purpose of treatment, payment, or health care operations. The NPRM repeats the current implementation specifications (45 CFR §164.528(b)(2)) for the content that must be included for each disclosure in the accounting:

  • Date of disclosure
  • Name of the entity or person (and their address, if known) receiving the disclosed PHI
  • Description of the PHI disclosed
  • Purpose for the disclosure
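As a rough illustration of what capturing these elements would ask of a system, the sketch below models a single accounting entry as a simple data structure. The field names are assumptions drawn from the list above, not any HHS-specified schema.

```python
# Illustrative model of one accounting-of-disclosures entry, following the
# four content elements in 45 CFR 164.528(b)(2); field names are assumptions.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DisclosureEntry:
    disclosed_on: date                 # date of disclosure
    recipient_name: str                # entity or person receiving the PHI
    recipient_address: Optional[str]   # included only if known
    phi_description: str               # description of the PHI disclosed
    purpose: str                       # purpose for the disclosure -- the
                                       # element hardest to capture automatically

entry = DisclosureEntry(
    disclosed_on=date(2011, 9, 29),
    recipient_name="Example Health Plan",
    recipient_address=None,
    phi_description="Claims history, 2010-2011",
    purpose="Payment",
)
```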

These content requirements apply to PHI disclosed in both paper and electronic form, although the objections to the rule seem to focus on electronic disclosure, perhaps due to the inherent limitations of many electronic health record (EHR) systems and other applications in capturing the required information. There are valid procedural objections as well: with the exception of the date of the disclosure, the required content cannot easily be extracted or automatically recorded from EHR systems, and with respect to the purpose for disclosure in particular, it seems likely that capturing this information would insert a step into routine business processes, where the purpose would need to be recorded before the process could be completed. By applying the accounting of disclosures to the types of disclosure that likely make up the vast majority for most health care entities, the removal of the exceptions for treatment, payment, and health care operations will unquestionably add to the administrative workload of covered entities and business associates who must comply with the law.

Before developing the NPRM, HHS issued a request for information in May 2010 seeking comments on the accounting of disclosures changes in the HITECH Act, in which HHS sought to “better understand the interests of individuals with respect to learning of such disclosures, the administrative burden on covered entities and business associates of accounting for such disclosures, and other information that may inform the Department’s rulemaking in this area.” Many commenters apparently pointed to the lack of consumer demand for accountings of disclosures, with few requests received by entities in the several years since the provision was first enacted. It seems possible, however, that while health care organizations will undoubtedly need to devote greater resources to complying with the revised rule, its coverage of a much greater proportion of total disclosures may make the accounting more valuable to individuals than in the past, when those requesting accountings would likely get no information about the most common, and perhaps most intuitive, situations in which disclosures had occurred.

Under the definition used in HIPAA (45 CFR §160.103), a “disclosure” only occurs when information leaves the entity holding it, so the accounting of disclosures only covers release or transfer of PHI from one person or organization to another. The new access report provision has no such limitation, and HHS indicated in its NPRM that it chose to add coverage for access to PHI by members of an entity’s workforce as part of an expanded perspective that includes both internal and external access to information in an individual’s health record. In contrast to the accounting of disclosures, the access report would apply only to records in electronic form, which might make the provision seem somewhat less comprehensive than the accounting of disclosures, but which – intentionally or not – greatly simplifies the collection and maintenance of record access information. The proposed implementation standard for the content of the access report specifies the following information:

  • Date of access
  • Time of access
  • Name of the person accessing the record (if available, or else the name of the entity)
  • What information was accessed, if available
  • Action taken by the user (e.g., create, modify, access, delete)

With the exception of describing what information was accessed, all of the elements proposed in the implementation specification reflect data routinely captured in audit logs, which database-centric systems such as those used to manage EHRs can typically generate automatically. Distinguishing subsets of data within a single record accessed by a user would likely require more granular tracking than many audit logs provide, particularly for read-only events where no data is changed. Even so, the simpler set of required content makes the technical feasibility of this proposed requirement much greater than for the accounting of disclosures. This holds even considering the flexibility organizations are given in providing the name of the person accessing the record; the NPRM acknowledges that producing the first and last name may require mapping the user ID captured in an audit log to a list of full names. According to the NPRM, allowing for entity-level rather than person-level attribution is intended for situations where organizations outside the entity holding the information are given access; employees or contractors working for the entity cannot share authentication credentials, as unique user identification is required under the technical safeguards specified in the HIPAA Security Rule (45 CFR §164.312).
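To illustrate how routinely collected audit data could be turned into the proposed access report, here is a minimal sketch that transforms hypothetical audit-log rows into report entries, including the user-ID-to-name mapping the NPRM mentions. The log format, field names, and user directory are assumptions, not any standard.

```python
# Illustrative sketch: building access-report rows from a hypothetical audit
# log. The log format, field names, and user directory are assumptions.
# Note: the "what information was accessed" element is omitted, since typical
# audit logs do not capture it at that granularity.
import csv
from io import StringIO

# Minimal directory mapping audit-log user IDs to full names, as the NPRM
# anticipates organizations may need.
USER_DIRECTORY = {"jdoe": "Jane Doe", "rsmith": "Robert Smith"}

AUDIT_LOG = """\
timestamp,user_id,patient_id,action
2011-09-28T09:14:02,jdoe,P-1001,access
2011-09-28T09:15:47,rsmith,P-1001,modify
"""

def access_report(log_text: str, patient_id: str) -> list:
    """Return access-report entries (date, time, name, action) for one patient."""
    rows = []
    for rec in csv.DictReader(StringIO(log_text)):
        if rec["patient_id"] != patient_id:
            continue
        d, t = rec["timestamp"].split("T")
        rows.append({
            "date": d,
            "time": t,
            # Fall back to an entity-level label when no individual is known.
            "name": USER_DIRECTORY.get(rec["user_id"], "External entity"),
            "action": rec["action"],
        })
    return rows

for row in access_report(AUDIT_LOG, "P-1001"):
    print(row)
```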

One valid criticism of the proposed access record provision is that HHS seems to assume that relevant organizations have only one electronic record system; hospitals and large health care entities often have multiple systems, so creating the access report would require aggregating audit logs or other data drawn from multiple systems, adding cost and complexity to compliance efforts. Given the nature of the data required, however, the technical barrier to producing an integrated view of audit records may not be too great, particularly for organizations that have implemented standards such as IHE’s Audit Trail and Node Authentication (ATNA) profile, which includes standard formats for audit logs to facilitate integrated audit reporting.
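A rough sketch of that aggregation step follows: per-system logs are merged into a single chronological report, assuming each system can export entries in a common structure (in practice, something like ATNA-formatted audit messages would play that role). The sample entries are invented for illustration.

```python
# Illustrative aggregation of audit entries exported by multiple systems into
# one chronological access report; the shared entry structure is an assumption
# standing in for a common format such as ATNA audit messages.
from datetime import datetime

ehr_log = [
    {"system": "EHR", "timestamp": "2011-09-28T09:14:02",
     "user": "Jane Doe", "action": "access"},
]
lab_log = [
    {"system": "Lab", "timestamp": "2011-09-28T08:02:11",
     "user": "Robert Smith", "action": "access"},
]

def merged_access_report(*logs):
    """Combine per-system logs and sort by timestamp for a single report."""
    combined = [entry for log in logs for entry in log]
    return sorted(combined, key=lambda e: datetime.fromisoformat(e["timestamp"]))

for entry in merged_access_report(ehr_log, lab_log):
    print(entry["timestamp"], entry["system"], entry["user"], entry["action"])
```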

Previous objections to the accounting of disclosures rule often center on covered entities’ prior experience with patients and consumers and the apparent lack of interest by individuals in requesting accountings, based on the few historical requests entities have received. The implication is that there is a lot of administrative overhead to produce a “product” for which there is little demand. This argument rings a bit hollow when applied to access records. Accountings of disclosures to date exclude many, perhaps most, occurrences, and only cover external exchanges of data. The access report is focused as much or more on access by insiders, to provide some insight into routine authorized accesses and, more importantly, to indicate instances of inappropriate access by authorized users. There is ample anecdotal evidence to suggest that inappropriate insider access is all too common, although the most well-publicized incidents tend to involve abuse of privilege to view celebrity medical records.

This type of incident is not limited to health care; recall the State Department contractors who improperly accessed the passport records of the candidates in the 2008 presidential election. Government agencies like the IRS have formal policies against misuse of authorized access privileges to, for instance, browse tax records, such as Internal Revenue Manual 10.8.34.2, which explicitly forbids users from accessing their own accounts or the accounts of friends, relatives, coworkers, other IRS employees, or celebrities. Absent access records, discovery of inappropriate user access must rely on technologies like intrusion detection or auditing systems, and if the latter are in place, it seems a short step to leverage the data already being collected through routine user event monitoring. In large organizations where inappropriate use may be a concern, implementing the mechanisms needed to collect data for access reports under the proposed HIPAA rule, and making employees aware of that data collection, may actually serve as a deterrent to inappropriate behavior.
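Once access data is being collected, flagging suspect patterns really is a short step. The sketch below applies one simple rule of the kind described above (an employee viewing their own record, in the spirit of the IRS policy); the record structure and the employee-to-patient mapping are illustrative assumptions.

```python
# Illustrative misuse check over collected access records: flag users viewing
# their own records, one simple rule of the kind described above. The record
# structure and the employee-to-patient mapping are assumptions.

# Maps employee names to their own patient IDs, where applicable.
EMPLOYEE_PATIENT_IDS = {"Jane Doe": "P-2044"}

access_records = [
    {"user": "Jane Doe", "patient_id": "P-2044", "timestamp": "2011-09-28T12:01:33"},
    {"user": "Jane Doe", "patient_id": "P-1001", "timestamp": "2011-09-28T12:05:10"},
]

def flag_self_access(records):
    """Return access events where an employee viewed their own record."""
    return [r for r in records
            if EMPLOYEE_PATIENT_IDS.get(r["user"]) == r["patient_id"]]

for event in flag_self_access(access_records):
    print("Review:", event["user"], "accessed own record at", event["timestamp"])
```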