A May 6 article in the New York Times (“E.U. To Consider More Stringent Reporting of Data Breaches”) includes quotes and opinions from a number of people suggesting that the European Union may be heading for a comprehensive breach notification law requiring public and private sector organizations to tell people when their personal information has been lost or disclosed. While the vast majority of states in the U.S. have some form of breach notification law, there is not yet a federal standard, with the possible exception of the disclosure requirements for breaches of unsecured personal health information contained in the American Recovery and Reinvestment Act. As noted in the Times article, “Most European countries, including Britain, do not require businesses or other entities to notify the public when they lose personal data, although some do so voluntarily.”
The draft U.S. Information and Communications Enhancement (U.S. ICE) legislation expected to be introduced by Senator Tom Carper (D – Del.) addresses and tries to remedy many of the shortcomings in the Federal Information Security Management Act (FISMA). The feature drawing the most attention recently is the position, and corresponding office, that the legislation would create: an executive branch Director of the National Office for Cyberspace. This new role would provide direct oversight of federal agency security programs (civilian and defense), including reviewing and approving agency information security programs mandated under FISMA. For security architects, there are a number of very interesting provisions in the law. These include:
The two primary information resources with which most people are familiar for security emergency response are the Computer Emergency Response Team Coordination Center (CERT/CC) at Carnegie-Mellon University’s Software Engineering Institute and the U.S. Computer Emergency Readiness Team (US-CERT) run by the U.S. Department of Homeland Security. The CERT/CC has long been a source of valuable information for public and private sector organizations seeking information on current and past security threats and vulnerabilities. Communication with CERT is two-way: security researchers and others send information to CERT about the latest observations from the field, and others look to CERT for up-to-date information about new threats against which protective action should be taken. US-CERT serves as the focal point of communication for U.S. federal agencies that experience security breaches or other incidents; federal guidelines mandate that such incidents (or suspected incidents) be reported to US-CERT within one hour of their discovery. Because of this reporting process, US-CERT has tended to emphasize incident response to known exploits and incidents, while directing agencies to NIST or other sources for guidance on preventive controls. The director of US-CERT announced this week that the organization would shift its focus away from incident response towards prevention. There’s no question that over-emphasizing incident response in the absence of security planning, risk assessment, and preventive controls can lead to a situation in which an organization is always one or more steps behind the attackers, always playing “catch-up” or cleaning up after an intrusion. It seems very risky, however, to openly de-emphasize incident response in the way that US-CERT Director Mischel Kwon does when she says “Incident response should be rare.
Forensics should not be the norm.” She needs to look no further than her own organization, which reports more than a three-fold increase from 2006 to 2008 in the number of security incidents reported by federal agencies, to know that incident response and forensics remain important functions in any security program intended to improve the security posture of an organization. Better and more consistently applied preventive measures, including more robust penetration testing, will hopefully help stem the exponential growth of security incidents. However, until federal (and non-federal) organizations get a better handle on how the bad guys are getting to them, effective incident response will remain a critical component of information security programs.
The emphasis on electronic information exchange among public sector agencies and private sector organizations has increased attention on both technical and non-technical barriers to sharing data among different organizational entities. In many ways, efforts to overcome the non-technical barriers have fallen short of their intended objectives, with the result that would-be participants who are technically able to exchange data choose not to do so, out of concern for how appropriate security and privacy requirements will be honored when the participants in the exchange are not subject to the same or comparable constraints. Some initial attempts to rationalize these differences have focused on information security controls, especially those applied at the system level. This approach cannot arrive at a mutually acceptable level of trust among diverse entities, because information system security drivers and requirements are too subjective. A more effective approach would focus on the data being exchanged, and the privacy and other content-based rules and regulations that apply to it, using these objective requirements to determine both the procedural and the technical safeguards needed to meet the requirements and provide the necessary basis of trust. Further development of this concept and a privacy requirement-driven framework to support it are key focus areas of our current research.
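As a rough illustration of the data-centric idea, the sketch below derives required safeguards from the categories of data being exchanged rather than from either party's system-level posture. All category names, rules, and safeguard labels here are hypothetical, invented for illustration; they do not come from any standard or from our framework as published.

```python
# Hypothetical sketch: derive required safeguards from the data being
# exchanged, rather than from either party's system-level controls.
# Categories, rules, and safeguard names are illustrative only.

PRIVACY_RULES = {
    # data category -> safeguards implied by the rules governing that data
    "health_record": {"encrypt_in_transit", "audit_access", "patient_consent"},
    "ssn":           {"encrypt_in_transit", "encrypt_at_rest", "audit_access"},
    "public_stats":  set(),  # no content-based constraints
}

def required_safeguards(data_categories):
    """Union of safeguards required by every category present in the exchange."""
    needed = set()
    for category in data_categories:
        # Unknown categories force a manual review rather than a silent pass.
        needed |= PRIVACY_RULES.get(category, {"manual_review"})
    return needed

def exchange_permitted(data_categories, partner_safeguards):
    """Trust decision: does the partner cover everything the data requires?"""
    missing = required_safeguards(data_categories) - set(partner_safeguards)
    return (not missing), missing
```

Because the requirements derive from the data (which is objective) rather than from each party's risk tolerance (which is not), both sides of an exchange can evaluate the same transaction and reach the same answer.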
There seems to be an inordinate amount of attention on FISMA in the ongoing debate about how to establish a sufficient trust framework among public and private sector participants in health information exchange. Federal government security executives seem especially focused on the idea, still under development, that there needs to be a way to apply the security and privacy requirements government agencies are held to under FISMA to non-government entities when those entities are part of an information exchange with the federal government. Leaving aside for the moment the suggestion that there may be a more suitable foundation (such as health information privacy regulations) on which to base minimally acceptable security and privacy requirements, there are at least three major problems with using FISMA as the basis of trust among information exchange participants.
The biggest issue is that while many of the security and privacy standards used and guidance followed by federal agencies under FISMA are common references, the provision of “adequate” security and privacy protections is entirely subjective, and as such differs from agency to agency. While all agencies use the security control framework contained in NIST Special Publication 800-53 to identify the sorts of measures they put in place, there are very few requirements about how these controls are actually implemented. Recent annual FISMA reports (including the most recently released report to Congress for fiscal year 2008) highlight the increase in the number and proportion of systems that receive authorization to operate based on formal certification and accreditation. The decision to accredit a system means that the accrediting authority (usually a senior security officer for the agency operating the system) agrees to accept the risk related to putting the system into production. Almost all federal agencies are self-accrediting, and each has its own risk tolerance in terms of what risks it finds acceptable and what it does not. Two agencies might not render the same accreditation decision on the same system implemented in their own environments, even using the same security controls. This lack of consistency regarding what is “secure” or “secure enough” presents an enormous barrier to agreeing on an appropriate minimum set of security provisions that could be used as the basis of trust among health information exchange participants, both within and outside the government.
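The divergence described above can be reduced to a very simple illustration: the same assessed system, with the same controls, can receive different authorization decisions purely because the accrediting authorities tolerate different levels of residual risk. The numbers below are invented for illustration; no agency quantifies risk this way.

```python
# Hypothetical illustration: two accrediting authorities apply the same
# control assessment but different risk tolerances, so an identical system
# receives different authorization decisions. All numbers are invented.

def accredit(residual_risk, risk_tolerance):
    """Authorize to operate only if residual risk falls within tolerance."""
    return residual_risk <= risk_tolerance

# Identical system, identical controls, identical assessment result:
same_system_residual_risk = 0.4

agency_a_tolerance = 0.5  # more risk-tolerant accrediting authority
agency_b_tolerance = 0.3  # more conservative accrediting authority

print(accredit(same_system_residual_risk, agency_a_tolerance))  # True
print(accredit(same_system_residual_risk, agency_b_tolerance))  # False
```

The asymmetry is the point: nothing in the control framework itself determines the outcome, so "accredited under FISMA" cannot serve as a uniform trust signal.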
Perhaps just as troubling, by focusing on FISMA requirements, the government is implicitly de-emphasizing the protection of privacy. To be sure, FISMA addresses privacy, most obviously in the requirement that all accredited systems be analyzed to identify the extent to which they store and make available personally identifiable information. These privacy impact assessments typically result in public notice being given detailing the data stored in and used by any system that handles personally identifiable information. But FISMA does not specify any actions for protecting privacy, nor does its accompanying NIST guidance include any controls to address privacy requirements stemming from the wide variety of legislation and regulatory guidance related to privacy.
It’s not entirely clear what it would mean for a non-government organization to try to comply with FISMA requirements. As noted above, most federal agencies are self-accrediting, so presumably the determination of whether a non-government system is adequately secured against risk would rest with the organization itself. The basis for this determination (including the private-sector organization’s risk tolerance) might be more or less robust than corresponding decisions made by federal agencies, so simply requiring non-government organizations to follow a formal certification and accreditation process cannot establish a minimum security baseline any more than it does within the government. Few outside of government follow NIST 800-53, but many follow the similarly rigorous ISO/IEC 27000 security framework, so these organizations arguably would not need to adopt 800-53 if they already comply with an acceptable security management standard. (NIST has been working on an alignment matrix between 800-53 and ISO 27002, partly as a reflection of the similarity between the two standards and also in an effort to better harmonize public and private sector approaches.)
Even if some agreement can be reached wherein non-governmental entities agree to comply with FISMA security requirements, the law as enacted contains no civil or criminal penalties for failure to comply. Federal agencies judged to be doing a poor job with their information security programs receive poor grades on their annual FISMA report cards (fully half the reporting agencies received a grade of C or below for fiscal year 2007), but there is no correlation between budget allocations and good or bad grades, and no negative impact to poorly performing agencies other than bad publicity.
A better alternative (and one more consistent with master trust agreements like the NHIN Data Use and Reciprocal Support Agreement) would use privacy controls as a basis for establishing trust. One challenge in this regard is the number of different privacy regulations that come into play, making the HIPAA Privacy Rule alone (or any other single piece of privacy legislation) insufficient. Building a comprehensive set of privacy requirements and corresponding controls to be used as the foundation for trust in health information exchange is a topic we’ll continue to address here.