Initial observations on Revision 3 of SP800-53

NIST last week released the final version of Revision 3 of its Special Publication 800-53, “Recommended Security Controls for Federal Information Systems and Organizations.” This update is interesting for reasons that go well beyond a simple tally of controls added to the latest version of the 800-53 framework, which serves as the basis for determining which controls should be used in federal computing environments and for evaluating those controls as part of the process of certifying and accrediting information systems. The new release is among the first final products of a NIST-managed effort to incorporate the perspectives of civilian, defense, and intelligence agency security programs into a single control framework applicable to all federal agencies. (A separate but related effort will standardize the certification and accreditation process by reconciling aspects of the DIACAP process favored by the Department of Defense with the civilian agency guidance in Special Publication 800-37.) The influence of DOD and intelligence community contributions is evident in the new 800-53, especially in areas such as the System and Communications Protection security control family, where nearly a third of the controls new to the framework appear.

From an overall perspective, the most notable change may be the expansion of 800-53 to begin addressing agency security programs in addition to information systems. The obvious evidence is the addition of an 18th security control family, “Program Management,” but the change in scope is apparent even in the title of the document, which in its two previous versions did not include the words “and Organizations.” This change is consistent with a recognition, in Congress as well as among agencies, that the existing emphasis on information system security fails to address the overall effectiveness of the agency information security programs mandated by FISMA. While the vast majority of current federal security standards and guidance still focuses on information and information systems (with little or no attention to business processes or programs), subtle changes in language and tone in 800-53 and other recent draft documents such as Special Publication 800-39, “Managing Risk from Information Systems: An Organizational Perspective,” suggest that NIST is evolving away from this narrow system focus to raise the visibility of enterprise risk management and information assurance functions. Another significant new feature is the addition of a prioritization rating (1 to 3) that indicates which controls to address first. These ratings don’t obviate the need to implement all required controls corresponding to the security categorization of an information system, but they seem to be a practical recognition that you can’t address everything at once.
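
To make the prioritization idea a bit more concrete, here is a minimal sketch (in Python) of how an organization might represent controls with a priority code and baseline allocation and then sequence what has to be addressed first at a given impact level. The control entries, priorities, and baseline assignments are illustrative placeholders, not the authoritative Revision 3 catalog.

```python
# Minimal sketch: sequencing required controls by the new Rev 3 priority codes.
# Control data below is illustrative, not the authoritative 800-53 catalog.
from dataclasses import dataclass, field

@dataclass
class Control:
    identifier: str            # e.g. "AC-2"
    family: str                # e.g. "Access Control"
    priority: int              # 1 = address first, 3 = address last
    baselines: set = field(default_factory=set)   # impact levels that require the control

CATALOG = [
    Control("AC-2", "Access Control", 1, {"LOW", "MODERATE", "HIGH"}),
    Control("SC-28", "System and Communications Protection", 1, {"MODERATE", "HIGH"}),
    Control("AU-13", "Audit and Accountability", 3),   # illustrative: not in any baseline
]

def sequenced_controls(impact_level: str) -> list:
    """Controls required at the given impact level, ordered by priority code."""
    required = [c for c in CATALOG if impact_level in c.baselines]
    return sorted(required, key=lambda c: c.priority)

print([c.identifier for c in sequenced_controls("MODERATE")])   # -> ['AC-2', 'SC-28']
```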

In the aggregate, the number of security control families went from 17 to 18, with 34 new controls added and seven withdrawn, bringing the total number of controls across all families to 198. Something else to keep in mind is that the deltas don’t tell the whole story: some controls were renamed or had their emphasis changed, so organizations will need to revisit these to make sure existing controls are consistent with the new intent in 800-53. While a detailed review of all the changes in the framework is well beyond the scope of this forum, a few noteworthy aspects are summarized here.

  • In the Access Control family, the new AC-21 (User-based collaboration and information sharing) for the first time addresses the distinct authorizations needed for information sharing partners, as opposed to internal users. In a similar vein, the Identification and Authentication family now separates identification and authentication into two controls, one (IA-2) for internal (organizational) users and another (IA-8) for external (non-organizational) users.
  • A new audit control AU-13 (Monitoring for information disclosure) extends and augments the Information System Monitoring (SI-4) control, with an explicit emphasis on inappropriate data flows out of the organization, consistent with the industry momentum behind Data Loss Prevention technologies and approaches.
  • In System and Services Acquisition, new controls including SA-13 (Trustworthiness) and SA-14 (Critical Information System Components) formally introduce the idea of level of trust as an organizationally defined parameter, and recognize that at high security levels, organizations may not find sufficiently trustworthy products or components (even if functionality requirements are met) and will need to come up with internally developed alternatives.
  • The single biggest update at the family level is to System and Communications Protection, which now comprises 34 controls (to be fair, most of the new additions are not required within the existing security profiles), some of which suggest an acknowledgement from NIST that newer and more pervasive types of threats have to be addressed. One such control is SC-28 (Protection of information at rest), highlighting an area of security that the government has increasingly stressed given the recurrence of breaches of sensitive data; a minimal sketch of the encryption-at-rest idea appears after this list.
  • Some other controls in System and Communications Protection reflect very explicit thinking about ways to enhance the security posture of agencies and their systems, even if that thinking conflicts with other current practices. For example, SC-29 (Heterogeneity) encourages diversity of system components to enhance security, a recommendation often in conflict with emphases on using technology standards and reducing the overall number of technologies used or supported in the organization. In SC-30 (Virtualization techniques), NIST postulates that virtualizing system components can be a security-enhancing approach, disguising systems through random instantiation of virtualized components. This is an interesting contrast to some of the current (negative) hype about securing applications and data in outsourced infrastructure, platform, and application delivery models such as cloud computing.
  • Still other new additions to System and Communications Protection demonstrate very clearly the incorporation of DOD-favored information assurance practices. Examples of these defense-influenced controls include SC-32 (Information system partitioning), known more familiarly as “physical separation”; SC-33 (Transmission preparation integrity); and SC-34 (Non-modifiable executable programs). While many of these controls are not required under any of the 800-53 annex information system security profiles, it will be interesting to see how civilian agencies accommodate new requirements such as physical separation.
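
As a concrete (and deliberately simplified) illustration of the SC-28 intent flagged above, the sketch below encrypts a record before it is stored so that what actually sits at rest is ciphertext. It uses the third-party Python cryptography package; the record contents are hypothetical, and a real deployment would pull its key from a key management service or HSM rather than generating it inline.

```python
# Illustrative sketch of the SC-28 idea: encrypt records before they are
# written to disk or a database, so the data at rest is ciphertext.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, fetched from a key manager, never generated inline
cipher = Fernet(key)

record = b'{"patient_id": "12345", "diagnosis": "..."}'   # hypothetical record
stored_blob = cipher.encrypt(record)                       # what actually lands at rest

# Later, an authorized process recovers the plaintext for use.
assert cipher.decrypt(stored_blob) == record
```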

Putting 800-53 Revision 3 into effect will by itself have a reasonably significant impact on agencies, particularly as re-authorizations come due for their existing production systems. It is not yet clear whether agencies will feel the need to re-authorize sooner, perhaps based on required annual risk assessments that are bound to turn up controls that were not in 800-53 when the previous certification and accreditation was performed and now need to be addressed.

No point in asking private entities to comply with FISMA

In what has become a consistent theme out of the Office of the National Coordinator for Health IT, the idea is apparently still under consideration to require private-sector organizations to comply with the Federal Information Security Management Act (FISMA) in order to participate in health information exchanges with federal agencies. As the law is currently in force, there are a couple of real problems with this approach, at least if the goal is to make sure the systems in question are protected with at least a consensus minimum set of measures. FISMA does mandate a minimum set of control types, selected via the control framework in Special Publication 800-53 based on a subjective security categorization. The biggest problem is that the determination of how each of the required security controls is implemented is subjective, and there is currently no federal standard for evaluating controls to determine their effectiveness. The accreditation of information systems in federal agencies is also a subjective decision, based not only on what controls are put in place but also on what level of risk the organization is willing to accept. Risk tolerance isn’t consistent among federal agencies (which is why, for example, you find a security policy from the Centers for Medicare and Medicaid Services that forbids its contractors from sending personally identifiable information via the Internet, regardless of the encryption, VPN, or other protective measures that might be applied). It seems safe to assume that risk tolerance would be even more variable among the many types of private entities that might have a role in nationwide health information exchange. Applying FISMA would tell a private enterprise that it needs to decide whether its own implementation of security controls is “secure enough” to be put in production, but it doesn’t require any entity to consider the security needs or preferences of its information exchange partners. Once the private entity says “yes” to that question, under FISMA it’s good to go. Oh, and there isn’t any mechanism for an auditor or other overseer to follow up and see if they agree with the entity’s own assessment.
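
For readers who haven’t been through that categorization step, here is a minimal sketch of the FIPS 199 high water mark convention: the overall category of a system is the highest of the impact levels assigned (subjectively) to confidentiality, integrity, and availability, and that overall category drives which 800-53 baseline applies. The example system and its impact values below are hypothetical.

```python
# Minimal sketch of the FIPS 199 "high water mark" rule used in security categorization.
IMPACT_ORDER = ["LOW", "MODERATE", "HIGH"]

def overall_category(confidentiality: str, integrity: str, availability: str) -> str:
    """Return the high-water-mark impact level across the three security objectives."""
    return max((confidentiality, integrity, availability), key=IMPACT_ORDER.index)

# Hypothetical health information exchange gateway: high confidentiality concern,
# moderate integrity and availability concerns.
print(overall_category("HIGH", "MODERATE", "MODERATE"))   # -> "HIGH" baseline applies
```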

So now we might have a slew of private enterprises, all producing lots of system security documentation as federal agencies do now, self-proclaiming the sufficiency of their security, and moving happily into production with health information exchange. What happens when something goes wrong, like a breach or inappropriate disclosure of information? Under FISMA, the answer is “nothing.” The Office of Management and Budget, in its capacity as the receiver of FISMA report information from federal agencies, assesses agency information security programs as a whole, but does not as a general rule delve into the details of individual information systems. For really high-profile systems, the Government Accountability Office might do its own assessment and produce a report, as is often the case when a member of Congress asks for a review of an important system that supports a key program. No one has yet suggested that OMB, ONC, or any NHIN-specific governance body would be staffed and tasked with the responsibility of evaluating exchange participants’ security, at least not beyond the time of initial “enrollment” in the NHIN. There is no penalty, civil or criminal, for failing to comply with FISMA, or for suffering an incident, even one due to an agency’s failure to properly implement a required security control. It is unclear how asking private entities to operate under these requirements would have any meaningful impact on securing information exchanges, aside from the huge increase in work for these entities to document their existing security mechanisms.

Not everyone agrees what is (and isn’t) personal information

Deliberations among European Union member countries made privacy headlines in early 2008 when Peter Scharr, data protection commissioner for Germany and leader of a group of EU data privacy regulators, speaking at a European Parliament hearing on online data protection, concluded that Internet Protocol (IP) addresses should be considered personal information, insofar as they can often be used to identify an individual based on the individual’s ownership or use of a computer associated with an IP address. This view, not pervasive across the European Community but at least indicative of a way of thinking that emphasizes personal privacy protections, has long been opposed by major IT industry players like Google, but is typically supported by privacy advocates like the Electronic Privacy Information Center. Last month a federal judge concluded just the opposite, ruling that IP addresses are not personal information while dismissing a class action suit against Microsoft in which plaintiffs had argued that Microsoft’s practice of collecting IP addresses during automated updates violated its user agreement, which does not allow the company to collect information that personally identifies users.
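
One practical consequence for organizations caught between these interpretations is the habit of coarsening IP addresses before logging or sharing them. The sketch below, using Python’s standard ipaddress module, zeroes the host portion of an address; the /24 and /48 prefix lengths are a common masking convention, not anything mandated by the rulings discussed above.

```python
# Minimal sketch: coarsen IP addresses before logging, a common hedge when
# IP addresses might be treated as personal information.
import ipaddress

def anonymize_ip(raw: str) -> str:
    """Zero the host portion of an IPv4 (/24) or IPv6 (/48) address."""
    addr = ipaddress.ip_address(raw)
    prefix = 24 if addr.version == 4 else 48
    network = ipaddress.ip_network(f"{raw}/{prefix}", strict=False)
    return str(network.network_address)

print(anonymize_ip("203.0.113.42"))          # -> "203.0.113.0"
print(anonymize_ip("2001:db8::abcd:1234"))   # -> "2001:db8::"
```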

Beyond the still unresolved issue of whether an IP address should be treated as personally identifiable information, and under what specific circumstances, the disagreement in official thinking between countries further highlights the challenges organizations face in complying with privacy laws and regulations across the global economy, and the difficulty regulators face in overseeing and enforcing those laws and regulations.

FISMA still being touted as best security for health information exchange

Coming out of the recent CONNECT User Training Seminar, held this week in Washington, DC, is a reiteration of the opinion previously expressed by federal stakeholders working on the Nationwide Health Information Network (NHIN) that non-federal entities seeking to participate in the NHIN need to step up their security and privacy practices to at least the level of federal practice under FISMA. The suggestion once again is that the security practices of private sector healthcare organizations and other businesses are less rigorous and less effective than those of public sector organizations. The recommendation is that all would-be NHIN participants should adopt a risk-based security management and security control standard such as the framework articulated in NIST Special Publication 800-53, used by all federal agencies.

There’s no question that a baseline set of security standards and practices would go a long way towards establishing the minimum level of trust needed for public and private sector entities to be comfortable with sharing health data. What seems a bit disingenuous, however, is the suggestion, repeated on Tuesday by the CIO of the Centers for Medicare and Medicaid Services, that current government security and privacy practices are the model that should be broadened to apply to the private sector. Any organization currently following ISO/IEC 27000 series standards for risk management and information security controls is already assuming a posture commensurate with a federal agency using 800-53; no less an authority than the FISMA team at NIST has acknowledged the substantial overlap between 800-53 and ISO 27002 controls, and NIST’s more recently released Special Publication 800-39 risk management guidance was influenced by the corresponding risk management elements in ISO 27001, 27002, and 27005 as well.
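
That overlap argument can be made operational with a simple crosswalk check of one catalog against the other, along the lines of the sketch below. The mapping entries are illustrative placeholders rather than the official mapping published with 800-53; the point is the mechanism for surfacing the residual controls an ISO-aligned organization would still need to assess.

```python
# Minimal sketch: a crosswalk check between 800-53 controls and ISO 27002 clauses.
# Mapping entries are illustrative placeholders, not the official 800-53 mapping.
CROSSWALK = {
    "AC-2":  ["ISO-27002-user-registration"],   # illustrative counterpart
    "AU-2":  ["ISO-27002-audit-logging"],       # illustrative counterpart
    "SC-28": [],                                # illustrative: no counterpart recorded
}

def unmapped_controls(crosswalk: dict) -> list:
    """Return the 800-53 controls with no recorded ISO 27002 counterpart."""
    return [ctrl for ctrl, iso in crosswalk.items() if not iso]

print(unmapped_controls(CROSSWALK))   # -> ['SC-28']: the residual gap to assess separately
```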

The hardest piece to reconcile may be the need for organizations to certify the security of their systems and supporting processes. Here again, it’s hard to argue against some form of certification (or even objective validation) of security controls to help establish, monitor, and enforce necessary security measures in all participating organizations. The federal model for certification and accreditation, however, is a self-accrediting form of security governance, so the logical extension of this model would be to have private enterprises similarly self-certify and assert that their security and privacy practices are sufficient. Aside from the trust issues inherent to any subjective system of self-reported compliance, it’s not at all clear what level of oversight would be put in place under the still-emerging NHIN governance framework, or what federal laws have to offer in terms of an approach. While there are explicit legal penalties for violating health privacy and security laws such as HIPAA, the only outcome for a federal agency failing to follow effective security practices under FISMA is a bad grade on an OMB report card. FISMA simply isn’t a best practice for verifying effective security.

GAO adds to the chorus calling for better security metrics

In a GAO report released last week reflecting testimony delivered to the House subcommittee on Technology and Innovation, GAO’s Greg Wilshusen echoed his own previous testimony and a growing number of congressional voices in pointing out that improvements in FISMA scores do not translate into more effective security programs or improved security postures for federal agencies. Wilshusen’s recent testimony focused on the Department of Homeland Security and the National Institute of Standards and Technology (NIST), but his findings are broadly applicable across the government. Not only have many federal agencies failed to fully implement information security programs as required under FISMA, but the security measures reported annually to OMB continue to focus on the implementation of required security controls rather than their effectiveness in achieving enhanced security. GAO joins a group of senators (including Tom Carper of Delaware, Olympia Snowe of Maine, and John Rockefeller of West Virginia) who have introduced legislation intended to strengthen FISMA, both through assignment of responsibilities to the new federal cybersecurity coordinator and through changes in the focus of requirements for security control measurement, testing, and oversight.