Lots of recommendations for new cyber-security czar

Ever since President Obama announced his intention to appoint a federal cyber-security “czar” in the Executive Office of the President, there has been a steady stream of open letters and articles making recommendations for the as-yet-unfilled position, as well as expressing concerns about the obstacles such an individual will face in being effective in the role. Adding to the unsolicited opinions this week is a succinct and thoughtful piece in Federal Computer Week by SANS research director Alan Paller, “The limits of a cyber-czar.” Paller highlights three of the many issues with the current state of information security in the federal government: too many integrators and vendors delivering systems that include security flaws or vulnerabilities; a lack of technically qualified security personnel; and a general failure on the part of government auditors (and agencies themselves) to assess the effectiveness (rather than just the implementation) of existing security controls.

Paller’s point (and mine in highlighting his article) is not just that these three issues represent big security challenges for the government, but also that they are issues the new cyber-security czar may be able to take on and influence. Of course, not all aspects of federal information assurance are well-suited to top-down management or to common solutions, but with the emphasis to date placed on the role of the cyber-security czar in formulating and implementing policy for the federal government, these do seem like fruitful areas for exerting some executive branch influence. For the software security and general quality issue, the czar need look no further than the Department of Defense for standards, contract language, and requirements that demand vendors and integrators demonstrate that their products and developed systems meet specific security configuration criteria, such as those contained in the DISA Security Technical Implementation Guides (STIGs). Paller points to sparse implementation and enforcement of the Federal Desktop Core Configuration (FDCC) requirements that officially went into effect in February 2008, and lays the blame at the feet of the agencies, which appear unwilling or uninterested in explicitly requiring their contractors to comply with the existing standards and rules (and holding them to account when they do not). The scope of the FDCC is much narrower than that covered by the STIGs, so any federal-level policy about security standards should look beyond merely requiring FDCC compliance.
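
To make the idea of demonstrating configuration compliance a bit more concrete, here is a minimal sketch of the kind of check that SCAP-based scanners perform at scale: collected system settings are compared against a required baseline and deviations are reported. The setting names and required values below are hypothetical placeholders, not actual FDCC or STIG requirements.

    # Minimal sketch of an automated configuration-compliance check, in the
    # spirit of FDCC/STIG scanning tools. Setting names and required values
    # are hypothetical placeholders, not actual FDCC or STIG requirements.

    BASELINE = {
        # setting name: (human-readable requirement, test on the observed value)
        "password_min_length": (">= 12", lambda v: isinstance(v, int) and v >= 12),
        "account_lockout_threshold": ("between 1 and 3", lambda v: isinstance(v, int) and 1 <= v <= 3),
        "telnet_service_enabled": ("False", lambda v: v is False),
    }


    def check_compliance(observed: dict) -> list:
        """Compare observed settings to the baseline and return any findings."""
        findings = []
        for setting, (requirement, test) in BASELINE.items():
            value = observed.get(setting)
            if value is None:
                findings.append(f"{setting}: not reported (required {requirement})")
            elif not test(value):
                findings.append(f"{setting}: found {value!r}, required {requirement}")
        return findings


    if __name__ == "__main__":
        # Observed values would normally come from the system itself
        # (registry queries, config files, an SCAP scanner export, etc.).
        observed_settings = {
            "password_min_length": 8,
            "account_lockout_threshold": 3,
            "telnet_service_enabled": True,
        }
        for finding in check_compliance(observed_settings):
            print("NON-COMPLIANT:", finding)

The point of requiring this kind of evidence in contracts is that compliance becomes something a vendor can demonstrate (and an agency can verify) mechanically, rather than something asserted in a deliverable document.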

On certifying security workers, Paller calls out the DOD for having the right intentions (to require personnel with security responsibilities to be certified) but then short-sightedly lowering its technical certification standards. It’s a bit of a funny argument coming from Paller, given that the SANS-affiliated GIAC certification is among those (along with the CISSP from (ISC)2 and the CISA and CISM from ISACA) fulfilling the DOD certification requirement. It is true that in rolling out the DOD 8570 certification rules, DOD has included several certifications traditionally aimed at infosec managers (rather than hands-on practitioners), and that many of these are broad-based in nature rather than deeply technical. It is also true that the GIAC certifications affiliated with SANS tend to require much deeper mastery and demonstration of technical skills than DOD-approved certs from other organizations. Again using DOD as a model (even if its experience hasn’t been perfect) for how minimum security qualifications might be both required and demonstrated for federal infosec workers, the cyber-security czar could put in place personnel policies that would (over time) raise the level of competency in the federal security workforce.

The toughest nut to crack for federal agencies may be first assessing and then improving the effectiveness of the security controls implemented within the government. Even with the very positive development of the third revision of NIST Special Publication 800-53, which will standardize the set of controls for all government agency systems (Defense and Intel included), 800-53 is still used primarily as a planning or implementation checklist, not as a basis for evaluating the security controls actually put into production. It remains to be seen whether the cyber-security czar, perhaps under whatever framework emerges for the revised national cybersecurity initiative, can craft a policy that requires penetration testing and specific audit procedures going beyond the documentation-validation exercises so prevalent today, and can work with agency CISOs to see it implemented.
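
As a rough illustration of the difference between documentation review and testing a control in operation, the sketch below probes whether services an agency claims to have disabled are actually unreachable. The host name and the list of prohibited ports are hypothetical examples; a real assessment would go much further, and only with explicit authorization.

    # Toy example of verifying a control in operation rather than on paper:
    # confirm that services documented as disabled are actually unreachable.
    # Host name and port list are hypothetical examples.

    import socket

    PROHIBITED_PORTS = {21: "ftp", 23: "telnet", 512: "rexec"}


    def port_is_open(host: str, port: int, timeout: float = 2.0) -> bool:
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False


    def check_host(host: str) -> list:
        """Return findings for any prohibited service that is still reachable."""
        findings = []
        for port, service in PROHIBITED_PORTS.items():
            if port_is_open(host, port):
                findings.append(f"{host}:{port} ({service}) is reachable but documented as disabled")
        return findings


    if __name__ == "__main__":
        for finding in check_host("testserver.example.gov"):  # hypothetical host
            print("FINDING:", finding)

A documentation review would accept the system security plan’s statement that these services are disabled; a test like this (scaled up and properly scoped) checks whether that statement is still true in production.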

Old security issues keep coming up

In an otherwise unremarkable Washington Post article about the Department of Defense’s plan to create a “cyber-command” run out of the Pentagon, a couple of points demonstrate the persistence of some information assurance themes concerning both data integrity and the legal and ethical aspects of cyber warfare.

In the article, Post staff writer Ellen Nakashima quotes General Kevin P. Chilton, commander of U.S. Strategic Command, on his concern about maintaining the integrity of mission-critical information: “So I put out an order on my computer that says I want all my forces to go left, and when they receive it, it says, ‘Go right.’ . . . I’d want to defend against that.” This is a simple example of the data integrity problem known as “Byzantine failure,” a topic of great interest to us and one that underlies some of our ongoing research into integrity assertions.
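
As a toy illustration of the simpler end of that integrity problem, the sketch below shows how a recipient could detect that an order was altered in transit by verifying an HMAC over the message. It assumes the sender and receiver already share a secret key, and it addresses only tampering detection, not the full Byzantine agreement problem.

    # Toy illustration of detecting message tampering in transit with an HMAC.
    # Assumes sender and receiver share a secret key; covers only the
    # "go left" vs. "go right" tampering scenario, not Byzantine agreement.

    import hashlib
    import hmac

    SHARED_KEY = b"pre-shared-key-for-illustration-only"


    def sign(message: bytes) -> bytes:
        """Compute an HMAC-SHA256 tag over the message."""
        return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()


    def verify(message: bytes, tag: bytes) -> bool:
        """Return True only if the tag matches the message (constant-time compare)."""
        return hmac.compare_digest(sign(message), tag)


    if __name__ == "__main__":
        order = b"all forces go left"
        tag = sign(order)

        # An adversary alters the order somewhere between sender and receiver.
        tampered = b"all forces go right"

        print(verify(order, tag))     # True  - message arrived intact
        print(verify(tampered, tag))  # False - tampering is detected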

The article also mentions a recent report from the National Research Council that called for a national policy on cyber attack to address, among other things, the legal and otherwise defensible bases upon which a military response to a cyber attack would be justified. As Nakashima puts it, “If a foreign country flew a reconnaissance plane over the United States and took pictures, for instance, the United States would reserve the right to shoot it down in U.S. airspace, experts said. But if that same country sent malicious code into a military network, what should the response be?” The general legal line of thinking follows the Computer Security Act and the PATRIOT Act in essentially giving the U.S. the right to defend itself from attack, even if that means responding in kind against an online adversary. The ethical implications of such a presumed stance are not at all clear, especially given the frequent use of secondary servers and compromised hosts to launch attacks. If an intrusion or attack is detected and traced to a source at a university, a hospital, or a government data center, disabling the apparent attacker, even when technically feasible, may not always be the right thing to do.

NIST finalizing standard government-wide security controls

After more than two years of collaboration among civilian, defense, and intelligence agencies, the National Institute of Standards and Technology’s Information Technology Laboratory has released the final public draft of revision 3 of its Special Publication 800-53, “Recommended Security Controls for Federal Information Systems and Organizations.” The most notable aspect of this release is not the set of controls it contains, as the control structure has been largely the same since 800-53 was originally published in 2005, with minor modifications in 2006 and again in 2007. What is remarkable this time around is the consensus achieved by Ron Ross and his team at NIST in getting DOD, the intel community, and civilian agencies to agree, for the first time, to a single set of government-wide standards. Dr. Ross leads the Joint Task Force Transformation Initiative Interagency Working Group, whose primary focus is harmonizing security practices, standards, and guidelines across the government, even for national security systems not falling under the scope of the Federal Information Security Management Act. Similar consolidation is forthcoming in the area of Certification and Accreditation under Special Publication 800-37, “Guide for Security Authorization of Federal Information Systems: A Security Lifecycle Approach,” still in draft, which, when finalized and adopted, will unify the DIACAP approach (itself a revision of DITSCAP) long used in the Department of Defense with the civilian agency C&A process that became required for use in 2004. This trend toward agreement on standards and practices throughout the government seems to be a positive indicator for those (like President Obama) who advocate more central oversight and administration of federal information security.

Making sense of information privacy

With more and more initiatives focused on information sharing, data exchange, aggregation, and analysis, there is also increased attention to establishing and protecting privacy, particularly of personal information. As noted in yesterday’s post, a federal panel focusing on such issues has recommended substantial updates to the primary federal laws governing privacy protections, including the Privacy Act of 1974 and the E-Government Act of 2002. Federal legislation related to privacy is a large and complex area, with numerous narrowly defined requirements based on specific types of data (medical records, educational records, social security numbers, etc.) or on the nature of the individuals whose data is being stored (veterans, mental health patients, children, U.S. citizens vs. non-permanent residents, etc.). Efforts at establishing over-arching legal or policy frameworks on privacy have been frustrated by a number of factors, including technological advances beyond what was envisioned at the time existing laws were written. As noted yesterday by the New York Times, there are two perspectives on information privacy that are fundamentally at odds: one favors a “data minimization” approach centered on the idea that the best way to prevent the disclosure of sensitive data is not to store it in the first place; the other places great value on storing as much information as possible, as long as individuals are given control over who can see the information and under what circumstances. The latter perspective is characteristic of social networking, although the extent to which individual users of sites like Facebook and MySpace are really in control of their data is subject to some debate.
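
A minimal sketch of what the data minimization approach can look like in practice: only the fields needed for a stated purpose are ever stored, so there is nothing sensitive to disclose later. The field names and purposes below are hypothetical, and the alternative model would instead store everything and rely on access-control rules.

    # Toy sketch of "data minimization": keep only the fields needed for a
    # stated purpose before a record is ever stored. Field names and the
    # purpose map are hypothetical.

    FIELDS_NEEDED_FOR = {
        "appointment_scheduling": {"patient_id", "name", "phone"},
        "billing": {"patient_id", "insurance_id", "amount_due"},
    }


    def minimize(record: dict, purpose: str) -> dict:
        """Return a copy of the record containing only fields needed for the purpose."""
        allowed = FIELDS_NEEDED_FOR[purpose]
        return {k: v for k, v in record.items() if k in allowed}


    if __name__ == "__main__":
        full_record = {
            "patient_id": "12345",
            "name": "Jane Doe",
            "phone": "555-0100",
            "ssn": "000-00-0000",
            "diagnosis": "redacted",
            "insurance_id": "INS-987",
            "amount_due": 120.00,
        }
        # SSN and diagnosis are never stored for this purpose.
        print(minimize(full_record, "appointment_scheduling"))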

Coming to some resolution or balancing point between data interoperability and privacy (and security) provisions is a prerequisite to success for many ambitious initiatives in both the government and private sectors. For instance, wide-scale adoption of health information technology such as electronic medical records or personal health records will not become a reality (no matter the industry incentives offered by the government) unless and until individuals are satisfied that their personal information will be appropriately secured and their privacy maintained. There are technical, legal, and philosophical arguments being advanced by privacy advocates, information sharing proponents, and neutral observers, but the goal of reaching a common understanding of just what “privacy” means, in what contexts, and how that privacy should be protected remains elusive. Into this debate comes Daniel Solove, a law professor at George Washington University who has written quite a bit in recent years on the legal aspects of privacy and information technology. Solove’s most recent effort is Understanding Privacy, which provides a detailed analysis of why previous attempts at defining privacy and setting standards for its protection have failed. Solove also proposes his own method for resolving the issue, following a pragmatic approach that does not seek to establish a single, over-arching conception of privacy, but instead accepts that privacy problems and their solutions differ depending on their contexts. While his writing style is by turns academic and philosophical, his concise treatment of the topic (the text is just under 200 pages) does indeed contribute to developing an understanding of privacy and just why it remains so problematic, whether or not you agree with his take on addressing it.

NIST recommends updates to Privacy Act

Last week the Information Security and Privacy Advisory Board (ISPAB) published a report, “Toward A 21st Century Framework for Federal Government Privacy Policy,” recommending a variety of both broad and targeted actions intended to update the privacy provisions in the Privacy Act of 1974 and the privacy portions (section 208) of the E-Government Act of 2002. The recommendations are largely driven by the perceived gap between legislative privacy requirements developed 35 years ago and the technological and operational environments of the modern information age. Key recommendations in this report include:

  • Amendments to the Privacy Act and E-Government Act in order to:
    • Improve government privacy notices;
    • Update the definition of System of Records to cover relational and distributed systems based on government use, not holding, of records; and
    • Clearly cover commercial data sources under both the Privacy Act and the E-Government Act.
  • Improve government leadership and governance of privacy:
    • OMB should hire a full-time Chief Privacy Officer with resources;
    • Privacy Act guidance from OMB must be regularly updated;
    • Chief Privacy Officers should be hired at all “CFO agencies”; and
    • A Chief Privacy Officers’ Council should be developed.
  • Changes and clarifications in privacy policy:
    • OMB should update the federal government’s cookie policy;
    • OMB should issue privacy guidance on agency use of location information;
    • OMB should work with US-CERT to create interagency information on data loss across the government; and
    • There should be public reporting on use of Social Security Numbers.

For privacy practitioners, there is much to like in this report. Of particular interest (especially in light of new federal data breach disclosure notice requirements that apply to some commercial sector organizations as well as the government) is the re-definition of “system of records” to encompass databases and other systems storing personal information that are used by the government, and not limited to those the government actually holds. On a more technical level, the ISPAB recommends a long-overdue reevaluation of the federal policy on the use of cookies (in general, the use of persistent cookies is banned on federal websites), in part to help the government realize some of the benefits of Web 2.0 technologies.
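
For the curious, here is a rough sketch of how one might spot persistent cookies on a site: fetch a page and flag any Set-Cookie header carrying an Expires or Max-Age attribute (a session cookie carries neither). The URL below is just a placeholder.

    # Rough sketch: flag Set-Cookie headers that would persist beyond the
    # browser session (i.e., those with an Expires or Max-Age attribute).
    # The URL is a placeholder, not a real target.

    import urllib.request


    def persistent_cookies(url: str) -> list:
        """Return Set-Cookie headers that would persist beyond the browser session."""
        with urllib.request.urlopen(url) as response:
            cookies = response.headers.get_all("Set-Cookie") or []
        return [c for c in cookies if "expires=" in c.lower() or "max-age=" in c.lower()]


    if __name__ == "__main__":
        for cookie in persistent_cookies("https://www.example.gov/"):  # placeholder URL
            print("Persistent cookie set:", cookie.split(";", 1)[0])

Whether allowing such cookies on federal sites is worth the privacy trade-off is exactly the policy question the ISPAB recommends OMB revisit.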