Discretionary Declassification and Release of Contemporary National Security Information

The Problem

Classification and declassification decisions are based on risk assessments and time-based standards for withholding.  Agencies determine whether to protect information through classification by evaluating the damage its release would cause to the national security.  Information deemed worthy of classification is marked as such, assigned a declassification date based on its perceived sensitivity, and protected for the duration of its classification.  Too little attention is given to the value of declassifying information ahead of these deadlines or not classifying information in the first instance.

A Role for the Executive Branch

Policymakers should consider the advantages of declassifying information before prescribed deadlines or not classifying certain information.  Discretionary declassification of information less than twenty-five years old should become a hallmark of the classification system, and less information should be classified from the start.

Executive Order 13526 (the Order) encourages discretionary declassification when the benefits of protecting information are outweighed by the public’s interest in its disclosure.  At present, however, this provision is seldom used.  Discretionary declassification provides policymakers a means of aligning classification determinations with contemporary events and superseding the abstract deadlines associated with time-based declassification.  More broadly, declassification of contemporary information, freed from the constraints of arbitrary declassification dates, empowers democratic discourse and enables more transparent and informed decision making.  As President Barack Obama’s May 2010 decision to release the size of the U.S. nuclear stockpile demonstrated, these discretionary determinations have the capacity to strengthen national security, reduce the quantity of material that is classified, and hasten the public’s access to Government information.  The value of discretionary declassification is compounded when decisions are incorporated into agency classification and declassification guidance.

As they revise their classification guidance in accordance with the Fundamental Classification Guidance Review requirement of the Order, agencies should also reduce the duration of classifications.  When information is classified because of the sensitivity of a decision rather than the inherent sensitivity of the supporting information, declassification instructions should specify concrete events (e.g., “conclusion of mission”) in place of arbitrary dates, or provide an event that may supersede the date.  In some cases, agencies should apply a future date of less than twenty-five years, possibly only weeks or months, so that the classification would be “self-extinguishing.”

In recognition of the public interest and the costs of protecting information in the digital age, agencies should better employ the presumption against classification.  Maintaining massive amounts of information as classified limits the choices available to policymakers and threatens national security by restricting information sharing and affording costly protection to increasingly unmanageable amounts of data.  Treating increasing volumes of Government information as classified heightens public cynicism toward the legitimacy of classification and, in some instances, fosters an attitude of indifference amongst authorized users toward their information security responsibilities.  By reducing the quantity of information protected, agencies can restore public trust in the system’s validity and devote resources to protecting only the worthiest items.

A Role for the Legislative Branch

In recent years, Congress has recognized the value of specifically directed, expedited declassification review through legislation.  Statutes such as the President John F. Kennedy Assassination Records Collection Act and the Nazi War Crimes Disclosure Act have mandated accelerated declassification review for classified records associated with topics of great national interest and directed agencies to withhold only the narrowest categories of information when they conduct their reviews.  Contemporary Congressional commissions, such as the National Commission on Terrorist Attacks Upon the United States (the 9/11 Commission), have likewise encouraged the release of their records to the fullest extent possible. [1]

Records released through these efforts have broadened Americans’ historical understanding and, in some cases, compelled agencies to revise their declassification guidance in favor of greater disclosure.  In the future, when controversy surrounding a past Government action or historical event becomes extraordinarily acute, Congress should contemplate appropriating funds for topical review boards to support the National Declassification Center in expediting the declassification review of all reasonably related Federal records.


Agencies and the public will benefit greatly from discretionary decisions to declassify, or to maintain as unclassified, categories of contemporary national security information.  By protecting less information as classified, agencies will reduce the costs associated with storing, transmitting, and declassifying it.  Tailoring release decisions to contemporary circumstances will overcome some of the obstacles and delays associated with declassification and allow more rapid public access to Government information.

Minimizing the quantity of information classified will increase public confidence in the system while bolstering the Government’s ability to protect the information most deserving of classification.  Of course, the intended benefits of a better informed public and increased public confidence in the decisions of its Government are only realized if releases are full disclosures of the matter revealed.  This might only be a number, as in the size of the nuclear stockpile or the aggregate amount of the intelligence budget, but may also be a large volume of records explaining a major effort of the government.  Whatever the case, advantageous release must be intended to fully inform, and not to mislead.

[1] Commissions like the 9/11 Commission worked diligently with classification authorities to produce extensive unclassified reports (by avoiding key words, dates, names, etc.) and have captured related classified information with an eye toward early declassification.


12 thoughts on “Discretionary Declassification and Release of Contemporary National Security Information”

  1. The business case for discretionary declassification is IMHO weak.

    I doubt that the actual cost reduction will be proportional to the volume of records. More likely the relatively small decreases of, say, 10% in the volume of classified records will have no impact on cost at all. The transparency-and-trust argument is OK, but it’s difficult to quantify in dollars.

    At the same time, conditional declassification means a significant additional workload for records managers, and discretionary declassification means extra responsibility for decision-makers. As a result, agencies’ actual expenses could rise.

  2. I am in total agreement with the end state described in this paper, which encourages discretionary declassification above and beyond what may be required. But this discussion lacks a practical policy mechanism for advancing towards the wise goal it presents. Today, far from practicing discretionary declassification, many agencies fail to declassify even when it is required or “automatic”! So something more than sensible argument will be needed in order to move forward.

    I would suggest that PIDB give the strongest possible emphasis to the Fundamental Classification Guidance Review as a voluntary (discretionary) way of reducing the scope of the classification system, not sometime in the indefinite future, but this year and next year. Unfortunately, there is no evidence that the Fundamental Review to date has led to any specific reductions in classification. It’s true that most agencies have not yet completed their Reviews, but in cases where they have done so (e.g., US European Command), they ended up reaffirming existing classification guidance and did not eliminate anything. Therefore, it may be necessary for the President to set performance goals for the Fundamental Review, e.g. a 5% reduction in the number of discrete classification instructions. PIDB should do what it can to encourage a productive outcome from the Fundamental Review.

    If the Fundamental Review fails to produce a reduction in classification, then some other mechanism will be needed to motivate and drive discretionary declassification. Such mechanisms could conceivably include:

    A cap on original classification decisions in each agency.

    A cap on the number of security clearances held by employees in each agency.

    A cap on classification-related expenditures.

    If the Fundamental Review does produce a measurable reduction in classification over the coming year, then its repetition in two-year cycles would provide a foundation for further discretionary reductions on an ongoing basis. But failing that, the choice will be between compulsory limits on classification — or perpetuation of the status quo until the system breaks down.

    1. I am compelled to comment that Mr. Aftergood doesn’t go far enough in making recommendations to solve this problem. (I never thought I’d say that). While his view here is essentially correct, we are not going to solve the problem of over classification until we completely overhaul the classification system. Rather than caps as proposed by Mr. Aftergood, I believe we need to take a more radical approach:

      1) Define damage, serious damage and exceptionally grave damage. We’ve been classifying using these vague descriptions of what should be classified since the Truman order.

      2) Throw out the old guides, standards and approaches. We will not learn how to change the way we’ve done this business since Truman unless we start with a clean sheet of paper and design a completely new classification system for the 21st century.

      3) Analyze historic classified records to identify those aspects that warrant continued protection. This requires funding and time. If we are going to do it right, we need resources to investigate and support to develop standards in a time frame that is aggressive, but possible. Getting this done requires more than a policy saying we’ve got to get it done by 2013, it requires some real leadership.

      1. I respectfully disagree with Mr. Cooper’s comment.

        The problem of overclassification probably cannot be “solved” to everyone’s satisfaction because the process of classification involves subjective judgments that will always be susceptible to dispute and disagreement. Instead, the near-term policy objective should be to optimize the classification process with a robust error correction mechanism so that the unavoidable disputes can be aired and the best possible judgments made. (The ISCAP appeals process is one small example of such an error correction mechanism; we need others.)

        New definitions of the term damage will not advance the process. The practical meaning of damage is articulated not in the executive order but in agency classification guides, where the specific items of information requiring classification are identified. So that’s where the corrective process needs to focus right now. Since the President has already ordered a review of all of those agency classification guides, that process now holds the most promise for near-term change, and so that’s where the immediate emphasis should be placed.

        The current classification system may be beyond repair, and we may indeed need a new classification system for the 21st century. But merely saying so does not bring us any closer. How do we get from here to there? That’s what we’re here to talk about, isn’t it?

        One way to proceed would be to establish a classification policy test bed, in which new approaches to classification and declassification could be explored and developed as pilot projects. Among other things, this effort would need an authorized exemption from the requirements of the executive order, in order to move beyond the status quo and to test policies and procedures that are “forbidden” under existing guidance. To begin with, such policies could include the testing of various modes of discretionary declassification, self-cancelling classification markings, discretionary grants of temporary security clearances, zero review declassification policies, etc.

        But recommendations that require significant new funding or infrastructure should be avoided, since no new funding is likely to be available. To insist that spending more money is necessary is likely to doom the whole enterprise.

  3. Mr. Cooper is “spot on” in that we need policy and guidance that befits this century and forward. His comments in previous blog replies regarding a National Lab have merit. The problem is the subject at hand does not compete with flash and bang initiatives, so it becomes a tough leadership effort to obtain adequate funding.

  4. There are instances when protected information can be declassified at an accelerated rate. Take information that is sensitive because it deals with policy. Tasking that is time sensitive and/or reporting on those tasks can often be declassified sooner than allowed by the automatic declassification date. Information that is protected because it deals with sources can also be declassified at a faster rate when certain situations arise. Examples would include when a country is dissolved, such as East Germany, or when there is a fundamental change in government, then discretionary declassification certainly could play a role as sensitivities from a source and/or policy standpoint may no longer require protection. The death of a party leader or head of state could also fall into that same category.

  5. Harry Cooper is right that much better analysis based on actual experience is needed to identify what truly needs protection. Steve Aftergood’s emphasis on the Fundamental Classification Guidance Review is also right because it builds on the president’s mandate and recognizes that individual agencies have the best expertise to do effective analysis. However, artificial “caps” on original classification decisions, clearances, or expenditures are probably impractical as enforcement measures. A more viable approach might be to identify an agency whose Fundamental Review is done reasonably well and hold it up for praise by the President. Such an example might be found in an agency that has a strong interest in minimizing classification because sharing information with uncleared recipients serves the public interest and advances its governmental mission.

  6. As the paper suggests, it seems to me that we do not have a way to identify relevant documents and declassify them quickly when it is in our national security interest to do so. The premise that only premature release, and not inflexible secrecy, threatens damage to national security is outdated. Anonymous’s comments along these lines are interesting and reflect a grasp of the rapid changes taking place in international politics. How do we take practical steps in this direction without, as others have cautioned, deflecting attention from the concurrent need for solutions to backlogs and overall costs?

  7. I don’t think that Mr. Aftergood and I are that far apart. We have a clear need for a place where we can appropriately analyze what Mr. Aftergood refers to as “over classification.” For the most part, the optic on over classification comes from someone who sees only what is released and not what is protected. Sensitivity of information falls along a range, and I would agree that the line between what ought to be classified and what ought not is drawn too far toward the classified side. We do consistently err on the side of over classification. Generally that can be attributed to a fuzzy definition of “damage” and to the actions of employees who are not trained to understand what damage means and who, in good faith, do not want to under classify and inadvertently expose sensitive information that would legitimately harm the United States.

    But once you leave the fuzzy middle ground between classified and unclassified, the protection of information marked as classified is both legitimate and necessary. Mr. Aftergood has seen the fuzzy area both in terms of material that has been released over government objections and material that has been released after being classified for many years. From that limited point of view, it is easy to say that everything is over classified and that some arbitrary date should be set when all this stuff is simply released. If you are talking only about the information in that middle fuzzy area, I agree. But you must take into account the information that would cause legitimate damage and either get people killed or destroy the advantage the US often has politically and militarily. Sweeping statements about over classification can only do harm if they are not tempered with the understanding that some information really does require protection from release.

    To put this in perspective, classification needs to be done in a surgical way to identify specific information that needs protection for legitimate reasons. That protection is on a scale that may range from a few days to some decades, with a very small amount (maybe 1/10th of 1%) that needs protection virtually forever. The tool we’ve been given is not a scalpel, however; it’s more like a sledgehammer or a fire axe. No matter how we wield this tool, we are destined to do extensive collateral damage. The result is a classification system that appears completely broken, in which countless documents are protected unnecessarily or protection warranted for only weeks or months goes on for decades.

    I agree that we need a classification policy test bed where we can find the right balance using some trial and error. We simply have to gather the right people and work on the real solutions to the problem.

  8. As this White Paper recognizes, both over-classification and the failure to declassify information once the need for secrecy has passed obscure vast amounts of information from public scrutiny, impeding oversight and potentially shielding misconduct. Moreover, over-classification and delayed declassification needlessly create and perpetuate informational silos between government agencies and branches, ultimately compromising public safety. By weakening the system of checks and balances, these practices also raise constitutional concerns.

    The Constitution Project (TCP) is an independent, nonprofit organization that promotes and defends constitutional safeguards by bringing together a wide array of individuals who share a common concern about preserving civil liberties. As part of its work, in July 2009, TCP’s bipartisan Liberty and Security Committee – composed of members of law enforcement, legal academics, former government officials, and advocates from across the political spectrum – released a report entitled Reining in Excessive Secrecy: Recommendations for Reform of the Classification and Controlled Unclassified Information Systems. Since the release of this report, which made eighteen specific recommendations for reforming the classification regime, several important strides forward have been made, most notably the issuance of Executive Order 13526 by President Obama. Much remains to be done, however, and TCP is grateful to the Public Interest Declassification Board (PIDB) for its efforts in this regard. As set forth below in more detail, TCP supports many of the features in the White Paper which are consistent with TCP’s own report, and suggests additional measures to be included in the PIDB’s final proposal.

    In particular, TCP applauds the PIDB’s recommendation that agencies better employ the presumption against classification. TCP’s Reining in Excessive Secrecy specifically recommends “a presumption in favor of lower level classifications, or declassification, such that decisionmakers resolve doubts by applying the lower classification level or no classification.” As this White Paper notes, unwarranted classification of information threatens national security by preventing the timely exchange of information between government agencies and branches, burdens the system with unnecessary costs of protecting huge stores of data, and fosters public distrust in the government.

    TCP also concurs with this White Paper’s recommendation that agencies more robustly employ the provision of the Executive Order that allows for discretionary declassification and include guidance to this effect in their internal policies and procedures. This recommendation reflects our Liberty and Security Committee’s recommendations that the public interest in releasing information should be taken into account whenever classification and declassification determinations are made. Similarly, TCP supports the PIDB’s recommendation that agencies take affirmative steps to reduce the duration of classifications. Both of these recommendations reflect the importance of governmental transparency, particularly when the rationale for secrecy is outweighed by the public interest in the release of the information, or when the need for sequestering the information has passed altogether.

    However, despite these laudable goals set forth by PIDB, TCP shares the concerns voiced by other commentators that this White Paper lacks specific and concrete steps that agencies should take in order to meet those goals. For example, TCP’s report recommends that the timeframes for automatic declassification should be decreased, so that automatic declassification is required after 10 years, except for particularly sensitive information requiring a period of up to 25 years. Specifically, our report recommends that the “lower time limit of this automatic declassification range should be decreased from 10 years to 5 years, and the upper limit should be decreased from 25 years to 20 years.” Therefore, we urge the PIDB to expand upon the White Paper’s statement that “agencies should also reduce the duration of classifications,” and provide specific guidance on such shorter timeframes for automatic declassification when it issues its final recommendations.

    Similarly, although we commend the White Paper’s citation to the provision of the Executive Order on weighing the public interest in discretionary declassification decisions, we urge the PIDB to strengthen this guidance. TCP’s report includes two specific recommendations on weighing the public interest: that agencies should be required to consider the public interest before information is classified and that the government should adopt a balancing test requiring that the public interest be weighed in making declassification determinations. In addition, while we appreciate the White Paper’s suggestion that congressional action could assist in the accelerated review and possible declassification of records pertaining to significant events or controversial government actions, we note that congressional action is not necessary for this acceleration to occur. Rather, the agencies themselves can and should be required to prioritize for review the records related to these events and actions.

    There are also a number of other questions that this White Paper leaves unanswered, where greater guidance could further promote the goals set forth by the PIDB. For example, how should agencies ensure that their employees with classification authority are adequately weighing the public interest in making their classification determinations? What systems and processes could be put in place to facilitate this review? TCP also recommends that each federal agency that classifies information “should periodically conduct a detailed public review of its classification practices” to ensure there are not broad categories of information, or simply too many documents, currently being classified that should not be.

    In sum, TCP supports the aspirations set forth in this White Paper, but more specific guidance is needed if real progress is to be made. We respectfully urge PIDB to issue more detailed, specific, and robust final recommendations.

  9. Given the extraordinary success of the John F. Kennedy Assassination Records Review Board in releasing previously secret documents that (1) greatly illuminated recent history and (2) revealed nothing injurious to American national security, I welcome the proposal encouraging Congress to establish topical review boards that would build on the ARRB’s magnificent record. I think that ultimately congressionally authorized independent review boards would be the most effective and most practical solution to the problem of chronic, needless and damaging government secrecy.
    Ken Hughes
    Presidential Recordings Program
    Miller Center
    University of Virginia

  10. I believe that by far the most important issue raised by Ms Sims’s paper is the discussion below it regarding early release of materials and defining the nature of “damage.” The polar opposite case which “Anonymous” raises sets the stage. The system should certainly be geared to encourage early release (as well as performing to the standard of meeting “on time” release), but the discussion contains a hidden presumption that agencies should continue to “own” information–and that I disagree with. The equity problem is one of the most important obstacles to progress right now, and can be expected to become more onerous yet if not dealt with immediately. I do not believe the legitimate public interest lies in providing an imprimatur to equity in exchange for vague promises of releasing classified documents before their time.
    I have a proposal to re-boot this system which I will put into comments on Commissioner Briick’s paper. For the present let me just note that bulk early–even premature–release is no bad thing if divorced from the agency environment.
    Moreover, the examples we have shed interesting light on the “damage” question: the two most prominent, the leaks of the Pentagon Papers and those from Wikileaks, do not seem to have shattered the Republic. Mr. Cooper is quite correct that, as an a priori matter, our understanding of “damage” is poor.
