The Problem
National security records frequently contain classified information from more than one agency. Under Executive Order 13526, “Classified National Security Information” (the Order), only the agency that creates classified information can declassify it. [1] If an agency produces a record containing its own information as well as information from one or more other agencies, it must not only review its own information for declassification but must also refer the record to each agency that “owns” the additional classified information (called “equities”) in question. A final declassification determination on information in the record is made only after all agencies have rendered decisions on their respective equities.
For the past 16 years, no single policy consideration has generated more difficulties for the conduct of the Automatic Declassification program than the principle of agency ownership. The principle of agency ownership of information has added to delays, costs, and errors in meeting the goals of the Executive order. The National Declassification Center (NDC) and the Joint Referral Center were created in an attempt to address this problem by co-locating agency personnel at one location to accomplish their independent reviews. Both efforts represent an important transitional attempt to bring order and efficiency to the declassification process. Nonetheless, having multiple reviewers and multiple agencies processing the same record severely limits the benefits of this approach by failing to eliminate the inherent redundancies of the current model.
In many instances today, the final declassification determination on a record is needlessly delayed when the other agency’s equity is superficial or minimal. The rote referral to another agency regarding information that might easily have been evaluated in the initial review obstructs the declassification process and prevents the timely release of declassified records to the public.
The increase in the volume of records subject to review in the automatic declassification program requires a reassessment of the current policy of agency ownership. Since 1995, agencies have attempted to execute the program within policy strictures that never envisioned such a massive undertaking. This cumbersome process has resulted in a backlog of approximately 420 million pages. Given the dramatic increase in electronic records that will require review for declassification, continuing to use the same antiquated review process will result in even further gridlock.
Background
A fundamental tenet of the current classification system is that each federal agency retains ownership and control over its information, regardless of the information’s age. The basis for that principle is the belief that the expertise necessary to render proper declassification decisions resides exclusively with the agency that created the information.
Extensive training has been underway since the earliest days of the program to ensure that declassification reviewers from every agency can identify those equities of other agencies in their records that require referral. Absent accurate identification, the referral requirement becomes moot. Yet while the current system recognizes that training can and must enable an agency to consistently identify the equities of another agency, it refuses to accept the idea that similar training could enable that agency to accurately implement the declassification policies of the other. Identifying the embedded information of other agencies is a more demanding task than rendering the declassification decision itself, provided the specific policies and guidance are understood.
Proposals
The Board is currently exploring two proposals as solutions to the issue of agency ownership. In both proposals, a single, properly trained reviewer would conduct declassification review for all equities in a record, followed by a single quality control review. Such a change must be directed by the President, since the current structure has existed in the “Classified National Security Information” Executive orders since 1995. Because agencies are reluctant to allow anyone other than their own reviewers to review their information, any change in the Order should direct that agencies continue to write declassification guides and submit them to the Interagency Security Classification Appeals Panel (ISCAP) for approval. The NDC and/or agencies would then be trained to conduct their comprehensive reviews according to approved declassification guidance. This process would provide a strong incentive for agencies to submit detailed and specific declassification guides.
Option 1: Single Agency Review
One option would be to replace the current approach by vesting the agency that created a record with the authority to review it and declassify it in its entirety, including any information within the record that was incorporated from other agencies. The agency in possession of the record would have the authority to exempt specific information in accordance with ISCAP-approved declassification guides.
Additionally, this approach would include at least three major roles for the NDC:
- Ensuring a robust training program for all agencies on the declassification policies of their counterparts;
- Conducting quality control reviews by sampling multi-equity records to ensure agencies are properly executing their responsibilities; and
- Serving as the focal point for the declassification review of all Presidential library records.
Placing the responsibility for declassifying Presidential library records with the NDC eliminates the inefficiencies of the current system. Presidential records are very likely to contain multiple agencies’ equities, best reviewed for declassification in a comprehensive fashion by a government-wide authority. Placing responsibility for their review in the NDC would help alleviate the processing delays in declassifying these historically significant records. It would also ensure that the agencies staff the NDC with their strongest officers.
Option 2: Single Centralized Review
The NDC, applying ISCAP-approved declassification guidance, would become the single government-wide authority empowered to review all historical classified information for exemption beyond its initial automatic declassification date. [2] Centralizing this authority within the NDC would eliminate the inefficiencies of the existing model, and synthesizing declassification guidance while streamlining the review process would strengthen the consistency of declassification reviews. It would also ensure that historical records are scheduled in accordance with NDC review priorities and reviewed in the context of full archival processing, allowing declassified records to reach the public shelves.
Net Result for the Future
This simplified declassification review environment will amplify the benefits of implementing new technologies, particularly context accumulation capabilities, to make the declassification of records quicker, more consistent, and more effective. For records containing multiple agencies’ information, it will be much easier and more efficient to use technology that identifies and reviews all agencies’ information than to develop multiple programs or systems that operate in isolation. A comprehensive system would ingest reviewer decisions on all types of records and have the ability to identify patterns and inconsistencies in declassification guidance and determinations across agencies. Reviewing the information of all agencies simultaneously will maximize the system’s corpus of contextual knowledge, allowing the reviewer to produce the most precise and consistent determinations in the timeliest manner.
[1] Agencies can waive their ownership. In one of the best success stories, the National Security Staff (NSS) has longstanding waiver agreements with the National Archives, the State Department, the Defense Department, and the Central Intelligence Agency. Apart from a limited number of well-defined and easily identified exceptions, the NSS waives its interest in reviewing its equities in most documents over 25 years old.
[2] In most cases this would transfer declassification authority to a single government-wide authority when records reach 25 years of age. For records initially exempted from automatic declassification under 3.3(h) of E.O. 13526, declassification authority would be transferred in accordance with the timeline outlined in that section.
Both proposals seriously underestimate the complexities involved. Currently, few declassification reviewers have the security clearances and accesses required to receive the guidance they need to evaluate other agency equities for a decision. This includes reviewers at the NDC and JRC, which are not currently approved to physically handle or even contain certain types of classified information. Contrary to the claim here, RECOGNIZING who is responsible for something is far simpler than understanding all of the complexities involved in a declassification decision. Some material cannot be evaluated by a simple yes/no checklist. The JRC process recognizes this reality as it applies to the DoD Intelligence Community.
The potential of the NDC, with its currently limited size and focus on NARA objectives, to meet all of the declassification needs covered in the second option is extremely limited. Nor would centralization eliminate the need for similar capabilities within the various agencies as they wrestle with FOIA, Systematic Declassification Review, and similar concerns, unless those functions, too, shifted to the NDC, creating an immense bureaucracy. The net result would inevitably be tremendous expense for little real gain.
One underlying assumption here is that declassification guidance is static and that declassification decisions are simple. This assumption appears to be endemic in the declassification and records management communities. But it is not the case in all agencies. Decisions, even those involving records at or beyond the 50-year mark and clear guidance, can require close coordination with subject matter experts. This can be a positive process if greater public access is the goal, as it questions and challenges previous decisions. Guidance needs to continually evolve if the President’s intent is to be met. The closer the declassification reviewers are to the means of changing the guidance the more responsive and adaptable that guidance becomes. This argues for agencies themselves to control both the guidance for and process of reviewing their information equities.
The other assumption is that pass-fail review is universally adequate for records declassification, meaning that a single item, or a single agency’s concern, halts declassification of the rest of the information in a given record. This may work for NARA records management purposes, but it actually slows the flow of information to the public if applied at the agency level. Redacted records can disclose considerable information while still protecting sensitive information.
Option 1 holds more promise than option 2’s expansion of the NDC, but it does not contain a clear requirement that agencies respect, and protect, other agencies’ information equities in the records that they create. Even assuming that they did so, a researcher could easily get several versions of the same record, each with different portions redacted, and be left to piece them together. The alternative is pass/fail review that never releases anything, based on the interaction of multiple agency equities and/or the lack of suitable clearance and training to render a clear decision about those equities.
Of course, failing to protect other agency equities invites, to paraphrase EO 13526, grave or exceptionally grave harm to national security. Those without access to highly classified material, unfortunately, simply do not, and necessarily cannot, receive specific explanations of what such harm could be from seemingly innocuous, decades-old records. But it takes little imagination to understand that agency X’s passing mention of an agency Y activity or capability could be disastrous.
Note that training a reviewer in even a single agency’s equities and guidance can take months; such training often resembles a graduate-level course in the history of that agency. A truly flexible and responsive system involves living guidance, and thus requires constant interpretation and revision of guidance as it is applied. The sheer scale of the training, knowledge, and continuing education required to pass judgment on multiple agencies’ records, particularly those within compartmented channels, makes it virtually impossible for anyone outside an agency holding such information to render accurate release decisions on it. The NDC and JRC are making tremendous progress, but it is important to note that they avoid this issue by avoiding entire categories of records. These proposals appear to remove that constraint without addressing it.
If greater public accountability and openness in government are the goals, at some point we have to respect the reality that the amount of information requiring review will inevitably surpass any feasible means of providing that review. Selecting certain categories of documentation for priority handling simply ensures that others will be deferred or ignored, while the selection process becomes increasingly suspect.
A corollary to this observation is the significance of objectively identifying and summarizing information about classified government activities for the earliest possible disclosure. Accountable government history programs, in particular those of the intelligence community under the languishing Intelligence Community Directive 108, offer a means of providing the public with more timely and accurate information about government activities prior to the declassification of many pertinent records. The long-term success of the State Department’s Foreign Relations of the United States series demonstrates the potential role of history programs in increased public disclosure. If the intent is greater public access to originally classified information, full and accountable implementation of ICD 108 can go a long way toward solving the problem of steady expansion in record volume without requiring vast new bureaucracies. If this is merely an exercise in NARA’s warehouse management, NARA will inevitably have to decide between releasing only a small percentage of documents or accepting necessarily unknown risks to national security.
The unasked question about the NDC so far is indicative: how long is the delay between an NDC decision to release a document and that document’s availability to the general public?
I would strongly support option 2. I think a specialized staff with broad training and exposure to multiple agency declassification guides could bring a more informed perspective to the declassification problem. The intensely narrow focus that agency ownership brings seems to me to be detrimental to the declassification process and to a broader understanding of our history and the government’s successful or unsuccessful actions over time. The point of declassification is not simply to cater to prurient interest in our secrets, but to learn from what apparently has worked or has not worked in the past (albeit the far past). That is what the FRUS (Foreign Relations of the U.S.) series, at its best, does, and that is what a single center with a broad perspective could add to the declassification process.
But FRUS items are identified by historians and then sent through the agencies for declassification. They are not the products of a regularized declassification process; rather, they are compendiums assembled by research historians with external oversight. As recent events demonstrated, that oversight does work. This is my point about the need to examine the contribution of history programs and to fully implement and expand ICD 108.
Informed about what? A broadly trained staff would actually be less informed, by definition, and have a less balanced perspective on the individual agencies’ guidance and activities than a specialist in those issues has. The experience coming out of the JRC reflects this. There are reasons that the JRC uses different processes to treat intelligence community records.
Remember that declass analysts do not choose what they declassify, and under 25-year review agencies do not have the luxury of choosing topics or collections. The narrow focus of an agency reviewer is, per the current EO, a specialist’s knowledge of the guidance and contents of an agency’s records. This actually speeds the processing of that agency’s records, versus a generalist who would spend much more time consulting guidance and reference material.
I believe that Mr. Charlston’s analysis is spot on. Some additional points to consider with regard to the posted piece include:
1) ISCAP-approved guidance is not really declassification guidance, but rather a set of conditions upon which agencies can apply exemptions. The guides are called declassification guides, but they really only provide high-level and somewhat cryptic guidance on the kinds of information that can be exempted. For virtually all agencies that have such guides, it’s not possible to simply use the guide as written without a substantive knowledge of the business areas to which the guide pertains.
2) The “guidance” that has been approved by the ISCAP looks like: “information pertaining to collection effort X that would reveal a current classified ability or vulnerability.” Using this guidance requires knowledge of the abilities and vulnerabilities of collection effort X and an understanding of current classified aspects of this effort. So training users not familiar with the agency or program about which the guidance was written really does require in-depth training of the genre that Mr. Charlston described.
3) The piece speculated that such a centralized effort would aid in context aggregation. As discussed elsewhere on this blog, actually achieving context aggregation requires converting paper records to electronic text, plus sophisticated data mining and context analysis aided by trained humans. The paper’s off-hand statement that centralization of the declassification effort will facilitate this is without any factual basis and merely sounds good.
4) The key premise of the paper (that a few people can simply use approved guides, replace hundreds of highly trained reviewers, and produce a process that is easier, cheaper, and more productive) is seriously flawed: it fails to consider that the 400 million pages represent virtually every classified weapon system, diplomatic effort, military strategy and tactic, and all intelligence activities over a 60-year period.
Even if we agree that 90% of the material within the 400 million pages can be declassified, the 10% that cannot be is scattered throughout those same 400 million pages.
5) The paper suggests that the referral process itself is to blame for the backlog of 400 million pages. In point of fact, virtually all of the pages were reviewed, referrals made, notifications delivered to agencies with equity, and a process put in place to work those equities. The “backlog” is the result of issues not addressed in the paper:
a) The National Archives failed to adequately resource or manage the review process for the 400 million pages, which took place over a dozen years, resulting in virtually no detailed records of reviews and no centralized means of knowing what had been done and what had not. The practice from 1996 until 2008 was to allow agencies with equities and cleared reviewers to work records at NARA at their own pace and in their own fashion. No centralized process existed, and collections were reviewed chaotically by scores of agencies with no single integrated plan.
b) Reviews in the first 5-7 years under automatic declassification, done by the agencies that created the records, in many cases showed poor training and preparation; quality analyses completed as recently as 2009 revealed that many of the records were simply not reviewed or referred correctly. Had NARA provided some management and ongoing quality assessment of this process between 1996 and 2009, the problems could have been addressed before they became a crisis.
c) Programs established by NARA to complete “processing” of the 400 million pages of reviewed records for release were scaled to produce no more than a few hundred thousand pages each year. The simple math is that at a rate of 100,000 pages a year, the 400 million pages would take 4,000 years. NARA never appreciated the scale of the problem and was never resourced to handle this volume. The quality problems with the previous 15 years of review, coupled with the inability to actually move this volume of records from classified holdings to the open stacks with any kind of due diligence, were the most important reasons for creating the NDC, where records can be scheduled for review at any point after accession and processed for public access in a manner consistent with available resources.
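To make the scale concrete, here is a minimal back-of-envelope sketch of the throughput arithmetic in point (c); the 100,000 pages-per-year rate is the figure quoted above, while the higher rates are hypothetical comparison points added for illustration:

```python
# Back-of-envelope check of the processing-rate arithmetic cited above.
BACKLOG_PAGES = 400_000_000  # the 400 million page backlog

# 100,000 pages/year is the rate quoted in point (c); the larger rates are
# hypothetical, included only to show what clearing the backlog would demand.
for pages_per_year in (100_000, 1_000_000, 10_000_000):
    years = BACKLOG_PAGES / pages_per_year
    print(f"{pages_per_year:>10,} pages/year -> {years:>6,.0f} years to clear the backlog")
# 100,000 pages/year -> 4,000 years, the figure given in point (c)
```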
I agree with the premise of the paper that there is no alternative to “simplifying” the declassification review process, but I don’t think that merely streamlining the referral process goes nearly far enough. Instead, I think we need to confront the possibility of “accepting necessarily unknown risks to national security,” in Mr. Charlston’s words.
This should not be done blindly or frivolously. An effort should be made to place boundaries on the potential risks, perhaps by limiting the subject matter, originating agency or age of the records that are to be declassified with a greater tolerance for risk. But then the risks should be accepted.
Towards this end, I would suggest establishing an NDC pilot project on “radical” approaches to declassification. This should involve testing different levels and styles of review on a series of collections — including ZERO review, random sampling, and other approaches — and then evaluating the results. Did they in fact yield any potential “disasters”? Did they dramatically increase productivity and reduce costs? Did the resulting data suggest other practical changes that could be made?
To the extent that we can rely on empirical data rather than imagining worst case scenarios, we will be in better shape. But in any case we need a sharp, transformative break from current declassification practice. Or we will be stuck.
The second proposal, for centralized declassification review, in the white paper has considerable merit despite the shortcomings mentioned by both Mr. Charlston and Mr. Cooper. What we have found after many years of experience is that reviewing agencies are more than capable of making serious mistakes, as Mr. Cooper readily points out. However, it must be understood that for the period of 1996 through the end of 2009, NARA did not have authority to address specific questions of how originating agencies were performing their reviews. Indeed, when NARA began a true quality assurance effort in 2007 covering E.O. 12958 reviews, declassification program managers throughout the declassification community protested vehemently while denying their programs were capable of such errors. (Please note that ISOO does have the authority to examine agency declassification programs and has been doing so for a number of years.)
We have to disagree with Mr. Cooper and say that bad referrals do matter when we take into account the huge amount of records involved. Automatic declassification applies only to records that have permanent value (as delineated in an agency’s NARA-approved records schedule), and the overwhelming bulk of these records do not fall into the SCI or SAP categories. The amount of effort it takes to administer just one bad referral (performing records withdrawal, performing data entry, performing record rehousing, moving the records to the Interagency Referral Center, having agency reviewers review the referrals, having the agency reviewers create secondary and tertiary referrals, then finally refiling the declassified records when agency reviewers clear the referrals) demands that only quality referrals be passed through the system. Many are the times that agency reviewers working in the IRC ask the question “Why am I seeing this referral?” The hundreds of thousands of bad referrals in the 400M page backlog are a major contributor to the delays in moving declassified records to the public. The concept that over-referring is better to protect critical National Security Information (NSI) fails because our experience indicates that reviewers will miss easily identified sensitive information, including marked RD/FRD and obvious sensitive letterhead even while over-referring.
Despite the quality issues that have been the focus of much discussion, there is much talent and experience out there in the reviewer community. However, only a reviewer that is free from the culture (but not the knowledge) of the originating agency can make an unbiased decision based upon a centralized collection of ISCAP-approved guides provided by all declassification community members. Only a reviewer that is centrally managed can conduct consistent reviews performed under a single management structure that builds common performance, disciplinary, and evaluation standards (keep in mind that agencies now conduct one, two, or even three levels of review, depending upon the agency). Finally, only a reviewer that is centrally trained and certified can perform quality reviews that minimize drag on an already overloaded system (The NDC is in the process of developing a centralized training program). Such a centralized program can protect National Security Information better than the current stovepiped arrangement if participating agencies provide quality and current guides freely.
Messrs. Carmichael, Daverede & McIlwain, and Ms. Proctor made some good points. For those not familiar with how the National Archives works, however, some clarification is needed.
It is accurate to say that only permanent federal records are subject to automatic declassification. The implication that records in what my colleagues refer to as SCI and SAP categories are not permanent isn’t accurate. They are for the most part permanent, but are usually within file series exempt from automatic declassification. That doesn’t mean that there are not a number of very sensitive SCI and SAP documents interfiled with other non-SCI records. The relative absence of large quantities of these most sensitive records does help simplify the review process, but not that much.
Under 44 U.S.C., records accessioned to the National Archives become the property of the Archivist of the United States. It is not accurate to say that the Archivist lacks the authority to determine how these records are accessed (even by accessioning agencies) or the authority to determine the processes and procedures used for access and review of the records.
The review procedures established by NARA at the inception of the automatic declassification program required the agency that accessioned the records to conduct the initial review. Upon finding the equity of another agency, the reviewer would place a paper band around the document and write on the band in pencil the name of the agency that had equity in the document. That agency was notified and sent someone to review the document. If the reviewer believed the equity could be declassified, the NARA procedure called for that reviewer to simply remove the paper tab. Years later, when NARA began to process the records for release, it found equities that should have been referred, but all records of the referral had been thrown away with the paper tab, and there was no way to know if the equity was “missed” or if the agency was forward-leaning and decided to release the document. Some of the “bad referrals” mentioned by my colleagues are merely the result of an ineffective process for documenting that the review took place.
Bad referrals do matter, and early referrals were very bad. These referrals were done by people trained by the agencies that accessioned the records, but also as part of a chaotic and undocumented process.
A simple reality is that processing even fully reviewed records for public release is a tedious process requiring all exempted documents to be removed, place markers inserted to maintain provenance, and a second look taken to ensure that information requiring protection by statute (such as personal privacy information) is identified. Processing 400M pages in 100 years requires that 4 million pages, or 4,500 archive boxes, be processed each year. Meeting even a 100-year deadline would require resources in excess of what NARA has available. Achieving public access to the 400 million pages by 2013, as directed by the President, will require more than just a faster declassification review process.
These 400M pages contain every military operation, every weapon system, every weapons platform, every diplomatic negotiation, every intelligence operation, and the tactics, strategies, and planning for everything the US Government has done for the past 60 years. The idea that even very smart people with no subject matter expertise can be trained to use guides and then responsibly understand the kinds of information that rise to the level of protection after 25 years is short-sighted. Add to that the fact that the guides themselves do not describe everything that needs to be protected, but rather describe concepts to be used by subject matter experts, and the problem is not that simple.
A centralized process run by capable people who supervise, reward, and discipline highly trained professional reviewers sounds like a good plan. But this plan is predicated on being able to craft guides that actually inform those not familiar with another agency’s business, to find ways to train these smart people to recognize hundreds or thousands of potential kinds of sensitive equities, and to establish an integrated quality control process that will effectively identify errors both in unwarranted exemptions and in inadvertent releases. That is a very tall order, and I am not convinced it can be pulled off in the time frame allotted.
In response to several questions/comments we have received about NDC operations, I recently started a series of posts on the NDC Blog (http://blogs.archives.gov/ndc) that will look at the changes in the declassification process. The first post briefly describes how the current process was developed and includes a high-level diagram. Upcoming posts will look more closely at the parts of the process.
Option two appears to improve efficiency in the system by minimizing or ending multiple reviews, which is critical to saving resources in a system that is already over-burdened. What is also needed is a database that would house all review decisions so that reviews are consistent and documents can be accessed by the public in the most efficient way possible. Reviewers must be trained to recognize other agencies’ equities, and a certification process issued by agencies could serve as a vehicle to ensure high-quality training. There could also be interagency review teams in place that would serve the review community and be detailed to different agencies, as needed. Teaming ensures that reviewers are sharing their knowledge and communicating with fellow reviewers, which is essential to promoting high-quality reviews, mitigating risk and minimizing bad decisions. Although there will most likely be cultural resistance from many in the review community, these training and teaming measures can help build confidence in the system, especially among those whose equities involve intelligence sources and methods. The Presidential Libraries could be ideal candidates for testing this new approach, as they have experience with the RAC program, administered by the CIA.
A database to house all review decisions is impractical. There are over 400M pages in the backlog alone. At an average of 4 pages per document (calculated over time by a large federal agency), that is as many as 100M entries that would have to be made, each with enough data to capture the review decisions.
To have a database that would also facilitate public access, the actual documents would have to be scanned and indexed so they can be located and viewed. Scanning and indexing that many documents would cost at least tens of millions of dollars, and as much as $100M.
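A rough sizing sketch of the two estimates above; the 4-pages-per-document average is the figure quoted, and the per-page costs simply divide the quoted $10M-$100M totals across the backlog:

```python
# Rough sizing of the proposed review-decision database, using the figures above.
BACKLOG_PAGES = 400_000_000
PAGES_PER_DOC = 4  # average quoted above, calculated over time by a large federal agency

entries = BACKLOG_PAGES // PAGES_PER_DOC
print(f"Review-decision entries required: {entries:,}")  # -> 100,000,000

# Implied per-page scanning/indexing cost at the quoted cost bounds.
for total_cost in (10_000_000, 100_000_000):  # dollars
    print(f"${total_cost:>11,} total -> ${total_cost / BACKLOG_PAGES:.3f} per page")
```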
While there are cultural divides that would have to be crossed to centralize review and declassification, there is a much greater practical and financial problem to solve before we could even get to the cultural issues.
The concept of review teams is logical and may be practical. We need a short-term solution to mitigate any remaining risk in the 400M page backlog that must be released by 2013, and then a long-term solution that can be employed in the NDC for the long run. Any realistic solution must take into account the total volume of classified records accessioned each year so the backlog doesn’t reappear in a few years. It is apparent that some kind of joint review is needed, that resources to review the total volume of records are needed, and that a system for tracking and documenting the decisions made is essential, as is a mechanism for public access to the declassified records. This is no easy task. The NDC has a significant challenge, but I am confident that the current Director of the NDC can meet that challenge as long as resources can be found ($$ and human) to do the work.
The Presidential Libraries support simplification of the declassification review process, including equity referral. I would be very interested to see further development of the possibility of vesting complete declassification authority in the agency that created the record. As I understand Option 1, centralization of the declassification review of Presidential Library materials is viewed as a major role for the National Declassification Center (NDC). Since the beginning of the NDC, the review of Presidential Library materials has been an integral part of its operations. Under the auspices of the NDC, the Presidential Library System participates in a centralized declassification process through the Remote Archives Capture (RAC) Project. The RAC Project is a 15-year partnership with CIA and other classifying agencies whereby scanned images of Presidential materials at Library locations are brought to Washington for review by equity-holding agencies. Agency reviewers can review Presidential materials from a centralized location in the Washington, DC area, and we are working to have additional review terminals available to reviewers located at the NDC. Additionally, the Presidential Libraries are a significant part of the NDC’s annual prioritization plan. The RAC Project has been the most successful declassification effort to date for the Presidential Libraries: it allowed us to meet EO 12958’s 2006 referral deadline for 25-year-old materials and has returned over a million pages of reviewed material to the Presidential Libraries. Over the years, we have made considerable advancements in the release of declassified Presidential materials through the RAC Project, making them available at unclassified RAC terminals in Presidential Library research rooms. We have unclassified RAC terminals at the Carter and Johnson Libraries and expect to roll out additional terminals later this year.
I agree with arguments for ending agency ownership and turning to centralized review, including an electronic database, adherence to sunshine and other deadlines, and a greater tolerance for risk. What might also be helpful is a universal mission statement for all agencies, akin to that of the National Archives and Records Administration, which states that NARA “serves American democracy by safeguarding and preserving the records of our Government, ensuring that the people can discover, use, and learn from this documentary heritage. We ensure continuing access to the essential documentation of the rights of American citizens and the actions of their government. We support democracy, promote civic education, and facilitate historical understanding of our national experience.”
In terms of Commissioner Briick’s paper itself, I support the option 2 proposal for a single central review system. This takes the locus of action away from the agencies, avoids the silly to-ing and fro-ing of equity, and permits economies of scale. To the complaint that this creates a new overblown bureaucracy, the appropriate answer is that we currently have 16? 48? 90? bureaucracies, each of them with tendencies to become overblown. The subject expertise issue I deal with below.
Before proceeding further I also want to react to claims made on behalf of the RAC program. As an originator of declassification requests for many documents subsequently reviewed by RAC, I can report that those RAC examinations actually opened new material in just a small percentage of cases, and that when they did, the result was more commonly an extra word unredacted here or there. In exchange, RAC absorbed a significant proportion of agency manpower that could have been engaged in wholesale review of requested or volunteered new material. My view is that RAC accomplished very little for its great monopolization of available resources.
The overall system is broken RIGHT NOW, and something more than RAC is required to break the logjam. Any situation in which a majority of government agencies can ignore an order to review their procedures, and a majority of those that do comply are able to conclude without adverse consequence that their existing arrangements are fine, cannot be said to be under control. This is in the face of explicit EO language that no information can be graded for “forever” secrecy. An example needs to be set to compel compliance, and something beyond releasing the notorious 1917 documents. A smart president would pick 20 representative document sets and simply open them, while identifying the 20 officials in the system responsible for the most refusals to declassify and lifting their security clearances.
Let me here align myself with Steven Aftergood’s suggestion in his side paper for a “national security history” as a vehicle to encourage or compel material releases (although I differ by advising that the releases should be made under the regular system and not by special procedures, which would, in my view, offer a new opportunity for mischief).
I have said elsewhere that the only way forward is to deal with classes of information, and I agree with the need to transform the problem. The way to do that is at the supply end, and with the system as a whole, so, since we are thinking out of our hats here, let’s actually talk about a 21st Century classification/declassification system.
As others have acknowledged, we have difficulty defining “damage” as a criterion for declassification. Therefore, the official classifying a document must simultaneously file a “damage memorandum” that identifies the SPECIFIC damage to be expected from release, which will serve as guidance for later releasing authorities (and reduce the need for univac knowledge among later reviewers by identifying the special concerns implicated). An originating official by himself will not have final authority; rather, his recommendation will go to a secrecy authority who assigns the actual level of classification to the document and simultaneously attaches the metadata that will accompany that document throughout its life. The metadata will identify the classes of information contained in the document.
I suspect this measure by itself will substantially reduce the number of classified documents generated every year.
Classification levels will be expressed in specific intervals of secrecy: 3 years, 7 years, nothing longer than 10 years. After the initial period, the AGENCY must AFFIRMATIVELY request that secrecy be extended for a class of information. This way the system will accommodate “equities” without the present cumbersome arrangement. Metadata will identify what documents belong in each class and are affected by decisions thereon.
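A minimal sketch of how the per-document metadata and the affirmative-extension rule described here might be represented; every name, field, and class label below is a hypothetical illustration, not a worked-out design:

```python
# Hypothetical sketch of per-document metadata under the proposed system.
from dataclasses import dataclass

ALLOWED_TERMS = (3, 7, 10)  # permitted intervals of secrecy in years; nothing longer than 10

@dataclass
class SecrecyRecord:
    document_id: str
    classified_year: int
    term_years: int                # one of ALLOWED_TERMS, set by the secrecy authority
    info_classes: tuple[str, ...]  # classes of information contained in the document
    damage_memo: str               # the SPECIFIC damage expected from release

    def opens_in(self, extensions: dict[str, int]) -> int:
        """Year the document opens. A class stays closed only if its agency
        AFFIRMATIVELY requested an extension; absent extension, the document
        opens at the end of its initial term."""
        base = self.classified_year + self.term_years
        extended = [year for cls, year in extensions.items() if cls in self.info_classes]
        return max([base] + extended)

# Usage: the agency extended "collection-effort-X" to 2031; "routine-cable" lapsed.
doc = SecrecyRecord("doc-001", 2021, 7, ("collection-effort-X", "routine-cable"),
                    "would reveal a current collection capability")
print(doc.opens_in({"collection-effort-X": 2031}))  # -> 2031
```

The design point the sketch captures is that silence defaults to disclosure: the burden of action falls on the agency that wants continued secrecy, not on the reviewer.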
Agencies may participate in the expert training of analysts of the central authority and help inculcate in them appropriate regard for the secrecy of information.
Absent extension, documents will automatically open at the end of their initial period of retention. They will then be opened at the National Archives and its associated entities, such as the presidential libraries. At the instant a document opens, ALL copies of that document in all repositories also open.
Should an agency fail to request extension of secrecy for a class of data it will have no further recourse. This feature will compel agencies to track their equities with precision rather than inventing them when a specific document comes up for review.
The central declassification authority will conduct a high-level review every 5 years of all the classes of information embodied in secret materials, based on agency requests for extension. The board will have the authority to require agencies to furnish any and all information required to reach a substantive conclusion regarding the appropriate level of secrecy for a class of information. It will have the authority to regrade a class of information or to open that group of records. (We can discuss whether agencies should have right of appeal to the president.)
Agencies will be afforded a slight amount of leeway due to the anomalies between the time when a record is generated, the moment of expiration of its original authority, and the coincidental timing of the next central authority review. But the system will automatically ensure that no document or class of data can remain secret for more than 10 years without review of the appropriateness of its continued classification.
During intervals between high-level reviews the central authority will audit and compare metadata categories and original documents to obtain greater precision of identification. Once this system is in place we can consider the utility of a laboratory to apply AI techniques.
The central authority will also review FOIA and MDR cases and agency advice on items to include in the “national security history” or for early release. In examining specific documents, reviewers will now have the assistance of the “damage memorandum,” which will enable them instantly to evaluate the original agency concerns and determine whether these remain valid. This will also assist in the search for “identifiable and segregable” information to be released while other material is redacted.
The proposal for a “secrecy tax” will usefully augment line budget resources to administer this system and can, in fact, be broadened into a two-tier levy. A percentage levied against agency secrecy spending will reflect agencies’ decisions (or the lack of them) to accumulate larger numbers of secret records. A second tier of secrecy tax will be levied based on the number of original and derivative secret records originated at each agency each year. Obviously this tax will also influence the supply side by deterring the mindless growth in the number of records. It will also impose some discipline on agency permissions for others to make derivative classification decisions. (However, whether derivative classification should be permitted at all, though a thorny question, should be re-examined as part of this reform.)
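A sketch of the two-tier levy as described; the rate values are hypothetical placeholders, since the proposal specifies the structure of the tax but not its amounts:

```python
# Hypothetical illustration of the two-tier "secrecy tax" structure described above.
def secrecy_tax(secrecy_spending: float, records_originated: int,
                spending_rate: float = 0.02, per_record_fee: float = 1.00) -> float:
    """Tier 1: a percentage of agency secrecy spending, reflecting accumulated
    secret holdings. Tier 2: a fee on each original or derivative secret record
    created that year, deterring growth in the number of records.
    Both rates are illustrative placeholders, not proposed values."""
    return spending_rate * secrecy_spending + per_record_fee * records_originated

# e.g., an agency with $50M in annual secrecy spending creating 200,000 secret records:
print(f"${secrecy_tax(50_000_000, 200_000):,.0f}")  # -> $1,200,000
```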
The central authority will report to the Archivist of the United States annually on the overall dimensions of the operating system, and it will render these reports separately to the President of the United States. The authority will include in its report to the president specific recommendations for disciplinary measures against nonresponsive, or overly protective, agencies and individuals. (Agency budgets can be penalized, information arbitrarily taken from under their control, and individuals’ security clearances reviewed.) At five-year intervals the authority will also report on its decisions with respect to classes of information.
This arrangement will clarify the concept of “damage” and relate it directly to declassification in a concrete thematic way. It will separate originating officials from specific secrecy decisions while providing a locus for generation of metadata. It will discipline the growth of secret files. It will deal with the equity issue while simultaneously removing it from agency purview and preventing this issue from retarding overall declassification efforts. The system would be self-regulating and it would impose limits on secrecy growth. It would also move documents more quickly into open records, reducing upward pressure on secrecy costs, and hence save money.
I am a military veteran of 21.5 years who has held a Top Secret clearance and maintained a Secret clearance for most of my career. I have some background in the DOD classification process and would like to offer input to your board.
I believe one of the largest problems with the Federal system is non-standardization. Classification is made individually based upon each agency’s terms and definitions, none of which clearly represent a true “United States” standard, but rather that of a single department of the United States.
1. Define National Security as it pertains to classification; in particular, what is a threat to the United States? E.g., a 16-year-old threatening the President, a telephone call from an angry individual stating he wants to kill someone, a cache of explosives found in a sensitive area, a commander realizing he overstepped his bounds.
All of these are current issues with classification, but really, are any of them a true threat to the United States? Our current system of classification allows all of these instances to be classified.
2. As with every other instance in the Nation, National Security also needs checks and balances. I think the Executive Branch should initiate what it believes are National Security threats, but should require the approval of Congress (or at least a subcommittee) before classification can be completed. This would prevent harsh actions taken in the heat of the moment and maintain the true idea of a threat to the nation. The courts should retain the role of interpreting when the two disagree, and have the ability to overturn classification needlessly applied.
3. Apply stiff penalties to those who misuse classification for personal ends, e.g., to protect against an embarrassment. Remove the protections in place against lawsuits for Government officials in the performance of their duties, and the statute-of-limitations protections, for personnel found to have violated or misused classification. A lot of second thoughts will occur if individuals know they can be held personally accountable for violating the law, even 10 years later. Many leaders will establish protections and barriers for short-term events of a few years, after which they are protected by the system.
4. Finally, require reviews to be completed in a timely manner, and set a sunset date for each classification. If a classification has not been reviewed in time, it expires. There is no need to claim a National Security interest for 50 years when the event only spans a few years at most. The American people deserve to see what is going on, to prevent crime from overtaking the otherwise honest folks. One section of the Government should not be prevented from looking at classified documents from another section; after all, we all work for the same team. If the classification truly is needed, then an oversight authority such as the courts should have no reason not to look at it. One agency should NEVER have the power to make these decisions single-handedly without another agency’s oversight, especially if the other agency is related, e.g., CIA/DOD. I disagree that, for example, OPM could not understand source protection; commonly these classifications are used not only to protect sources but many times to protect unlawful activity as well (methods).
Thanks, and I hope I have helped. The protection of the U.S. is a U.S. concern, not a DOD concern, or a State Department concern, or a CIA concern. The infighting is 90% of the problem. If a classification is only understood by one department, then it may not really be a National Security issue at all, but a departmental issue.