26 April 2006. Mary O. McCarthy was recently fired by the CIA for allegedly talking to reporters and releasing classified information, an allegation she has denied.

See also: Mary McCarthy, "The National Warning System: Striving for an Elusive Goal," Defense Intelligence Journal 3-1 (1994), 5-19. http://cryptome.org/mccarthy-nws.htm

Source: Hardcopy of Defense Intelligence Journal.


Defense Intelligence Journal 7-2 (1998), 17-31

THE MISSION TO WARN: DISASTER LOOMS

Mary O. McCarthy

The views expressed in this article are those of the author and do not reflect the official policy or position of the Central Intelligence Agency, the Intelligence Community, the National Security Council, or the US Government.

Successful warning depends on identifying, then communicating, judgments about a developing threat at a point early enough for policymakers to take action to deter, counter, or manage the threat. Neither the identification nor the communication of the threat, which are two distinct phases of the warning process, can be done in a haphazard way. Each step must be deliberate, carefully constructed, and planned.

This article will first discuss positive actions that intelligence analysts can take to help identify and assess the threat. It will expand on an earlier article that examined some of the impediments to recognizing a developing threat -- mindset, deadlines, and the emphasis on current intelligence.1 It will then discuss the art of communicating warning and will describe how analytic weaknesses can impair or discredit the warning message, sow tensions between the policy and intelligence communities, and create the false impression that policymakers do not want to hear bad news. Finally, it will offer some recommendations for fixing the warning problem.

Intelligence successes are the expected outcome of the collection and analytic processes. By "success" we mean providing timely and clear warning of developments that would have deleterious consequences for US security and policy. Each year, American taxpayers pour about $27 billion into the Intelligence Community with the expectation that it will fulfill its role in protecting the national security by warning of such threats. Yet they do not always get what they pay for.

Policymakers and the American people are right to question the worth of an Intelligence Community that fails to do its job. If the failure is small, it may not get much attention. If it is large -- such as the Iraqi invasion of Kuwait -- or even medium -- such as the 1998 Indian nuclear test -- it will not be overlooked. Such failures will rightly be seen as indicative of systemic weaknesses. Congress and the public will demand to know why the Intelligence Community is not doing the job for which it is paid. If a cluster of failures and inadequacies occurs in close proximity, then policymakers, Congress, and the American public will start to seek radical remedies.

The likelihood of a cataclysmic warning failure is growing. As intelligence agencies "downsized" earlier in this decade, they did so with little regard to shaping their analytic workforces to acquire or maintain the expertise needed to understand future national security threats and, apparently, some present ones as well. Just this year, two separate groups of eminent experts reported on critical intelligence failures and warned of even more severe consequences if steps were not taken quickly to reform the analytic corps. First, there was the Rumsfeld Commission, chaired by Donald H. Rumsfeld, which examined the Intelligence Community's faults, if not outright failures, in assessing the ballistic missile threat to the United States. Then, the Jeremiah Report, completed by Admiral David E. Jeremiah at the request of the Director of Central Intelligence (DCI), George C. Tenet, was issued following the Indian nuclear test of May 1998, which came as a surprise to the US Intelligence Community. Both groups issued press releases, which are available on the World Wide Web.2

Intelligence reforms, they noted, should focus both on the composition of the analytic corps and on the way it practices its craft. Despite these very harsh and public criticisms, change is not apparent. While intelligence managers continue to decry the inability to develop analytic expertise, and to lament the extent to which their analysts must concentrate on thinking and writing current intelligence, they continue to avoid enforcing standards that would bring about the needed changes. Disaster looms!

Is It Really That Serious?

No one expects the Intelligence Community to be able to forecast the future, but policymakers should have a reasonable expectation that they will be warned as a threat becomes greater and as a potential crisis becomes nearer. The modern history of the Intelligence Community, dating from the National Security Act of 1947 -- itself a reaction to the catastrophic intelligence failure at Pearl Harbor -- has seen a number of instances in which the Intelligence Community did not fulfill that reasonable expectation. The Korean War, China's entry into the Korean War, the 1973 Arab-Israeli War, and the Iranian Revolution are the more prominent examples of the failure of our intelligence apparatus to provide warning. There are others: the Tet Offensive, the leftist military coup in Portugal in 1974, the Turkish invasion of Cyprus, the 1974 Indian nuclear test, and the 1998 Indian nuclear test as well.

Following the failure to warn of the Iraqi invasion of Kuwait in 1990, DCI Robert Gates formed a task force to look at the function of warning and to make recommendations on how to fix the system. As similar efforts had in the past, this task force discovered that the root causes of failure seldom lay with individuals, but with the nature of the task and with the system itself. The Gates task force made a number of recommendations, some of which were later laid out in a DCI Decision Memorandum, and which, if implemented, would give the Community a fighting chance at a capability to warn on a consistent basis. It is clear that this memorandum, signed by DCI Gates in 1992, is now largely ignored. The experience of the recent failure to warn of the increasing likelihood of an Indian nuclear test, which was essentially a failure of political analysis, indicates that the old analytic culture still predominates. With some rogue states rapidly acquiring weapons of mass destruction and the means to deliver them, with troubling political change underway in some nations of strategic concern, and with alliances forming between non-state actors -- i.e., terrorists -- and unfriendly governments, it is time to take another look at how the warning mission can be fixed. The situation is very serious.

How to Identify a Threat: Heavy Lifting and Street Smarts

The art and science of assessing threats entails more than sticking one's finger in the air to get the wind direction or asking oneself, "How do I feel about this today?" It requires laborious, methodical, rigorous analytical work; it requires imagination; and it requires a diversity of outlooks. With the exception of a few notable pockets of excellence, the Community would appear to need a boost in all three departments. The fact that the Intelligence Community includes many bright individuals who can rightly claim to be accomplished experts in their respective areas is not the point. Rather, the point is that they are either not sufficiently trained to do intelligence work, as distinct from policy and academic analysis, or that intelligence managers are asking only that they produce words on paper that sound good. Moreover, the studies already cited -- Rumsfeld and Jeremiah -- as well as others that remain classified, have pointed to the dangerous scarcity of specific skills, particularly technical and linguistic.

The urgent task of the Intelligence Community, then, is to fix this dangerous gap. In the meantime, however, intelligence managers can mitigate the risk of another warning failure by enforcing standards and nurturing analytic rigor, imagination, and diversity.

Analytic rigor. "More rigor needs to go into analysts' thinking when major events take place," concluded Admiral Jeremiah after examining the Community's performance on the Indian nuclear issue.3 The Rumsfeld Commission noted that "The Commissioners believe that an expansion of the methodology used by the IC is needed."4 Rigorous analysis helps overcome mindset, keeps analysts who are immersed in a mountain of new information from raising the bar on what they would consider an alarming threat situation, and allows their minds to expand to other possibilities. Keeping chronologies, maintaining databases, and arraying data are not fun or glamorous. These techniques are the heavy lifting of analysis, but this is what analysts are supposed to do. If decisionmakers only needed talking heads, those are readily available elsewhere.
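As a minimal illustration of the chronology-keeping and data-arraying described above -- not drawn from the article, and with all entries invented -- a chronology can be as simple as a dated, sourced event log that is sorted in time and tagged by warning indicator:

```python
from dataclasses import dataclass
from datetime import date

# Illustrative sketch only: a simple event log supporting the "heavy
# lifting" of analysis. Every entry below is invented for this example.

@dataclass
class Event:
    when: date       # date of the reported activity
    report: str      # one-line summary of the raw report
    indicator: str   # the warning indicator the report bears on

chronology = [
    Event(date(1998, 3, 20), "Field hospitals established near frontier", "logistics"),
    Event(date(1998, 3, 2), "Armor observed moving toward border", "force movement"),
    Event(date(1998, 3, 9), "Reserve units recalled from leave", "mobilization"),
]

# Arraying the data: sort by date so trends on each indicator are judged
# against the full record rather than from memory.
for event in sorted(chronology, key=lambda e: e.when):
    print(f"{event.when}  [{event.indicator}]  {event.report}")
```

Even this crude arrangement makes it harder to raise the bar unconsciously: each new report must be placed against everything that preceded it.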

One of the most valuable techniques for determining the scale of a threat is to subject the evidence to an "analysis of competing hypotheses." Each analyst or, better yet, group of analysts should think of a number of hypotheses. For example, one hypothesis might be: "This country intends to attack its neighbor and seek to occupy it." A second hypothesis might be: "This country intends to attack its neighbor but withdraw." A third could be: "This country is posturing."

Then, as each bit of evidence is obtained, the analysts should try to determine whether it is consistent or inconsistent with each of the hypotheses. Eventually one or two hypotheses will claim a preponderance of evidence. An important part of this technique is to conceive of evidence that can eliminate one of the hypotheses and task collectors to find such data. Analysts will also specify which evidence, if found, would be diagnostic of one hypothesis over the others.

By following this methodology, analysts force each other to maintain an open mind about the outcome. Moreover, they are more inclined to think of a number of possibilities. Finally, they will have a means for providing persuasive warning because they can demonstrate to the policymaker, using the arrayed evidence, why they are making a particular judgment.
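The three-hypothesis invasion example above can be made concrete. The sketch below is a hypothetical illustration -- the evidence items and ratings are invented, not taken from the article -- of arraying each bit of evidence against each hypothesis and applying the two rules the text describes: tally which hypotheses the evidence is consistent or inconsistent with, and identify which evidence is diagnostic:

```python
# Hypotheses from the article's example; evidence and ratings are invented.
# "+" = consistent with the hypothesis, "-" = inconsistent, "0" = neutral.

hypotheses = [
    "H1: attack the neighbor and occupy it",
    "H2: attack the neighbor but withdraw",
    "H3: posturing only",
]

#   evidence item                                      H1   H2   H3
matrix = {
    "troops massed at the border":                   ["+", "+", "+"],
    "occupation administration being organized":     ["+", "-", "-"],
    "state media preparing public for a long war":   ["+", "0", "-"],
}

# Rank hypotheses by inconsistent evidence: evidence that contradicts a
# hypothesis weighs more than evidence that merely fits it, so the
# hypothesis with the least inconsistency claims the preponderance.
for i, h in enumerate(hypotheses):
    inconsistent = sum(1 for ratings in matrix.values() if ratings[i] == "-")
    print(f"{h}: {inconsistent} inconsistent item(s)")

# Flag non-diagnostic evidence: an item consistent with every hypothesis
# (like the massed troops) cannot discriminate among them, so collectors
# should be tasked against the items that can.
for item, ratings in matrix.items():
    if all(r == "+" for r in ratings):
        print(f"Non-diagnostic: {item}")
```

Arrayed this way, the matrix doubles as the warning product's supporting exhibit: the analyst can show the policymaker exactly which evidence discriminates among the outcomes and which does not.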

Imagination. A reading of the finished intelligence that underlies most of our intelligence failures reveals a common thread over the decades -- a lack of imagination. It appears that it has been impossible for analysts to imagine that, when faced with a decision, someone else might come to a different conclusion. Rather than thinking as Joe or Jane Analyst, an analyst should imagine being Kim Il-Sung. Would they still assess risk in the same way? Would a leader of a highly militarized, or radicalized, country be likely to make the same decisions as a GS-14 intelligence analyst with a liberal arts degree from a nice college? For a good introduction to the importance of studying leaders, analysts need only start with current literature.5 They should also read biographies of leaders and any books written by the leaders of the countries which they are analyzing. They should examine the choices these leaders made at various times in their lives and compare and contrast them with their own instincts.

Diversity. As long as we continue to fill the analytic ranks with essentially the same kind of people -- i.e., readers vs. actors, risk-avoiders vs. risk-takers, people with book smarts vs. street smarts -- we are going to continue to have a diversity gap. The gap is in style as well as in culture. Again, Admiral Jeremiah, speaking of the failure to warn of the 1998 Indian nuclear test, spotted it immediately: "There is an assumption that the BJP platform would mirror Western political platforms. In other words, a politician is going to say something in his political platform leading up to the elections, but not necessarily follow through once he takes office and is exposed to the immensity of the problem. We had a mind set that said everyone else is going to work like we work."6 Admiral Jeremiah was asked if there is a lack of understanding about other cultures in the Central Intelligence Agency (CIA) and, in particular, about India. His reply: "One of our recommendations is when you have significant events, you seek outside sources which would include those who have spent a good deal of time in India, including Indians, or whatever nationality or culture, could bring more understanding to bear."7

After all the lessons supposedly learned about the dangers of what observers of analytical thinking have called "mind set" and "rational actor" analysis, these two old adversaries have hit us head-on again in the analysis and lack of warning on the Indian nuclear program. Bringing diversity to the analytic effort may not guarantee success, but it needs to be tried.

How to Communicate Warning: Clarity and Persuasion

The old saw that policymakers never want to hear bad news is probably true. Who wants to hear bad news? But would the same policymakers prefer to be surprised? Of course not. Policymakers need good warning, but it must be clear and they must be persuaded. Bad news couched in mushy analysis, backed by scant and mostly ambiguous evidence, and offered with no alternative interpretation is a waste of everyone's time. Policymakers frequently react negatively to such messages not because they do not want to be warned, but because they see this so-called "warning" as a transparent attempt by the Intelligence Community to get on the record that it has notified the policymaker of a potential outcome.

When policymakers begin to sense that this is the case, a destructive cycle begins in which the analyst and the policymaker progressively lose faith in each other. The cycle begins with the presentation of a threat judgment that is not well argued, or is based on flimsy or questionable sources. Policymakers do not accept the message. Intelligence analysts and managers then interpret this lack of acceptance as an unwillingness to hear bad news on that particular topic. The Intelligence Community then begins to believe that the policymaker is so wedded to the policy that he or she is simply rejecting the substance of the message. Intelligence managers, in an effort to offer a product that will not be rejected by their only market, are often inclined to pull their punches and leave the topic alone entirely, rather than forcing the analysts to go back and make their warning more persuasive. Analysts then begin to believe the intelligence is becoming politicized and lose faith in both their managers and the policymakers.

Policymakers need to be warned and want very much to be warned, but they require these warnings to be sound. It would be very convenient if analysts could tell the future. Will the government of Nigeria fall apart and precipitate an inter-ethnic bloodbath? Will the territorial disputes in the Spratlys lead to military conflict? Can political change evolve in Indonesia without fracturing the country? Will India and Pakistan try to avoid another war, or do we have to continue to press them to talk?

However, because policymakers know that analysts, like most mortals, cannot foretell the future, they need, instead, to be persuaded by a clear articulation of rationale and evidence. While analysts are not fortunetellers, they should be able to tell the policymaker whether an outcome is becoming more or less likely, or more or less proximate. Unless analysts can demonstrate why and how the likelihood of a given event is increasing, however, their words are of little use.

Warning must be based on an array of evidence highlighting those developments that are diagnostic of the particular outcome. Consider, for example, a recent one-page Intelligence Community assessment. The names and specific situation have been deleted to protect the classification. Emphasis has been added to demonstrate a point. The piece begins with a lead sentence warning that the country in question is likely to retaliate for a recent action. But, wait. The first sentence of paragraph two says that most analysts do not believe that the country will initiate a military confrontation. Paragraph three notes, however, that a minority of analysts thinks the country is preparing to mount a conventional military operation. Those analysts who do not think there will be a military confrontation reason that it involves risk, including the possibility of incurring international condemnation. Those who think there might be a military operation cite facts on the ground, including the presence of troops at the border, more on the way, and the movement of equipment. The paper finishes, as such papers often do, by saying that "We would probably have little additional warning of attack, especially if it were limited in scope." The policymaker is left to ponder: "Additional warning? Was I just warned about something? The judgments were unclear, and did not persuade me one way or another. This must be the Intelligence Community just trying to make sure they have mentioned every possible outcome again."

Warning Signs for Policymakers: No Rigor Used Here

Looking back over the last four and a half decades of intelligence analysis that produced the prominent failures, it is possible to identify a number of phrases that are indicative of seat-of-the-pants analysis, mindset, and a general lack of rigor. These glib and empty phrases often creep into intelligence writing. They are clear in one respect: they tell the policymaker to stop worrying. Policymakers should learn to recognize these suspect phrases for what they are: poor cover for a lack of analytic rigor.

Here are some of the most dangerous; policymakers, beware!

Recommendations

After numerous studies of warning failure that have reached largely identical conclusions, it is time to stop tinkering around the edges. It is time to make real changes that will crack the analytic culture, provide incentives to do rigorous analysis, and add some fail-safe triggers. It is time to make deep and lasting changes to the system.

The following recommendations suggest ways to overcome recurring weaknesses in analytic performance. In particular, these recommendations are designed to bring about changes that facilitate the warning mission.

Bring in more people with real world experience for temporary tours as senior analysts. The dominant analytic culture has repeatedly shown itself susceptible to lapses at critical times in understanding risk. Analysts themselves are not risk-takers and they tend to believe that world leaders are equally risk averse. Exposing them to those who have a history of professional risk-taking might open minds. In any case, the analytic process would benefit from this diversity.

Put outsiders in analytic leadership and management positions. Other government agencies benefit from the influx of new blood into their ranks with changes in Administrations and ends of tours. By contrast, intelligence agencies -- except for the State Department Bureau of Intelligence and Research (INR) and, to some degree, the military intelligence agencies -- have ingrown bureaucracies that have become isolated and smug. A person with senior corporate strategic planning experience should be sought to manage and lead analytic efforts at the highest levels.

Establish a Chief Cop for Warning. This position would no longer be characterized as a "special adviser" to the DCI or fulfill the role of trying to cajole and convince analysts to warn. This person instead would wield clout, task studies, and impose analytic rigor. Unless this individual is given line authority, the Intelligence Community's warning mission will continue to depend on the personality of the chief warning officer.

Institute a system of accountability. When people get it wrong, take action. If the Chief Cop for Warning fails to act, he or she is out. In the intelligence business, traditionally, no one is accountable, no one is responsible. The CIA manager who led the analysis on India, for example, was recognized with an Agency-wide award less than a month after the colossal failure he led.9

Require all intelligence analysts to be trained in methodologies. All analysts must demonstrate analytic rigor to be promoted and, ultimately, to stay in the agency. Inexplicably, while all analysts in CIA's Directorate of Intelligence were required to undergo a period of what is called "tradecraft" training in mid-decade, this extended training included not one segment on the discipline of warning. The week-long National Warning Course should be revived; it should be required of all Community analysts; and it should focus on methodologies and case studies. Senior intelligence managers should take a hands-on approach to the management of intelligence analysis and should train by example. They should demand rigor, require analysts to defend their hypotheses and their analytic techniques, and not allow the unskilled or the poor performer to work on important accounts.

Finally, require all analysts to study cases of intelligence failure. This should include reading the finished intelligence produced at the time. Analysts should know the details of how these failures occurred; they should study the analysis and the evidence, and know the words that were used. They should read the words: "Exercises are more realistic than usual, but there will be no war." "Iran is not in a revolutionary, or even a pre-revolutionary state." "The BJP has a more moderate constituency now that it must satisfy."10 They should be required to take a retrospective look at these cases and devise a strategy whereby the failure could have been prevented. Only by becoming familiar with the mistakes of the past can analysts avoid repeating them.

__________

Notes

1. Mary McCarthy, "The National Warning System: Striving for an Elusive Goal," Defense Intelligence Journal 3-1 (1994), 5-19. [ http://cryptome.org/mccarthy-nws.htm ]

2. Jeremiah press conference, 2 June 1998: http://www.odci.gov/cia/public_affairs/press_release/1998/jeremiah.html [Revised for current URL]. Key Findings of the Rumsfeld Commission: http://web.archive.org/web/20000823230841/http://www.cdiss.org/98july2.htm [Revised for Wayback URL].

3. Jeremiah press conference.

4. Rumsfeld press conference.

5. Margaret G. Hermann and Joe D. Hagan, "International Decision Making: Leadership Matters," Foreign Policy, Spring 1998, 124-137.

6. Jeremiah press conference.

7. Jeremiah press conference.

8. For a discussion of the importance of timely warning, see John McCreary, "Warning Cycles," The National Warning Course: Selected Readings.

9. USA Today, June 4, 1998, page A13.

10. These quotes come from finished intelligence that appeared, respectively, the same day the October 1973 war began, in 1978 as the Iranian Revolution was well underway, and following the elections in India that brought the BJP to power in 1998.

__________

Author Biography

Mary O. McCarthy currently serves as a Special Assistant to the President and the Senior Director for Intelligence Programs on the National Security Council Staff. From 1994 to 1996, she was National Intelligence Officer for Warning and, from 1991 to 1994, the Deputy NIO for Warning. Prior to her Government service, Mrs. McCarthy served in the academic and private sectors. She has a Ph.D. in African history and an M.A. in Information Science from the University of Minnesota, and B.A. and M.A. degrees in history from Michigan State University.


HTML by Cryptome.