ISBN-13: 9781478351108 / English / Paperback / 2012 / 182 pp.
It is a rare season when the intelligence story in the news concerns intelligence analysis, not secret operations abroad. The United States is having such a season as it debates whether intelligence failed in the run-up to both September 11 and the second Iraq war, and so Rob Johnston's wonderful book is perfectly timed to provide the back-story to those headlines. The CIA's Center for the Study of Intelligence is to be commended for having the good sense to find Johnston and the courage to support his work, even though his conclusions are not what many in the world of intelligence analysis would like to hear. He reaches those conclusions through the careful procedures of an anthropologist, conducting literally hundreds of interviews and observing and participating in dozens of work groups in intelligence analysis, and so they cannot easily be dismissed as mere opinion, still less as the bitter mutterings of those who have lost out in the bureaucratic wars. His findings constitute not just a strong indictment of the way American intelligence performs analysis, but also, and happily, a guide for how to do better.

Johnston finds no baseline standard analytic method. Instead, the most common practice is to conduct limited brainstorming on the basis of previous analysis, thus producing a bias toward confirming earlier views. The validating of data is questionable; for instance, the Directorate of Operations' (DO) "cleaning" of spy reports doesn't permit testing of their validity, reinforcing the tendency to look for data that confirms, not refutes, prevailing hypotheses. The process is risk averse, with considerable managerial conservatism. There is much more emphasis on avoiding error than on imagining surprises. The analytic process is driven by current intelligence, especially the CIA's crown jewel analytic product, the President's Daily Brief (PDB), which might be caricatured as "CNN plus secrets." Johnston doesn't put it quite that way, but the Intelligence Community does more reporting than in-depth analysis. None of the analytic agencies knows much about the analytic techniques of the others. In all, there tends to be much more emphasis on writing and communication skills than on analytic methods. Training is driven more by the druthers of individual analysts than by any strategic view of the agencies and what they need. Most training is on-the-job.

Johnston identifies the needs for analysis of at least three different types of consumers: cops, spies, and soldiers. The needs of those consumers produce at least three distinct types of intelligence: investigative or operational, strategic, and tactical. The research suggests the need for serious study of analytic methods across all three, guided by professional methodologists. Analysts should have many more opportunities to do fieldwork abroad. They should also move much more often across the agency "stovepipes" they now inhabit. These movements would give them a richer sense of how other agencies do analysis. Together, the analytic agencies should aim to create "communities of practice," with mentoring, analytic practice groups, and various kinds of online resources, including forums on methods and problem solving. These communities would be linked to a central repository of lessons learned, based on after-action post-mortems and more formal reviews of strategic intelligence products. These reviews should derive lessons for individuals and for teams and should look at the roots of errors and failures.
Oral and written histories would serve as additional sources of lessons. These communities could also begin to reshape organizations by rethinking organizational designs, developing more formal socialization programs, testing group configurations for effectiveness, and doing the same for management and leadership practices.

Center for the Study of Intelligence, Central Intelligence Agency.