
Intelligence Methodologies

In intelligence, methodology consists of the methods used to make decisions about threats, especially in the discipline of intelligence analysis.

The enormous amount of information collected by intelligence agencies often exceeds their capacity to analyze it all. According to McConnell, the US intelligence community collects more than one billion pieces of information every day. (McConnell 2007) The nature and characteristics of the information gathered, as well as its credibility, also have an impact on intelligence analysis.

The capability parameter is essential to the current understanding of threat. (Vandepeer 2011) Analysts use two approaches to capability assessment: direct measures and proxy measures. A measure allows a direct assessment of a capability; proxy measures are indirect measures used to draw inferences about capability.

For the assessment of a country’s military weapons and armed forces there are, in addition to capability measures, five direct measures of military capability: leadership and C2 (command and control); order of battle; force readiness and mission; force sustainability; and technical sophistication, (Joint Publication 2-01 2012) plus proxy measures (assessments of military-related subjects), including C4 systems (telecommunications and networks); the state’s defence industries; energy/power; geography; demography; and medical capability. A state’s capabilities may only become known once they are actually used against an opponent. (Vandepeer 2011)

The very nature of an intention means that it is not “measurable” in the way capability is. It is estimated or deduced from observable factors, called indicators (observable factors used to deduce or observe current or future intentions). Indicators provide a means of inferring rather than quantifying.

Three indicators figure prominently in assessing state intentions: the state’s military capability; the state’s ideology; and the words, actions, and behaviors of state leaders. Military capability assessments alone are therefore not enough to infer a state’s intentions. A state’s ideology is reflected in its political leadership, the third indicator of intentions.

Intelligence analysts are “essentially information translators, whose role is to review information and provide reliable intelligence in a practical and operational format.” (Cope 2004, 188) The U.K. National Intelligence Model describes four major products resulting from the analysis process: strategic assessments, tactical assessments, target profiles, and problem profiles. (Association of Chief Police Officers, Bedford 2005) The evaluation of information involves assessing its credibility together with the reliability of its sources. (Palmer 1991, 22) A few formal information rating systems are used by analysts around the world. The most common of these is the Admiralty System (also referred to as the NATO System), which is used to express the net value of a piece of information based on the reliability of the source and the validity of the data. (Besombes, Nimier, and Cholvy 2009) The traditional model is a 6 x 6 matrix. Agencies operating within the National Intelligence Model in the UK use an alternative classification system commonly called the 5x5x5 system. (Joseph and Corkill 2011)
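The 6 x 6 Admiralty matrix can be sketched as a simple lookup that combines a source-reliability letter with an information-credibility digit into a single code. The labels below follow common NATO usage, but the exact wording varies between agencies, and the function name `rate` is illustrative.

```python
# A minimal sketch of the Admiralty (NATO) rating system: each piece of
# information receives a two-character code combining source reliability
# (A-F) and information credibility (1-6). Labels follow common usage.

SOURCE_RELIABILITY = {
    "A": "Completely reliable",
    "B": "Usually reliable",
    "C": "Fairly reliable",
    "D": "Not usually reliable",
    "E": "Unreliable",
    "F": "Reliability cannot be judged",
}

INFO_CREDIBILITY = {
    "1": "Confirmed by other sources",
    "2": "Probably true",
    "3": "Possibly true",
    "4": "Doubtful",
    "5": "Improbable",
    "6": "Truth cannot be judged",
}

def rate(source: str, credibility: str) -> str:
    """Combine the two scales into a single Admiralty code such as 'B2'."""
    if source not in SOURCE_RELIABILITY or credibility not in INFO_CREDIBILITY:
        raise ValueError("source must be A-F and credibility 1-6")
    return f"{source}{credibility}"

# A usually reliable source reporting probably true information -> "B2"
print(rate("B", "2"))
```

Note that the two axes are deliberately independent: a highly reliable source can still deliver doubtful information (e.g. "A4"), and vice versa.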

Robert Flood’s prism theory, termed by others methodological pluralism, uses the metaphor of a prism, which decomposes light into its component colors through double refraction, to describe creative thinking and transformation. This type of thinking produces multiple different views of the same thing and a common view of many different things. Its purpose is to challenge assumptions, provoke new ideas, and generate unexpected perspectives. (Flood 1999) (Duvenage 2010, 81)

The concept of prismatic thinking has gained ground in intelligence analysis. Jones states that besides convergent thinking, we also need divergent thinking to ensure effective analysis and problem solving. (M. D. Jones 2009) Divergence helps analysts approach an issue more creatively, while convergence helps bring the analysis to completion. (Duvenage 2010, 82)

Wolfberg proposes a full-spectrum mindset in which the analyst applies both intuitive and structured methods, depending on the specific context, assuming from the outset that there are multiple interrelated problems that need to be solved simultaneously. (Wolfberg 2006) (Duvenage 2010, 83)

Waltz conceived the integrated reasoning process, (Waltz 2003) which combines formal and informal methods of reasoning for analysis-synthesis in the operational environment of intelligence work. The process starts from a set of evidence and a question about it whose answer would explain the evidence. Moving from the set of evidence to detection, explanation, or discovery, the process detects the presence of evidence, explains the processes underlying the evidence, and discovers new patterns in the evidence. The model illustrates four basic ways of using the set of evidence: three fundamental modes of reasoning and a fourth feedback path: deduction (testing against previously known models/hypotheses); retroduction (the analyst’s conjecture of a new conceptual hypothesis prompts a return to the set of evidence); abduction (creating explanatory hypotheses inspired by the set of evidence); and induction (searching for general statements (hypotheses) about the evidence). (Duvenage 2010, 84–85)

Waltz characterizes the analysis-synthesis process as one of decomposing evidence and building a model, helping the analyst to identify missing information and the strengths and weaknesses of the model. The model serves two functions: hypothesis (when the evidence is limited) and explanation (when more evidence matches the hypothesis). The process involves three phases, defined using the term “space”, and the use of structured analytic techniques: data space (data are indexed and sorted), argument space (data are reviewed, correlated, and grouped into a set of hypotheses), and explanatory space (models are composed to serve as explanations). (Duvenage 2010, 86)

The flow of the cognitive process is identified as: searching and filtering, reading and extracting, schematizing, building the case, telling the story, reevaluating, looking for support, looking for evidence, looking for relationships, and looking for information. (Duvenage 2010, 88)

A rigor-based analytical model that can help analysts was developed by Zelik, Patterson, and Woods in 2007. It improves on Heuer and Pherson’s structured self-critique technique and comprises eight rigor indicators: hypothesis exploration, information search, information validation, stance analysis, sensitivity analysis, specialist collaboration, information synthesis, and explanation critique. The model explains cognitive processes, provides the first metric for testing intelligence products, and offers a framework for collaborative learning. (Duvenage 2010, 91–92)

Duvenage further details the sensemaking concept, derived from cognitive and especially organizational theory, (Weick 1995) which is used to investigate and describe how individuals, groups and, specifically, organizations confront uncertainty and adapt to complexity. (Duvenage 2010, 92–93) At the individual level, sensemaking means the ability to perceive, analyze, represent, visualize, and understand the environment and the situation in an appropriate contextual manner. (Cooper and Intelligence 2012) In intelligence analysis this is known as situational awareness or environmental scanning. The relevance of sensemaking to intelligence analysis becomes clear when Weick’s seven properties of sensemaking are applied to Heuer’s psychology of intelligence analysis: social context, grounded in identity construction, retrospective, driven by plausibility rather than accuracy, ongoing, extracting from salient cues, and enacting. (Duvenage 2010, 94–95)

Fishbein and Treverton cite Klein, Stewart, and Claxton, who argue that empirical research has shown that intuitive judgment underlies most organizational decisions and is superior to analysis for problems marked by ambiguity or uncertainty. (Shulsky and Schmitt 2002)

Robert M. Clark proposed a methodology for intelligence analysis built around the target-centric intelligence cycle (Clark 2003) as an alternative to the traditional intelligence cycle. It redefines the intelligence process as an integrated network in which information can flow directly between the different stages of the cycle (in practice, it is no longer a cycle in the traditional sense of the term).

Sherman Kent encouraged argument and dissent among intelligence analysts to obtain a “wide range of outside opinions”, (Davis 1995) promoting “collective responsibility for judgment” by networking the intelligence process with feedback loops between analysts and the various stages of the intelligence cycle.

Conceptual models give analysts powerful descriptive tools to assess current situations and predict future circumstances. (Clark 2003, 37) Once the model is sketched, the analyst populates it through research, information gathering, and synthesis, drawing on a wide range of classified and unclassified sources, depending on the targets.

The collected data must be collated and organized, and the evidence evaluated for relevance and credibility. After analyzing the data, the analyst incorporates the information into the target model, revealing where inconsistencies exist in the conclusions and prompting further research to support or refute a particular conclusion. The target model shows where the gaps are. Discrepancies force the analyst to collect additional information to describe the target better.

Robert M. Clark’s organizational model helps analysts successfully describe the target organization and see the strengths and weaknesses of the target for predictive and reliable analysis. (Clark 2003, 227)

In 2014, General Stanley A. McChrystal described a targeting cycle called “F3EA”, used in the war in Iraq, whose steps are:

  1. Find: A target (person or location) is first identified and located.
  2. Fix: The target is then kept under continuous surveillance while a Positive Identification is established.
  3. Finish: A raiding force is assigned to capture or kill the target.
  4. Exploit: Intelligence material is secured and mined, with detainees interrogated.
  5. Analyze: Information is studied to identify further targeting opportunities. (McChrystal 2014)

Richards Heuer states that no method guarantees the correctness of the conclusions. Analysts need to improve their methods continually, depending on the specific context and their previous personal experience. (Heuer 1999) Also, in the case of a network-cycle approach, it should be borne in mind that these models take much longer than a traditional cycle. (Johnston 2005)

Structured analytic techniques are used to challenge judgments, identify mindsets, overcome biases, stimulate creativity, and manage uncertainty. Examples include the key assumptions check, analysis of competing hypotheses, devil’s advocacy, red team analysis, and alternative futures/scenarios analysis, among others. (US Government 2009) The following methods are ways to validate the analyst’s judgment:

Opportunity analysis: Identifies, for decision-makers, opportunities or vulnerabilities that their organization can exploit.

Linchpin analysis: proceeds from the information that is certain or most likely to be sound, anchoring the argument in its key assumptions. (Davis 1999)

Analysis of competing hypotheses: The analysis of competing hypotheses (ACH) was a step forward in the methodology of intelligence analysis. According to Heuer, more challenges are more important than more information, especially to avoid dismissing the possibility of deception out of hand because the situation appears simple. The steps in the analysis of competing hypotheses are: (Heuer 1999)

  1. Identify the possible hypotheses to be considered. Use a group of analysts with different perspectives to brainstorm the possibilities.
  2. Make a list of significant evidence and arguments for and against each hypothesis.
  3. Prepare a matrix with hypotheses across the top and evidence down the side. Analyze the “diagnosticity” of the evidence and arguments – that is, identify which items are most helpful in judging the relative likelihood of the hypotheses.
  4. Refine the matrix. Reconsider the hypotheses and delete evidence and arguments that have no diagnostic value.
  5. Draw tentative conclusions about the relative likelihood of each hypothesis. Proceed by trying to disprove hypotheses rather than prove them.
  6. Analyze how sensitive your conclusion is to a few critical items of evidence. Consider the consequences for your analysis if that evidence were wrong, misleading, or subject to a different interpretation.
  7. Report the conclusions. Discuss the relative likelihood of all the hypotheses, not just the most likely one.
  8. Identify milestones for future observation that may indicate events are taking a different course than expected.
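
The matrix at the heart of steps 3–5 can be sketched in a few lines. The hypotheses, evidence items, and consistency ratings below are invented for illustration; the key point is that ACH ranks hypotheses by how little evidence contradicts them, since the method seeks refutation rather than confirmation.

```python
# A minimal, illustrative ACH matrix: hypotheses across the top, evidence
# down the side, each cell rated "C" (consistent), "I" (inconsistent),
# or "N" (neutral / not applicable). All data here is hypothetical.

HYPOTHESES = ["H1", "H2"]

# matrix[evidence][hypothesis] -> consistency rating
MATRIX = {
    "E1": {"H1": "C", "H2": "I"},
    "E2": {"H1": "N", "H2": "I"},
    "E3": {"H1": "I", "H2": "C"},
}

def inconsistency_counts(matrix):
    """Count 'I' ratings per hypothesis; fewer inconsistencies = more viable."""
    counts = {h: 0 for h in HYPOTHESES}
    for ratings in matrix.values():
        for h, rating in ratings.items():
            if rating == "I":
                counts[h] += 1
    return counts

counts = inconsistency_counts(MATRIX)
# H1 is contradicted by one item of evidence, H2 by two,
# so H1 ranks as the more viable hypothesis.
print(sorted(counts, key=counts.get))
```

Deleting a row whose ratings are identical across all hypotheses (step 4) changes nothing in this ranking, which is exactly what “no diagnostic value” means.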

Analysis of competing hypotheses is auditable and helps overcome cognitive biases. It allows a return to the evidence and hypotheses, and thus the tracing of the sequence of rules and data that led to the conclusion.

A noted drawback is that realistic ACH exercises can leave analysts disoriented or confused.

Van Gelder proposed hypothesis mapping as an alternative to competing hypothesis analysis. (van Gelder 2012)

The structured analysis of competing hypotheses offers analysts an improvement over the original method’s limits, (Wheaton and Chido 2007) maximizing the number of possible hypotheses and allowing the analyst to break a complex hypothesis into sub-hypotheses.

A method used by Valtorta and colleagues adds Bayesian probabilistic analysis to the competing hypotheses. (Goradia, Huang, and Huhns 2005) A generalization of this concept led to the development of CACHE (Collaborative ACH Environment), (Shrager et al. 2010) which introduced the concept of the Bayesian community. The work of Akram and Wang applies paradigms from graph theory. (Shaikh Muhammad and Jiaxin 2006)
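The core idea of a Bayesian treatment of competing hypotheses can be sketched as follows; this is a generic Bayes-rule update, not the specific model of Valtorta and colleagues, and the priors and likelihoods are invented for illustration.

```python
# A hedged sketch of Bayesian updating over competing hypotheses: each new
# piece of evidence re-weights every hypothesis via Bayes' rule.

def bayes_update(priors, likelihoods):
    """posterior(H) ∝ prior(H) * P(evidence | H), normalized over all H."""
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Two hypotheses, initially equally likely (hypothetical values).
priors = {"H1": 0.5, "H2": 0.5}

# P(observed evidence | hypothesis): the evidence fits H1 far better.
likelihoods = {"H1": 0.8, "H2": 0.2}

posterior = bayes_update(priors, likelihoods)
print(posterior)  # H1 ends up four times as likely as H2
```

Unlike the qualitative C/I/N ratings of classic ACH, this makes the relative weight of each item of evidence explicit, at the cost of having to estimate numerical likelihoods.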

The work of Pope and Jøsang uses subjective logic, a formal mathematical methodology that deals explicitly with uncertainty, (Pope and Jøsang 2005) and forms the basis of the Sheba technology used in intelligence assessment software.

Analogy: Common in technical analysis, but engineering features that look the same do not necessarily mean that two systems operate in the same way.

In the process of intelligence analysis, analysts should follow a series of sequential steps:

  1. Definition of the problem: analysts should try to understand both the mind of the opponent and the thinking of their clients and their allies.
  2. Generating hypotheses: based on questions.
  3. Determining information needs and gathering information: the analyst can request specific collection on the topic or, if this is not possible, flag the information gap in the final product.
  4. Evaluation of sources: the analyst must evaluate the information for reliability, credibility, and possible denial or deception.
  5. Assessment of hypotheses (testing): testing by methods such as analysis of competing hypotheses or link diagrams, paying attention to cognitive and cultural biases inside and outside the organization.
  6. Production and packaging: well-structured written and oral presentations, including electronic messages, printed reports, briefings, or video; three features are essential to the intelligence product: timeliness, scope, and periodicity.
  7. Peer review: Essential for assessing and confirming accuracy.
  8. Feedback and product evaluation: after delivery, the process continues with the interaction between the producer and the customer, through mutual feedback, on the basis of which both analyses and requirements are refined.

Effective intelligence analysis must ultimately be tailored for the end user but without lowering the quality and accuracy of the product. (M. L. Jones and Silberzahn 2013)


  • Association of Chief Police Officers, Bedford. 2005. “Guidance on the National Intelligence Model.” https://whereismydata.files.wordpress.com/2009/01/national-intelligence-model-20051.pdf.
  • Besombes, Jérôme, Vincent Nimier, and Laurence Cholvy. 2009. “Information Evaluation in Fusion Using Information Correlation.” ResearchGate. 2009. https://www.researchgate.net/publication/224577351_Information_evaluation_in_fusion_using_information_correlation.
  • Clark, Robert M. 2003. Intelligence Analysis: A Target-Centric Approach. Washington, D.C: Cq Pr.
  • Cooper, Jeffrey R., and Center for the Study of Intelligence. 2012. Curing Analytic Pathologies: Pathways to Improved Intelligence Analysis. CreateSpace Independent Publishing Platform.
  • Cope, Nina. 2004. “’Intelligence Led Policing or Policing Led Intelligence?’ Integrating Volume Crime Analysis into Policing.” The British Journal of Criminology 44 (2): 188–203. https://doi.org/10.1093/bjc/44.2.188.
  • Davis, Jack. 1995. “A Policymaker’s Perspective On Intelligence Analysis.” https://www.cia.gov/library/center-for-the-study-of-intelligence/kent-csi/vol38no5/pdf/v38i5a02p.pdf.
  • ———. 1999. “Improving Intelligence Analysis at CIA: Dick Heuer’s Contribution to Intelligence Analysis.” 1999. http://www.au.af.mil/au/awc/awcgate/psych-intel/art3.html.
  • Duvenage, Magdalena Adriana. 2010. “Intelligence Analysis in the Knowledge Age : An Analysis of the Challenges Facing the Practice of Intelligence Analysis.” Thesis, Stellenbosch : University of Stellenbosch. https://scholar.sun.ac.za:443/handle/10019.1/3087.
  • Flood, Robert L. 1999. Rethinking The Fifth Discipline: Learning Within the Unknowable. Psychology Press.
  • Gelder, Tim van. 2012. “Exploring New Directions for Intelligence Analysis.” Tim van Gelder (blog). 2012. https://timvangelder.com/2012/12/11/exploring-new-directions-for-intelligence-analysis/.
  • Goradia, Hrishikesh, Jingshan Huang, and Michael N Huhns. 2005. “Extending Heuer’s Analysis of Competing Hypotheses Method to Support Complex Decision Analysis.” ResearchGate. 2005. https://www.researchgate.net/publication/241836758_Extending_Heuer’s_Analysis_of_Competing_Hypotheses_Method_to_Support_Complex_Decision_Analysis.
  • Heuer, Richards J. 1999. Psychology of Intelligence Analysis. Lulu.com.
  • Johnston, Rob. 2005. Analytic Culture in the US Intelligence Community: An Ethnographic Study. University of Michigan Library.
  • Joint Publication 2-01. 2012. “Joint and National Intelligence Support to Military Operations.” https://www.bits.de/NRANEU/others/jp-doctrine/jp2_01%2812%29.pdf.
  • Jones, Milo L., and Philippe Silberzahn. 2013. “Constructing Cassandra: Reframing Intelligence Failure at the CIA, 1947–2001 | Milo Jones and Philippe Silberzahn.” 2013. http://www.sup.org/books/title/?id=22067.
  • Jones, Morgan D. 2009. The Thinker’s Toolkit: 14 Powerful Techniques for Problem Solving. Crown Publishing Group.
  • Joseph, John, and Jeff Corkill. 2011. “Information Evaluation: How One Group of Intelligence Analysts Go about the Task.” Australian Security and Intelligence Conference, January. https://doi.org/10.4225/75/57a02d74ac5c9.
  • McChrystal, General Stanley. 2014. My Share of the Task: A Memoir. Reprint edition. New York, NY: Portfolio.
  • McConnell, Mike. 2007. “Overhauling Intelligence.” 2007. https://www.researchgate.net/publication/293761677_Overhauling_intelligence.
  • Palmer, Bill. 1991. Strategic Intelligence for Law Enforcement. Canberra: Australian Bureau of Criminal Intelligence.
  • Pope, Simon, and Audun Jøsang. 2005. “Analysis of Competing Hypotheses Using Subjective Logic (ACH-SL).” https://apps.dtic.mil/dtic/tr/fulltext/u2/a463908.pdf.
  • Shaikh Muhammad, Akram, and Wang Jiaxin. 2006. “Investigative Data Mining: Connecting the Dots to Disconnect Them.” Intelligence Tools Workshop. http://www.huitfeldt.com/repository/ITW06.pdf.
  • Shrager, Jeff, Dorrit Billman, Gregorio Convertino, J. P. Massar, and Peter Pirolli. 2010. “Soccer Science and the Bayes Community: Exploring the Cognitive Implications of Modern Scientific Communication.” Topics in Cognitive Science 2 (1): 53–72. https://doi.org/10.1111/j.1756-8765.2009.01049.x.
  • Shulsky, Abram N., and Gary James Schmitt. 2002. Silent Warfare: Understanding the World of Intelligence. Potomac Books, Inc.
  • US Government. 2009. “A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis.” https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/Tradecraft%20Primer-apr09.pdf.
  • Vandepeer, Charles. 2011. “Rethinking Threat: Intelligence Analysis, Intentions, Capabilities, and the Challenge of Non-State Actors.” Thesis. https://digital.library.adelaide.edu.au/dspace/handle/2440/70732.
  • Waltz, Edward. 2003. Knowledge Management in the Intelligence Enterprise. Artech House.
  • Weick, Karl E. 1995. Sensemaking in Organizations. SAGE.
  • Wheaton, Kristan J., and Diane E. Chido. 2007. “Structured Analysis of Competing Hypotheses: Improving a Tested Intelligence Methodology.” 2007. https://web.archive.org/web/20070928154654/http://www.mcmanis-monsalve.com/assets/publications/intelligence-methodology-1-07-chido.pdf.
  • Wolfberg, Adrian. 2006. “Full-Spectrum Analysis: A New Way of Thinking for a New World.” Military Review, July-August 2006. http://cgsc.cdmhost.com/cdm/ref/collection/p124201coll1/id/414.

Nicolae Sfetcu
Email: nicolae@sfetcu.com

This article is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International. To view a copy of this license, visit http://creativecommons.org/licenses/by-nd/4.0/.

Sfetcu, Nicolae, “Intelligence Methodologies”, SetThings (March 31, 2019), MultiMedia Publishing (ed.), URL = https://www.telework.ro/en/intelligence-methodologies/
