The findings of intelligence assessments are occasionally controversial. This includes products released only for internal consumption, as well as those rare assessments that are disseminated to the broader public. Ideally, the paper is written apolitically, in dispassionate language, and with enough unequivocal support that even a contentious message is given due consideration. Yet even reports that meet these standards can draw pushback. Here are a few ways to strengthen your argument and gain your audience's trust when presenting controversial findings.
1. Anticipate questions: Sometimes the writer of an assessment is so familiar with a subject that it’s hard to see what might be missing. Since you’re not there to answer questions for your reader, you’ll need to anticipate those questions and fill in the missing information up front. It may be helpful to ask a colleague who is less familiar with the topic to review your work. Here is a closer look at this point using the example of the intelligence assessment Assessing Russian Activities and Intentions in Recent US Elections, released to the public by the IC in 2017.
The analysis leans on the premise that the media outlet RT America put out Russian propaganda that may have influenced its American audience. Some of the questions that might be anticipated with respect to this assertion include: How relevant is RT America in the United States? It may be "the most-watched foreign news channel in the UK," as the assessment cites, but does a UK audience mirror a US audience? Reportedly, the channel has "millions" of followers and subscribers on Twitter and YouTube; however, if a key element of the Russia campaign was fake personas, both real and automated, are these numbers accurate, or have they been manipulated, too? By anticipating and addressing these questions up front, your audience may find it more reasonable that a US audience could be swayed by propaganda delivered by a Russia-controlled channel.
2. Add context: In the same assessment, analysts cited a "significant escalation in directness, level of activity, and scope of effort compared to previous operations." But some experts and scholars on Russia disagree with this finding, so it's important to support it with strong and detailed context. What is the history of Russian meddling? How has it evolved? What are its successes and failures? Did the Kremlin incorporate these lessons learned into later campaigns? To what end? Do other countries meddle in US elections? Why are the Russians more successful or persistent? Have US audiences shown themselves to be particularly receptive to Russian messaging in the past?
3. Compare and contrast: You can increase the impact of your findings if you compare and contrast. Rather than simply stating that this Russian campaign demonstrated a "significant escalation in directness, level of activity, and scope of effort," compare and contrast it with previous efforts. [Note: Analysts did include some examples of Russian tactics used previously under the heading "Influence Effort Was Boldest Yet in the US" (page 5). Still, expanding on this key point and then comparing and contrasting past operations would have offered greater clarity, as well as support that this was demonstrably a "significant escalation" on the part of the Kremlin.]
4. Use a quantitative method: Data processed and expressed through numbers tend to be more objective than analytic methods where numbers are less easily applied. Caveat: Quantitative methods are only effective if applied correctly. Be sure to gather the largest data set you can reasonably compile, across the broadest timeframe practical, generally 10 years or more. Account for any influences that may have caused fluctuations in your numbers. For example, an increase in Medicare fraud cases or arrests may correlate with an increase in personnel assigned to the violation.
If you find you have too little data to build a strong data set, try widening your timeframe or modifying your intelligence question.
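The personnel caveat above can be made concrete with a small sketch. Everything here is hypothetical and illustrative only: the yearly arrest and staffing figures are invented, not drawn from any real Medicare fraud data. The idea is simply that normalizing a count by a known influence (staffing levels) can change the story the raw numbers tell.

```python
# A minimal sketch of accounting for an influence on a count series.
# All figures are hypothetical placeholders, for illustration only.

# Hypothetical yearly Medicare fraud arrests and investigators assigned
arrests = {2014: 120, 2015: 150, 2016: 210, 2017: 280}
investigators = {2014: 40, 2015: 50, 2016: 70, 2017: 95}

# Raw counts suggest a sharp rise in fraud...
raw_growth = arrests[2017] / arrests[2014]

# ...but normalizing by personnel shows arrests per investigator holding
# roughly flat, so the rise may reflect enforcement capacity, not more fraud.
per_investigator = {year: arrests[year] / investigators[year] for year in arrests}

print(f"Raw growth 2014-2017: {raw_growth:.2f}x")
for year, rate in sorted(per_investigator.items()):
    print(f"{year}: {rate:.2f} arrests per investigator")
```

In this invented series the raw count more than doubles, yet the per-investigator rate barely moves, which is exactly the kind of fluctuation the caveat warns you to explain before presenting a trend as evidence.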
5. Incorporate a structured thinking technique: The use of structured methods is repeatedly encouraged on this site because structured techniques are among the most effective tools an analyst can use to reach an objective conclusion. They are also excellent at leaving audit trails, and for controversial assessments, it’s a good idea to be prepared to “show your work.” [Note: If you’re using the Analysis of Competing Hypotheses, it’s good practice to include notes on the grid about how/why you determined a piece of evidence was consistent, inconsistent, neutral, or not applicable.]
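The bracketed note on annotating an ACH grid can be sketched in code. The hypotheses, evidence items, ratings, and notes below are all hypothetical placeholders; the sketch only illustrates the general ACH practice of recording a note with each rating and scoring hypotheses by counting inconsistencies rather than consistencies.

```python
# A minimal sketch of an Analysis of Competing Hypotheses (ACH) grid.
# Hypotheses, evidence, ratings, and notes are hypothetical placeholders.

hypotheses = ["H1: state-directed", "H2: independent actors"]

# Each evidence item carries a rating per hypothesis plus a note explaining
# how/why the rating was assigned -- the audit trail the text recommends.
# Ratings: "C" consistent, "I" inconsistent, "N" neutral / not applicable.
grid = [
    ("E1: shared infrastructure",
     {"H1: state-directed": ("C", "same hosting seen in earlier operations"),
      "H2: independent actors": ("I", "costly to replicate without state support")}),
    ("E2: working-hours posting pattern",
     {"H1: state-directed": ("C", "activity clusters in one office time zone"),
      "H2: independent actors": ("N", "pattern could fit either hypothesis")}),
]

def inconsistency_count(hypothesis):
    """ACH scores by disconfirmation: the hypothesis with the FEWEST
    'I' ratings survives, not the one with the most 'C' ratings."""
    return sum(1 for _, ratings in grid if ratings[hypothesis][0] == "I")

for h in hypotheses:
    print(h, "-> inconsistencies:", inconsistency_count(h))
```

Keeping the note next to each rating means that when a reader challenges a cell of the grid, you can show exactly how that judgment was reached.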
Some readers may challenge or reject your findings no matter how solidly you present them. Of course, different perspectives are welcome, provided the work is supported with equally strong analysis and argumentation.