I was delighted and thrilled to be invited to join a panel at a workshop entitled “Understanding Narratives for National Security Purposes.” Then I realized that I would be lined up with a dozen narrative theorists, and worried about what I’d have to contribute.
And then I realized that I know something they don’t. All those academics and national security experts know a lot of stuff. But what I know is digital marketing and its relationship to narratives, which means I know that the information warfare attacks we have been under use nearly exactly the same techniques as any other digital marketing campaign, except that they are unfettered by resources, laws, or ethics.
The workshop is part of the Decadal Survey of Social and Behavioral Sciences for Applications to National Security, and they are seeking input from the public, and from people with expertise in the sciences, so please do check out the page.
As you may know, I have much to say on the topic of narratives, and I only had a few minutes to make my opening statement. So I narrowed my focus down to what digital marketing knows about narratives and why we should focus on Toxic Narratives – those which are intentionally false or misleading.
What follows is an annotated view of my slides, with a few additional thoughts and reflections.
One of the clearest messages to come out of the workshop is that every academic has defined narrative in a different way. We lack a cohesive, coherent vocabulary – and that’s an issue. So this slide is my definition.
The simplest example of a narrative (my version of it) is this:
If you have a generally positive worldview, you’ll see the glass as half full. A negative worldview, half empty. And if you’re a geek, you’ll argue with me about the validity of the metaphor and whether the glass is really full of air.
This slide simply says that narratives are powerful because they distill complicated issues into something that’s easy to understand and transmit. They make things matter.
I go on to explain that there are two things that Digital Marketers know a lot about in terms of leveraging narrative power. The first is Narrative Construction, and the second is Digital Amplification.
So let’s talk for a minute about how you can measure the strength of a narrative someone’s building, even before it hits the infosphere. We look at five different elements of narrative strength.
- Presentation – This first element examines how well presented the narrative is. It involves everything from word choice to imagery, interactivity, video, music, and font choices. Basically, we’re asking if it’s attractive to people.
- Clarity – Is it easily understood? I started out working for technology companies that struggled mightily to explain simply what they did. So, can the average person understand what you’re talking about?
- Resonance – There are three kinds of narrative resonance:
- Emotional resonance – does it make you feel something?
- Intellectual resonance – does it give you that “oh yeah, that makes sense” feeling?
- Echo – Does it pick up on other things you believe or hear or “know”? Does it connect to other narratives and reengage those themes?
- Shareability – If I see it, find it attractive, understand it, and it resonates with me, can I turn to my neighbor and explain it to them? How easily does it join my thoughts, my language, my conversations? This is the best test of narrative potency.
- Organization – This mostly reflects how well organized your narrative elements are, how well organized your team is at using them, and whether you have a process for evolving and maintaining the narrative.
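The five elements above lend themselves to a simple scoring rubric. Here is a minimal sketch in Python; the weights and example scores are entirely hypothetical and illustrative, not a real measurement methodology:

```python
# Hypothetical rubric for the five elements of narrative strength.
# Weights and scores are invented for illustration only.
WEIGHTS = {
    "presentation": 0.15,
    "clarity": 0.20,
    "resonance": 0.25,
    "shareability": 0.25,
    "organization": 0.15,
}

def narrative_strength(scores):
    """Weighted average of per-element scores, each on a 0-10 scale."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# A made-up narrative: clear and shareable, but poorly maintained.
example = {
    "presentation": 7,
    "clarity": 9,
    "resonance": 6,
    "shareability": 8,
    "organization": 5,
}
print(narrative_strength(example))
```

The point of weighting resonance and shareability most heavily is simply to mirror the text above, which calls shareability the best test of narrative potency.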
There are two major techniques that Narrative Warfare and digital marketing have in common.
A fully realized narrative is not a slogan, though it often has one. It can withstand some level of scrutiny. So a strong narrative will generally have at least three of these four elements.
- Why – What is the big vision? Why does it matter?
- How – What is the philosophy or methodology or point of view that leads to that “Why?”
- Offer – What’s the ask? What are you selling? These are critical in corporate narratives, but are important for many other organizations as well – you may want people to share, subscribe, register, vote, donate, or volunteer.
- Proof – Why should we trust you? What corroborating information do you have that will give someone confidence in what you’re saying? In the case of corporate narratives, this is usually data, testimonials, awards, analyst reports and the like.
In the case of Toxic Narratives, these are usually lies.
OK – so, how do we put these narratives out there, and to what ends? Marketers use a series of techniques to generate and amplify narratives that increase sales – sometimes indirectly – but always to be relevant to their markets. The bad guys do this with the intention of sowing discord and amplifying existing points of tension in society.
Here’s how they do it.
Targeting is the art and science of finding the people most likely to be receptive to your message. In the corporate world, this means finding the people who are most likely to have the problem that your product or service is meant to solve. That’s an oversimplification of course, we can (and do) write books on the topic.
Targeting is extremely interesting in the Russian propaganda efforts. They seem to have pursued three notable targets:
- Low cognition – Evidence suggests that they sought low-information, low-cognition voters with anger issues. Low cognition doesn’t mean stupid – it means they tend not to question or examine information, which makes them highly susceptible to false and misleading information.
- Extremists – these were people with extreme and divisive views on the fringes. It appears that they hunted for divisive issues to amplify here.
- Sympathetic, high-visibility political and media figures, including members of Congress. We see this in many instances, most recently in the #releasethememo hashtag. This example is particularly well documented in Politico, and more generally in this academic paper showing how more extreme social media and outlets such as Infowars and Breitbart influenced the content of more mainstream media outlets, such as Fox.
- Others – There’s evidence for several other targeting schemes as well, including “monitoring” and highly local versions.
Techniques in this space are generally well understood. Facebook actually offered help to both campaigns to teach them advanced targeting techniques on the platform. The Trump campaign wisely accepted. The story of that is documented in multiple news reports, but this one actually includes Facebook’s explanation.
- Ads – It may not seem obvious, but many times the primary purpose of ads isn’t to spread and amplify messages but to target different populations and see which messages work for them. This process can be automated and measured to create very potent messages very quickly. This might explain why the reported Russian ad spend on Facebook was in the $100,000 range, rather than the millions. They may have used these ads to find their audiences, refine their messages, and then go after them with trolls (real people behind false identities) and bots (automated accounts). Note also that while ethical companies don’t use trolls, larger companies will have dedicated social media managers who post from the brand account. Furthermore, many legitimate businesses and even NGOs automate their accounts to some extent.
For example, most organizations that are “serious” about social media schedule posts. That means they’ll queue up links to a bunch of content to be posted to Twitter some number of times a day, in some kind of order, for days, weeks, or months in advance. You almost have to, just to keep up. Companies frequently have customer service bots listening for complaints and offering to help the aggrieved. This kind of behavior is generally harmless and, in some cases, even beneficial to people. Legitimate businesses also have bots that will search for certain phrases and follow people who use them. Or RT them. As you can see, context and intent can be important here. But legitimate firms are transparent about the fact that their brand is attached to the “bot.”
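The benign automation described here is simple enough to sketch in a few lines of Python. Everything below is illustrative: the `post` function is a stand-in for whatever platform API a real social media tool would call, and the queued content and complaint phrases are invented.

```python
from collections import deque

# Stand-in for a real platform API call. A real, legitimate tool would
# post via the platform's SDK and identify itself as a brand account.
def post(message):
    print(f"posted: {message}")

# Pre-written content, queued up days or weeks in advance (hypothetical).
queue = deque([
    "New blog post: our product roadmap",
    "Customer story: how Acme cut costs",
    "Webinar Thursday: ask us anything",
])

def run_schedule(queue, posts_per_day=3):
    """Drain the queue at a fixed daily cadence."""
    while queue:
        for _ in range(min(posts_per_day, len(queue))):
            post(queue.popleft())
        # In a real scheduler you would sleep until the next day here,
        # e.g. time.sleep(86400); omitted so the sketch runs instantly.

# A customer-service listener: watch for complaint phrases, offer help.
COMPLAINT_PHRASES = ("doesn't work", "broken", "refund")

def needs_help(text):
    """True if the text contains any known complaint phrase."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in COMPLAINT_PHRASES)
```

The same two primitives – scheduled posting and keyword listening – are what a malicious operation scales up into bot networks; the difference is intent and transparency, not technology.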
- Message alignment – In the corporate marketing world, message alignment means that you’re explaining your narrative in terms that are relevant to the target audience. If you’re really good at this, you’re aligning the message to the aspirations and anxieties of your target, using their vocabulary, as well as imagery and other presentation elements that say something specific to that audience.
In the case of corporate, commercial media, this tends to look something like: “Leave the office behind and enjoy time in your new… house, hot tub, vacation, bottle of gin,” or whatever. In the case of business, and often of NGOs and politicians, it’s about telling stories that illustrate the issue in a way that feels personal.
In the case of Russian info-ops, it was about finding divisive, usually false or misleading messages that had caught on with people who were like the targeted populations. They’d find conspiracy theories that touched on people’s fears and values – gun control, fear of Islam, and others. These would be presented in the most emotionally forward way possible.
As mentioned above, you start with an idea of what is going to work with your audience and then you measure how well it does – getting people to click or like or follow, or whatever you ask them to do (buy, subscribe, donate). Then you can test variations on the themes until you know you have something that works. This is how the entire field of big data analytics got embedded in every marketing team of any size whatsoever. It’s all about the metrics, and the Russian information operations campaign has skills.
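The measure-and-iterate loop described above is basic A/B testing. A minimal sketch, with made-up engagement numbers, of how a campaign would pick its best-performing message variant:

```python
# Hypothetical engagement data for three message variants:
# (impressions, clicks). All numbers are invented for illustration.
variants = {
    "variant_a": (10_000, 120),
    "variant_b": (10_000, 340),
    "variant_c": (10_000, 95),
}

def click_through_rate(impressions, clicks):
    """Fraction of impressions that resulted in a click."""
    return clicks / impressions

def best_variant(variants):
    """Return the variant name with the highest click-through rate."""
    return max(variants, key=lambda v: click_through_rate(*variants[v]))

print(best_variant(variants))  # variant_b wins, at a 3.4% CTR
```

In practice the "click" could be any conversion the campaign asks for – a like, a follow, a subscription, a donation – and the winning variant is then fed into the amplification stage.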
So – this is where the fun starts. Now that you know what messages are performing best, you start to amplify them in the market. You can do this a number of ways. You can get out there on social and repeat your message – you can engage with others on social and introduce them to your ideas, and hopefully, they will spread them.
Among the most effective techniques is to get as big a network as possible – your clients, friends, employees, etc. – to share, repeat, retweet, like, or otherwise goose your content to the top of the platform. If you have no resource, legal, or ethical constraints, you can hire people to act as trolls – people using fake identities to appear as sympathetic Americans, but who are actually employees of the GRU – putting those messages into various places and communities online, engaging with other users, and generally making lots of noise. You can also create hundreds of thousands of bots – automated accounts that search for, repeat, retweet, or like content, or what have you. We see that these accounts – both trolls and bots – have been followed and retweeted by very prominent figures, including senators, well-known Fox News personalities, and others, up to and including President Trump and his children.
The trolls and bots will also follow, RT, and like regular people who sympathize and engage with their message, giving them the affirmation and visibility that social media teaches them to crave.
A group called the Alliance for Securing Democracy, a project of the German Marshall Fund, has created a dashboard where they track Russia-connected bot activity on Twitter. It is quite interesting to watch how very active they are, even now, on every political and quasi-political issue, from questioning the legitimacy of the various Russia probes to immigration and the NFL.
In this way, these trolls and bots give sympathetic politicians and media the impression of a massive groundswell of public support for what are sometimes otherwise fringe ideas or opinions. The mainstream media then covers them, giving the stories vastly more reach, credibility, and impact.
The idea of chaining is that each time a person engages with a story, they are also nudged to read another, either because the story itself links to another, or because a troll or a bot suggests another story once you’ve given some positive indication of interest in the first. In this way, they show people gradually more extreme information, guiding them from the relatively credible to the completely false. Low-information, low-cognition voters are particularly vulnerable to this kind of influence. Oh – yes, you can target low-information, low-cognition voters, and there’s credible information to suggest that Cambridge Analytica does just that – and that others likely can too.
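Chaining can be thought of as a walk down a ladder of stories ordered by extremity, where the reader advances one rung per positive engagement signal. A toy sketch – the stories and the engagement signals are invented, and a real operation would of course be far less linear:

```python
# Toy model of "chaining": stories ordered from relatively credible to
# completely false. The reader advances one rung per positive signal.
LADDER = [
    "mainstream-ish story with a partisan slant",
    "misleading story from a fringe outlet",
    "outright conspiracy theory",
]

def chain(engagement_signals, ladder=LADDER):
    """Return the sequence of stories shown, given engagement signals.

    Each True in engagement_signals (a like, share, click, etc.) moves
    the reader one rung further down the ladder; a False leaves them
    where they are.
    """
    position = 0
    shown = [ladder[position]]
    for engaged in engagement_signals:
        if engaged and position + 1 < len(ladder):
            position += 1
            shown.append(ladder[position])
    return shown

# A reader who engages twice is walked all the way to the bottom rung;
# one who never engages only ever sees the first, most credible story.
print(chain([True, True]))
print(chain([False, False]))
```

The design point is that each step looks small from inside: no single suggestion jumps straight from the credible to the absurd, which is exactly what makes the technique effective on readers who don’t examine sources.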
It is time to focus on toxic narratives. While there are so many kinds of narratives and so many shades of gray, I think it is clear that there’s a certain kind of narrative that is relatively easy to identify and clearly destructive to civil society. False and misleading narratives, targeted and amplified by unethical means, must be countered.
The point of this talk is that we understand these techniques. We have pretty decent visibility into what is going on. That gives us the power to do something about it.
In the talk, I suggested three avenues.
- Identify and block toxic campaigns.
This is certainly tricky in a country fiercely dedicated to free speech. We already have certain limits on free speech – libel and hate speech laws – so we know it is possible to maintain free speech while limiting harmful speech. Possible, but tricky. However, bots and fraudulent accounts do not have these rights.
Technology companies can do far more to detect and suspend these accounts. They can be more transparent and serious about what is and is not acceptable. They can enforce hate speech laws more vigorously. These same technology companies that aid and abet both commercial and hostile targeting of our personal information have every obligation to enforce community standards.
- Inoculate the population.
If the bad guys can identify vulnerable populations, the good guys can too. That means they can warn people that they are being targeted. It means we can educate them. It means we can do everything in our power to remove those narratives from view, educate people to recognize suspicious stories, and teach them how to verify information. There are organizations out there, such as Snopes and Khan Academy, that are great at this, and we should leverage them.
- Provide alternative narratives.
Now, this is a hot topic for me. Why did Americans turn on the TPP? Because not one-tenth of one percent of the population had any idea what it was or why it mattered. Do you remember President Obama’s last State of the Union address? The one with the four big questions we need to answer in the next decade? Can you name the questions? Can you find them on Google? No. You can’t. I wrote them down, and I can’t. The only place they show up on the web is in the transcript of the speech.
Enduring, spreading narratives are those that are repeated and entrenched in our imagination. They are expressed and re-expressed. They are backed up and reinforced by other stories and media. There is no standalone narrative. If you have a message you want to get out there, you need to be thoughtful, intentional and persistent.
The other side is.