Future Now
The IFTF Blog
Thoughts from the Authors
This collection of thoughts was recorded with the authors of "The Biology of Disinformation: Memes, media viruses, and cultural inoculation," Douglas Rushkoff, David Pescovitz, and Jake Dunagan, in order to provide some context for the paper, its themes, and its meaning for society. Download the paper here.
Jake Dunagan: Propaganda goes back thousands of years. The idea of trying to influence and persuade people to not just think differently but act differently is as old as rhetoric and storytelling and culture.
David Pescovitz: There's been a long history of trying to figure out ways to use whatever media is at your fingertips to inform or misinform or coerce wide swaths of the population to believe something or do something. And that could be "buy this brand of soda" or "vote for this politician" or "blame immigrants for all your problems." So in that regard it's not entirely new, but what is new is the ability for most anyone to reach large numbers of people without the need for a deep-pocketed middleman or a TV network.
Douglas Rushkoff: It was interesting to go back and look at the origins of public relations. The whole field was based on the underlying assumption that people need to be directed from above. They believed the masses are a clueless and dangerous mob, and that we can’t act in our own best interests without a benevolent elite telling us what to do and how to think. Since WWI, government and industry alike have questioned whether human beings in our society are capable of engaging thoughtfully and meaningfully in democracy.
DP: I first learned about memes as a concept before the web through the work of Richard Dawkins and through Doug's book, Media Virus. What I think happened is social media made it very obvious to people how ideas spread.
DR: In the early days of the web, the most aggressive users of media viruses were activists attempting to challenge the controllers of top-down media. They could launch their memes through the net, regardless of what the keepers of television wanted. Where it got truly interesting, however, was when news networks like CNN began covering trending Tweets and Facebook posts - as if they were afraid to be left behind. So media activists realized that they now had the power to plant news stories. And they'd do so not by convincing some journalist to cover a story, but by creating an outrageous media phenomenon, or what in the language of viral media I called a sticky viral shell.
Biological viruses, on which I based the viral media metaphor, have two main features: a sticky protein shell, and genetic material inside it that wants to replicate. The virus travels freely through the body as long as its shell is not recognized as a virus. That’s why it needs to be novel; we have not developed an immune response. It latches onto our cells, and injects its genetic material, which then competes with our own. If the viral code can find a weak spot in our DNA in which to nest, it ends up getting replicated along with our DNA every time the cell divides. That happens until we learn to recognize the shell, and grow immune. Likewise, a media virus needs a novel, unprecedented use of media to act as its shell. An outrageous recording, Tweet, surveillance video, or live stream. Once it spreads, the virus only infects our society if it can challenge our cultural code. The weaker our code - as in the case of race, immigration, or gender issues - the more effectively it can embed itself.
JD: I think some of the new features of current propaganda, computational propaganda, disinformation, and fake news are that their creators understand at a population level what hits most effectively, and that they can target individual minds with great precision. Psychological profiles derived from our social media activity, our purchasing patterns, all these bits of information come together to allow a much more targeted, much more effective intervention into our thinking. I think that's where we're heading: almost hijacking our minds, in a way we can't even help. These interventions just work. They can override our thinking capacity, and their designers know how to do that.
DP: I think that there's not going to be a technological solution; that's almost like playing Whac-A-Mole. It's really about being able to honestly look at why we are so susceptible, and less about how they are so effective. It's about how and why we're so vulnerable to a particular meme, and that's where the bigger and harder conversations can be found. I think having those conversations is really the only way we're going to develop a kind of cultural immunity to weaponized memes and disinformation.
JD: I think that's why the biology metaphor comes in and is more apt: biological entities change their own environment for their own ends, they adapt, they exploit. They do all of these things that machines don't do. Machines are predictable - not getting into machine learning or anything like that, but just typical machines. You turn a crank and you have an expectation that you should be able to know what's coming. With biological entities, that's much less certain. So information acting in a biological system is unpredictable, and by definition adds unpredictability to that system. That's what information does, and so it is, as they say, the opposite of neutral. And that's our design challenge: to take that understanding of information and figure out how we deal with it. Not to deny it, and not to go back to the safety of false certainty, but to design for the complexity that we know we're living in and that we need to address.
DR: Well, this is an arms race for control of public opinion, so any technological solution is bound to be met with a countermeasure. The only long term answer is to build up the resilience - the cultural immune system - of our society. It's the humans who have to learn how to live in this world, not the machines. The more we ask the machines to solve the problem of human autonomy in a digital age, the more dependent we are on whoever is controlling the machines.
Instead, we must become smarter media users and consumers. When we engage with a media platform, we have to ask ourselves “where am I? What is this technology for? What does it want from me?” Second, we must build our cultural immunity by speaking openly and honestly about the issues that concern us. Repressed issues are fuel for viral assault. Unless we are able to discuss these issues, together, then they’re going to serve as trigger points. We can’t hang onto our confused code any longer. Not in such a highly connected, digital media environment.
DP: This paper came about from bouncing ideas around with Marina Gorbis and Douglas Rushkoff, and later Jake Dunagan. We were thinking about memes as a biological metaphor for the spread of information, and asking what that metaphor can teach us about disinformation and propaganda. Then we started thinking about possible technological solutions to these challenges, and about the idea of a cultural immune response; even that phrase provokes some fantastic conversations. Once we had those conversations, we were able to start developing a framing. The opportunity to work with people who have been studying and exploring this space for decades, and to ask and push on the very hard questions with them, is something that I don't take for granted.