Adam Caudill

Security Leader, Researcher, Developer, Writer, & Photographer

Dezinformatsiya

I recently wrote a review of Active Measures by Thomas Rid – which helped me solidify my thoughts on social media and the impact it has on society. While Active Measures is focused on disinformation campaigns, it also speaks to the vulnerabilities in humans that allow these campaigns to work. Disinformation is a substantial issue today, and not just in terms of election interference, public health, or international relations – but also in much smaller-scale, unorganized efforts to alter perception.

Social media has come to play a central role in disinformation campaigns, from the massive efforts of state-level actors with multi-million-dollar budgets, to individuals who, without guidance (and perhaps unwittingly), use these same techniques.

What is Truth? #

To understand the role of social media and disinformation (and other active measures) in society, it’s important to understand some core concepts of how disinformation works – or, more specifically, what it exploits within the human mind.

The most effective disinformation is built on a couple of things:

  • Existing cracks in society. It is far easier to exploit existing mistrust, doubts, and divisions than it is to take a unified group and split them apart. There needs to be an existing fissure that can be widened.
  • Elements of truth. While outright lies can have a useful impact, the most effective efforts are those that are built on at least some degree of truth.

But what is truth? #

In one sense, truth is that which is factual, that which can be demonstrated through empirical evidence, that which is undeniable. This form of binary truth is apolitical, it exists outside the bounds of judgement and opinion, it is absolute. In reality though, truth is rarely binary.

Humans remain, for the most part, tribal creatures – from politics, to sports, to geography, people often prefer those things that fall within their tribal identity. This tribal tendency provides the first erosion of truth. For example, many will claim that their preferred team is the best in a sport – and by some measures this may indeed be true – while others will hold the same position about a different team, and by a different set of measures, that is also true. People within a given tribal group likely hold similar positions, which reinforce these views of what is true among all members of the group.

Here we begin to see that truth is often far from absolute, but is instead subjective. When applied to religion or politics, this tribal division in truth becomes far more apparent. When truth is viewed through the lens of tribal identity, socioeconomic status, and political and religious affiliations, it ceases being binary; it ceases being absolute.

A simple experiment for this can be seen in the following sentence: “The economic policies of President X did more to improve the economy than those of any of their predecessors.” The answer is largely predictable knowing only two things: the political affiliation of X, and the affiliation of the person being asked. Those that are well informed will often be able to provide figures supporting their answer. What’s important to understand, though, is that each answer is truthful, in absolute terms, to the person giving it.

Truth then, in human terms, is much like beauty – it’s in the eye of the beholder.

Enter Social Media #

“What would active measures be without the journalist?” – Rolf Wagenbreth, East German Stasi, 1986

In 1986, the director of the Stasi’s disinformation unit questioned how modern techniques could possibly work without the unwitting aid of journalists and the free media. In the decades that followed, a new form of free media emerged that eliminated the ethics and professionalism of the journalist – social media would develop into an unrivaled opportunity.

With the fall of the Soviet Union, there was a decline in disinformation operations and other active measures – though it was only a temporary reprieve. In the late 2000s, a new trend in technology would bring disinformation into its ultimate and most efficient form – fast, cheap, effective – exploiting both technology and human nature to the utmost.

Disinformation and social media are something of a perfect storm:

  • Minimal filtration of the material that users make available. While some platforms are doing more to combat disinformation since the 2016 US election, the reality is that eliminating all intentional misinformation is impossible at the scale that large social media platforms operate at. No matter how much progress a platform claims, the reality is that it will always be a losing battle.
  • Social media companies have historically optimized for engagement, which exacerbates the problem. Content that is highly divisive, highly objectionable, and highly controversial also happens to drive the most interactions. This creates an unfortunate incentive for platforms to not only allow this content, but to promote it.
  • Information can spread quickly, especially with the lack of any fact checking. This is especially true of things that support what a person already believes; confirmation bias plays a substantial role in the effectiveness of these efforts.
  • Effort asymmetry – the level of effort involved in creating misinformation and disinformation is far lower than the effort required to correct & refute it. This means that ex post facto fact checking is always a losing battle; it’s simply impossible to note all of the inaccuracies and lies being spread. There is also substantial evidence that corrections or fact checking spreads far slower, and reaches far fewer people than the original incorrect information.
  • Pseudonymity is the default for social media platforms, with no effective requirement that users are honest about who they are, or where they are located. This makes it trivial for those that wish to manipulate opinion to appear to be from the same group as those they are influencing – members of the same tribal group – even if they are on the other side of the world. The only upside to be noted is that organized disinformation campaigns often make cultural or linguistic errors that hint at their true nature – at least to those that care enough to look for such hints (such as the infamous @TEN_GOP account).

All of these factors combine to make social media a breeding ground for false and misleading information.

The actual impact of Russian disinformation efforts on social media is debatable – the impact is, by its very nature, hard to quantify – though it is clear that social media is having a polarizing effect on discourse in America and around the world.

Informed & Influenced #

While exploiting different views of the truth is important, sometimes being entirely honest and letting the situation play out without injecting anything forged or altered is sufficient.

One of the most striking active measures campaigns that I’ve witnessed was the attack against the DNC – specifically, the rift that was created when it was revealed that DNC leadership clearly preferred seeing Hillary Clinton nominated over Bernie Sanders.

First, let’s look at what was going on, and what people were going through:

  • During the nominating process, the party divides into a number of camps, each backing their preferred candidate. This is a complicated and often painful period, as the party itself is at its weakest – the future is most uncertain, deep emotional and political divides are reinforced and driven deeper, and the risk of leaving members spent and disaffected is ever present.
  • Once a nominee becomes clear, their most immediate challenge is to undo the rifts created during the process and unite the party. This is a critical stage of the process, and if it fails, so does the nominee.
  • Many party members are disappointed and upset – they have spent months working for their candidate, donated time and money, just to see them fail and nothing come of the effort. The only consolation is that it was the will of the people, so they have to move on and support the party.

During this difficult process, the worst thing that could happen would be for a candidate (or their supporters) to believe that they had been cheated out of the nomination. Expressing such a belief, even as a rumor, would deal a major blow to the unification and the party itself, and thus, to the odds of the nominee.

Then it happened. Not just as a rumor, but clearly written and unaltered. It didn’t require a lie or a forgery – the material needed to harm the nominee and the party was sitting in the mailbox of the party leader, and it became visible to the entire planet. The wounds of the process were torn open; the supporters of Bernie Sanders now saw two enemies, one of which was the nominee of their own party.

This is notable as it had an undeniable impact on the election: it cast doubt over the entire nomination process, drove members away from the party, resulted in leadership changes, and created a lingering doubt that still hasn’t left the minds of some members – and the only lie required was to hide how the emails were acquired.

The timing could not have been better, the authenticity of the material was difficult or impossible to deny, the short- and long-term impact undeniable – and they managed to influence with real, truthful information. It was, without a doubt, a remarkable event.

Unorganized Disinformation #

Not all disinformation comes from factories of fakes; all too often it comes from the mouths of people who are exactly who they claim to be. People that, without critical analysis, repeat, exaggerate, amplify, and distort information to comport with their world view. They twist actual events and reports into something nefarious – because the innocent interpretation doesn’t fit the understanding of the world that they have developed.

This tendency isn’t unique to any group, and happens on all sides of the political spectrum – while some groups would like to believe that their group is above such nonsense, the reality is that every group has zealots that are willing to disregard critical thinking when they see something that matches their own beliefs and suspicions. I have seen people that are intelligent and otherwise critical of what they see fall into this trap; in reality, there are likely few that haven’t made this mistake to some degree.

There are, of course, some that take this unorganized development and spread of disinformation to an impressive extreme. The followers of the “Q” conspiracy theory are quite effective at drawing in new recruits, and at pulling meaning and guidance out of vague and essentially meaningless messages of unknown origin. While I won’t venture a guess as to who is behind these messages (if it is a foreign disinformation campaign, someone has earned a raise), the way that followers become so engrossed in and dedicated to the belief is unlike anything in recent memory. They read the tea leaves of current events, find some plausible meaning in past messages, and spread their view far and wide – where it is often well received and further amplified by other believers.

There is also the somewhat similar story of Jade Helm, though I won’t belabor it – it’s more of the same: group identity, world views that do not match reality, disinformation created and spread by groups and individuals, all leading to the spread of fear, distrust, and outright lies.

No matter how extreme someone is, the fact is that social media is a breeding ground for this type of “organic” disinformation, and the mental exploits are the same regardless of source.

If you look at the replies to almost any post on Twitter by a major media outlet (especially those that are even tangentially political), you are likely to see more disinformation, or claims of disinformation, than authentic comments and discussion. If you take the time to look into other posts by the same people making these statements (don’t, trust me), you will almost always find that they are filled with two things:

  1. Disinformation, and claims of disinformation from those outside of their group(s).
  2. Tribal group reinforcement1 – posts that reinforce and declare their membership in certain groups – this may be based on locality, sports, politics, or others.

This type of organic disinformation is difficult, if not impossible, to differentiate from state-backed campaigns – it uses the same techniques, has the same goals, and presents with such similar characteristics that an observer can’t tell whether the person is authentic. The dynamics of social media allow this disinformation and misinformation to spread quickly and easily – and it is no easier to stop than that produced by state-backed organizations. In fact, it’s harder to stop: because these people are authentic, they are within the rules of most platforms to post as much disinformation as they would like.

State-backed disinformation is a problem, but this “organic” disinformation is, in my opinion, a much greater threat to a healthy society – it is a symptom of larger issues, and almost certainly has more impact than state-backed efforts.

Is social media evil? #

No, though it has a societal cost that isn’t insignificant. Social media allows people to find new friends, develop their skills and careers, and gain knowledge and insight that they wouldn’t be able to otherwise. But users are also exposed to a vast sea of information that is misleading at best, which can exploit their preexisting views and drive them away from rational discourse and into conspiracy theories and animosity (and hate) for those that aren’t in their groups.

I generally dislike writing about problems without offering thoughts on a solution – and I’m not sure there is a solution to the issues that social media has magnified. Education and critical thinking skills are often seen as an inoculation against disinformation, but they offer only partial protection, and building them has always been a challenge at a global scale. Healing the wounds of the past, and reducing the fissures that exist between groups, would certainly be a step in the right direction, but this also cannot be addressed easily, or even within a generation – it is simply something that humanity as a whole has never been good at.

Social media isn’t going to go away, and society will continue to pay a price.


  1. There is an anecdotal but apparently strong correlation between being highly invested in one’s tribal groups and engaging in behavior that demonstrates distrust of members of other groups and blames them for wrongdoing – while spreading disinformation oneself. I won’t speculate on the cause of this correlation, though it may be that people who place more value on group membership are more inclined to distrust those outside their group, which makes them more susceptible to the exploits used by disinformation efforts. ↩︎


