
Down the Rabbit Hole: How QAnon Conspiracies Thrive on Facebook

A Guardian investigation found QAnon pages, groups and accounts on Facebook with more than 3 million aggregate followers and members, and their numbers are growing.

The Guardian



Attendees of a campaign rally for President Donald Trump hold up their phones with messages referring to the QAnon conspiracy theory at the Las Vegas Convention Center on Feb. 21, 2020. Photo by Mario Tama / Getty Images.

In early May, QAnon braced for a purge. Facebook had removed a small subset – five pages, six groups and 20 profiles – of the community on the social network, and as word of the bans spread, followers of Q began preparing for a broader sweep.

Some groups changed their names, substituting “17” for “Q” (the 17th letter of the alphabet); others shared links to back-up accounts on alternative social media platforms with looser rules.

More than just another internet conspiracy theory, QAnon is a movement of people who interpret as a kind of gospel the online messages of an anonymous figure – “Q” – who claims knowledge of a secret cabal of powerful pedophiles and sex traffickers. Within the constructed reality of QAnon, Donald Trump is secretly waging a patriotic crusade against these “deep state” child abusers, and a “Great Awakening” that will reveal the truth is on the horizon.

QAnon evolved out of the baseless Pizzagate conspiracy theory, which posited that Hillary Clinton was running a child sex ring out of a Washington DC pizza restaurant, and has come to incorporate numerous strands of rightwing conspiracy mongering. Dedicated followers interpret Q’s cryptic messages in a kind of digital scavenger hunt. Despite the fact that Q’s prognostications have reliably failed to come true, followers rationalize the inaccuracies as part of a larger plan.

Q’s initial commentary on the Facebook bans was concise: “Information Warfare,” Q posted on the website 8kun. Two days later, in a post that included a collage of dozens of news headlines about the takedowns, Q went further, speculating that there had been a “coordinated media roll-out designed to instill ‘fear’” in believers and dissuade them from discussing QAnon on social media. “When do you expend ammunition?” Q wrote. “For what purpose?”

The anticipated purge never came. Instead, QAnon groups on Facebook have continued to grow at a considerable pace in the weeks following the takedown, with several adding more than 10,000 members over 30 days.

A Guardian investigation has documented:

  • More than 100 Facebook pages, profiles, groups and Instagram accounts dedicated to QAnon, each with at least 1,000 followers or members.

  • The largest of these have more than 150,000 followers or members.

  • In total, the documented pages, groups and accounts count more than 3m aggregate followers and members, though there is likely significant overlap among these groups and accounts.

These groups and pages play a critical role in disseminating Q’s messages to a broader audience and in recruiting more believers to the cult-like belief system, researchers say.

“Facebook is a unique platform for recruitment and amplification,” said Brian Friedberg, a senior researcher at the Harvard Shorenstein Center’s Technology and Social Change Project who has been studying QAnon for years. “I really do not think that QAnon as we know it today would have been able to happen without the affordances of Facebook.”

Moreover, Facebook is not merely providing a platform to QAnon groups. Its powerful algorithms are actively recommending them to users who may not otherwise have been exposed to them.

The Guardian did not initially go looking for QAnon content on Facebook. Instead, Facebook’s algorithms recommended a QAnon group to a Guardian reporter’s account after it had joined pro-Trump, anti-vaccine and anti-lockdown Facebook groups. The list of more than 100 QAnon groups and accounts was then generated by following Facebook’s recommendation algorithms and using simple keyword searches. The Instagram accounts were discovered by searching for “QAnon” in the app’s discovery page and then following Instagram’s algorithmic recommendations.

Receiving QAnon recommendations from Facebook does not appear to be that uncommon. “Once I started liking those pages and joining those groups, Facebook just started recommending more and more and more and more, to the point where I was afraid to like them all in case Facebook would flag me as a bot,” said Friedberg. Erin Gallagher, a researcher who studies social media extremism, said she was also encouraged to join a QAnon group by Facebook, soon after joining an anti-lockdown group.

Facebook’s own internal research in 2016 found that “64% of all extremist group joins are due to our recommendation tools”, the Wall Street Journal reported, primarily through the same “Groups you should join” and “Discover” algorithms that promoted QAnon content to the Guardian. “Our recommendation systems grow the problem,” the internal research said.

Facebook did not directly respond to questions from the Guardian about its policy considerations around QAnon content. “Last month, we took down accounts, Groups, and Pages tied to this conspiracy theorist movement for violating our policies,” a company spokesperson said in a statement. “We also remove Groups and Pages that violate other policies from recommendations and demote in search results. We’re closely monitoring this activity and how our policies apply.”

The company also claimed that “all of the Pages” and “the vast majority of Groups” documented by the Guardian had been removed from recommendation algorithms prior to the Guardian’s query. The company did not provide evidence for this claim, which is contradicted by screenshots of pages and groups appearing in recommendations that were taken in May. The Guardian also continued to receive recommendations to join additional QAnon groups after its initial query to Facebook.

Asked about this discrepancy, Facebook said that the pages and groups in question had been marked as “non-recommendable” as of 8 April 2020 for violations of policies against clickbait, viral misinformation and hate speech, but that a page or group can be restored to eligibility for recommendations if its behavior improves for several months.

Over the course of reporting this article – about one month – the aggregate membership of the documented groups and pages grew from 2.75m to more than 3m, or approximately 8.5 percent. Groups and pages that the Guardian had documented to have been promoted through Facebook’s recommendation algorithms grew 19.9 percent. One page that appeared in recommendations – “We are ‘Q’” – saw its following grow nearly 60 percent, from about 24,000 to about 38,000 over the month – despite the page not having posted any new content since February.

To Friedberg, the window for Facebook to act on QAnon may have already passed. “I’m starting to wonder if we’re just waiting for the next shoe to drop – another act of violence,” he said. “That seems to be what the platforms wait for, and that in and of itself is terrifying.”

A Ban That Stuck

While QAnon thrives on Facebook, another social media site took timely and decisive action against it. Nearly two years ago, Reddit, the link-sharing network of interest-based message boards, carried out a site-wide purge of QAnon – and made it stick.

Reddit had been central to the development of the QAnon movement, which began in October 2017 with the emergence of “Q” on 4chan, the anarchic image board that has served as a launching pad for memes and internet culture but also racist extremism and harassment campaigns. Q, whose cryptic messages and predictions claimed to be based on a high-level government security clearance, quickly decamped from 4chan to the even more extreme 8chan, where believers could read Q’s latest “crumbs” directly from the source.

Q went briefly silent in 2019 when 8chan was forced offline in the wake of the El Paso massacre, but re-emerged on the new site founded by 8chan’s owners, 8kun.

Anonymous internet posters claiming to be high-level government officials are not entirely uncommon; in recent years, other so-called “anons” have emerged with claims that they were revealing secrets from inside the FBI or CIA. But Q is the first such figure to have achieved such a broad audience and real-world political influence. This is largely due to the activism of three dedicated conspiracy theorists who latched on to Q’s posts in the early days, according to an investigation by NBC News. These activists worked to develop a mythology and culture around QAnon and cultivated an audience for it on mainstream social media platforms.


David Reinert holds up a ‘Q’ while waiting to see Donald Trump at a rally on 2 August 2018 in Wilkes Barre, Pennsylvania. Photo by Rick Loomis / Getty Images.

Reddit was significantly easier to use for the kind of crowd-sourced research and interpretation that forms the core of participation in QAnon, and the site was host to a large pool of potential recruits, such as the 1.2m members of the subreddit r/conspiracy. It had also long enjoyed and at times even earned a reputation as one of the danker cesspools of the social web, for years tolerating communities known as “subreddits” dedicated to sharing non-consensual sexualized images of women or advocating rape.

But the violent anger of adherents to QAnon crossed the line for Reddit in less than a year. On 12 September 2018, citing its ban on content that “incites violence, disseminates personal information, or harasses”, the company banned 18 QAnon subreddits, the largest of which had more than 70,000 members.

Social media bans are often difficult to maintain, but Reddit’s move was uncommonly effective. Today, QAnon remains unwelcome on Reddit, with the few subreddits that address it dedicated to either debunking the theory or providing support to people who have lost friends and family members to QAnon.

‘Taking the Red Pill’

QAnon did not disappear after Reddit pulled the plug, however. Instead, its believers moved on to other platforms, including YouTube, Twitter, Discord and – crucially – Facebook. At the time of the Reddit ban, one of the largest closed Facebook groups dedicated to QAnon, “Qanon Follow the White Rabbit”, had 51,000 members, according to NBC News. Today that group has grown to more than 90,000 members.

And while YouTube and Twitter have played an important role in providing a broadcast platform for QAnon content, the specific structures provided by Facebook are uniquely suited to the participatory “work” of engaging with QAnon. Facebook also provides QAnon with an even larger pool of potential recruits than Reddit could, especially for the somewhat older, Evangelical crowd that has proven susceptible to QAnon’s messaging.

Will Partin, a research analyst with Data & Society, and Alice Marwick, a professor of communication at the University of North Carolina, describe QAnon as a “dark participatory culture”, which is to say that it is a community that takes advantage of the infrastructure of social networking sites to bring disparate people together and foster discussion, collaboration, research and community, but directs those energies toward anti-democratic, regressive and even violent ends.

“Everything about our research suggests that these people are not irrational; they’re hyper-literate, even if they’ve come to beliefs that are empirically inaccurate,” Partin said. “That’s partly because they have a fundamentally different epistemology to judge what is true and false.”

The digital architecture of Facebook groups is also particularly well-suited to QAnon’s collaborative construction of an alternative body of knowledge, Friedberg said. The platform has created a ready-made digital pathway from public pages to public groups to private groups and finally secret groups that mirrors the process of “falling down the rabbit hole or taking the red pill”.

“You can mechanically take those steps,” he said. “Very few of the contemporary Q-following base actually need to engage with 8chan at all.”

To Ban or Not to Ban

While Facebook has policies banning hate speech, incitement to violence and other types of content that it considers undesirable on a family- and advertiser-friendly platform, QAnon does not fit neatly into any single category.

Much of what is shared in QAnon groups on Facebook is a mix of pro-Trump political speech and pro-Trump political misinformation. Memes, videos and posts are often bigoted and disconnected from reality, but not all that different from the content that is shared in non-QAnon, pro-Trump Facebook groups.

The pages and groups that were removed in early May violated the company’s ban on “coordinated inauthentic behavior” – ie the kind of digital astroturf tactics that Russian operatives used to support Trump’s presidential campaign in 2016. Those rules are aimed at operations in which actors make false representations about their identities in order to mislead people – a description that could encompass Q – but Facebook only applies its policy to deceptive behavior that occurs on its platform, not on 8kun.

To enact a blanket ban akin to Reddit’s under its current rubric of rules, Facebook would likely have to designate QAnon as a “dangerous organization” – the category it uses to ban both terrorist and hate groups and any content published in support or praise of them. QAnon is hardly an organization, though as a movement it has certainly caused harm and could be considered dangerous.

There are innate societal and individual harms to convincing people of a version of reality that is simply false, as QAnon does, Partin said. “When a common sense of what is real and what is correct breaks apart, it becomes nearly impossible to reach a democratic consensus.”

And QAnon followers’ enthusiasm for misinformation is not confined to politics; as the coronavirus pandemic took hold, the groups became a hotbed for medical misinformation – something Facebook has claimed to be working hard to combat. Analyses by Gallagher, the social media researcher, and the New York Times demonstrated how QAnon groups fueled the viral spread of “Plandemic”, a 26-minute video chock-full of dangerously false information about Covid-19 and vaccines.

Facebook’s algorithms appear to have detected this synergy between the QAnon and anti-vaccine communities. Several QAnon groups are flagged with an automated warning label from Facebook that reads, “This group discusses vaccines” and encourages users to go to the website of the Centers for Disease Control for reliable information on health.

It appears that anti-vaccine propagandists are also taking notice, and attempting to capitalize. Larry Cook, the administrator of Stop Mandatory Vaccination, one of the largest anti-vaxx Facebook groups, has begun incorporating QAnon rhetoric into the medical misinformation he peddles, as well as making explicit invitations to QAnon believers to join his group.

Cook has begun referencing the “deep state” and stoking fear of forced vaccination and “FEMA camps”.

“I have discussed the concept many, many, many times that vaccines destroy our connection to God and that we are in a spiritual war with Principalities of Darkness that have a death wish for our children, and humanity at large,” he wrote in one QAnon-inflected post. (Cook also uses the site to aggressively promote his various products and a subscription-only platform for “medical freedom patriots”.)

But the potential for damage from QAnon goes well beyond health misinformation. For those individuals who truly believe in the QAnon narrative, the crimes of the “cabal” are so grievous as to make fighting them a moral imperative. “They’re talking about a group of people who are operating our government against our wishes and they’re molesting and torturing children and destroying our society,” said Joseph Uscinski, a professor of political science who studies conspiracy theories. “It’s an incitement to violence.”

Indeed, there have been numerous incidents of real-world violence linked to QAnon, and in May 2019, the FBI identified QAnon as a potential domestic terrorism threat in an intelligence bulletin. While anti-government conspiracy theories were not new, the bulletin stated, social media was allowing them to reach a larger audience, and the online narratives were determining the targets of harassment and violence for the small subset of individuals who crossed over into real-world action.

Despite this, Uscinski is skeptical of the idea that kicking QAnon off Facebook would help anyone. He regularly conducts polls on conspiracy theories and consistently finds that QAnon is “one of the least believed things” out there, well below belief in theories about Jeffrey Epstein’s death, anti-vaccine hoaxes, and Holocaust denialism. Uscinski also cautions against overly exoticizing the QAnon narrative, noting that “most of the component parts of QAnon have been around forever”, with parallels in the Satanic Panic of the 1980s or the plot of Oliver Stone’s JFK. And he’s concerned about the free speech implications of censorship by tech platforms.

“It’s a potentially dangerous belief; it’s very disconnected from reality; I don’t really think we want more people getting into it,” he said of QAnon. “Do the internet companies bear some responsibility? Yes. Would it be better if they took it down? Probably. Does that take care of it? No.”

Partin said that he generally favored Facebook taking a “more aggressive approach to moderation”, including addressing the recommendation algorithms and trying to reduce the spread of misinformation out of dedicated conspiracy communities and into the mainstream.

“If Facebook flipped a switch and every Q post disappeared tomorrow, that probably would be harmful for QAnon,” he said. “But there is resiliency built in. Getting deplatformed is harmful, but the idea that it would somehow make this disappear is fanciful.”

Friedberg worried that it may already be too late. “Facebook should have taken action on this a long, long time ago, and the longer that they wait, the more deeply entrenched in mainstream politics this becomes,” he said. Facebook has been reluctant to appear in any way biased against Republicans, and if (or when) QAnon reaches Congress, it will be even more politically difficult for Facebook to take a stand.

In May, Republican voters in Oregon nominated a QAnon believer to run for the US Senate in November. Another QAnon supporter, Marjorie Taylor Greene, is likely to be elected to the House of Representatives after she came first in a Republican primary in a conservative Georgia district on 11 June.

“In some ways, the second that Trump officially acknowledges QAnon is the second it becomes a partisan political issue that Facebook may not be able to take action against,” said Friedberg. “We’re watching a normalization process of these conspiracies, and I think the beast that is Facebook was really the answer to this all along.”

Indeed, Trump himself has repeatedly retweeted QAnon accounts on Twitter, which believers take as confirmation of their alternate reality. And on 20 June, just before Trump’s campaign rally in Tulsa, Oklahoma, Trump’s adult son Eric posted a QAnon meme on his Instagram account. Eric Trump deleted the image relatively quickly, but not before screenshots spread across the Facebook Q-sphere.

“So Eric Trump posted a pic with a ‘Q’ in the imagery,” an administrator of one of the larger QAnon groups wrote. “The pic has been taken down but the message was received!”

Julia Carrie Wong is a technology reporter for Guardian US, based in San Francisco. Twitter @juliacarriew



This post originally appeared on The Guardian and was published June 25, 2020. This article is republished here with permission.
