Apr 15, 2021

What Do We Know? What Do We Not Know?

Michael D. Langone, PhD
ICSA E-Newsletter
Mar 29, 2021

Note: Footnotes may not appear on a phone. You may have to use a computer to view the footnote links. As a workaround, we have added endnotes manually on the last page.

Merriam-Webster defines conspiracy theory as “a theory that explains an event or set of circumstances as the result of a secret plot by usually powerful conspirators.” Though commonly associated with outlandish claims, conspiracy theories are sometimes true (e.g., Nixon’s Watergate scandal) and sometimes plausible, even if false.

During the past year there has been a blizzard of media reports about QAnon, in part because of its indirect association with Donald Trump. QAnon is often called a conspiracy theory, but in fact it “is a loosely connected system of conspiracy theories and unfounded beliefs spawned by Q, an anonymous poster on forums like 8chan (now 8kun) claiming to have high-end military clearance within the Trump administration.” According to some reports, a core QAnon belief, among the many bizarre claims made by people in the QAnon network, is that Donald Trump is saving the world from a Satanic cult of pedophiles and cannibals. There is debate about whether Q is one person or whether several people have functioned as Q.

The media storm has affected the public’s awareness of QAnon. According to Pew, a Feb. 18–March 2 survey found that

about a quarter (23%) of U.S. adults said they had heard ‘a lot’ or ‘a little’ about QAnon. By September, that number had increased to 47%. At the same time, though, very few Americans have heard a lot about it: 9% as of September, up from 3% in February.

Pew also found that those who had heard of QAnon knew little about it.

I decided to research the QAnon phenomenon and write this paper for two reasons: (a) Most media reports are limited in what they can teach us, in part because of word limits on the articles, and (b) compelling anecdotal accounts of harms resemble what we see among cult victims. Candidly, I did not realize what I was getting into when I began this research. The more I study the phenomenon, the less I seem to know. For this reason, in this essay I will propose hypotheses to test and ask questions, rather than draw firm conclusions.

How Extensive is the QAnon Phenomenon?

The answer to this question depends upon whether one conceptualizes the QAnon phenomenon narrowly (i.e., those who actively participate in the online network), or broadly (i.e., those who may be at least mildly favorable toward QAnon, regardless of their level of knowledge or activity). Let’s take the narrow look first.

Bellingcat researchers created a data set of Q’s posts between October 28, 2017, and September 16, 2020. The researchers list 4,952 “so-called ‘Q drops,’ the cryptic messages that are at the heart of the conspiracy theory.” That works out to an average of about five Q drops per day.
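
As a quick check on that average (a minimal sketch in Python; the drop count and dates are Bellingcat’s, as quoted above, and the arithmetic is mine):

  from datetime import date

  # Bellingcat's dataset spans Oct. 28, 2017 (first drop) to Sept. 16, 2020.
  drops = 4952
  span_days = (date(2020, 9, 16) - date(2017, 10, 28)).days  # 1,054 days

  print(f"{drops / span_days:.1f} drops per day")  # ~4.7, i.e., about five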

When Melanie Smith of Graphika

first mapped the network of QAnon supporters in June 2018, it was the most dense conspiratorial network Graphika had ever studied. This means that accounts engaged in QAnon theories at the time had an astounding rate of mutual followership and represented an extremely tight-knit online community. The likelihood with a community this dense is that accounts are exposed to, and engage with, very similar content to each other. Despite its significant growth and undeniable ‘mainstreaming’ over the past two years, QAnon continues to be exactly such a community.

Graphika’s network maps were

deliberately reduced to capture the most highly connected accounts and are not intended to represent the entirety of QAnon supporters on Twitter. These 13.8k accounts . . . alone posted more than 41 million tweets in 30 days between January and February—between July and August this rose to an estimated 62.5 million.

The Bellingcat and Graphika data suggest that at least several tens of thousands of people have actively engaged with the online QAnon network, with members of the core group of nearly 14,000 accounts identified by Graphika averaging about 7,300 tweets per account over the two 30-day periods studied (roughly 120 tweets per day per account). Given this level of activity, one might hypothesize that core QAnon followers are obsessed with the network, especially if most of them also spend eight hours a day at work. I have not found data that reveal the level of tweeting of followers outside the core group, though it seems reasonable to hypothesize that tweet volume will range from many to few within less devoted subgroups.
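
A back-of-the-envelope computation with Graphika’s quoted figures shows where estimates of this order come from (a sketch; combining the two 30-day windows into a single 60-day divisor is my assumption about how such an average would be derived, and the result lands near, though not exactly on, the figures above):

  # Graphika's figures, as quoted above: ~13.8k core accounts, ~41 million
  # tweets in a 30-day Jan.-Feb. window, ~62.5 million in July-Aug.
  accounts = 13_800
  total_tweets = 41_000_000 + 62_500_000  # the two 30-day windows combined

  per_account = total_tweets / accounts  # ~7,500 tweets per account
  per_day = per_account / 60             # ~125 tweets per account per day
  print(round(per_account), round(per_day))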

One of Graphika’s interesting findings is that this network appears to have become increasingly distinct, including from the broader Trump community of supporters:

. . . analysis of these networks on Twitter demonstrates the QAnon community becoming increasingly autonomous over time. In the June 2018 map (below left), the saturation of Trump supporters within a highly interconnected map largely collapses any clear distinction between the two groups. Trump support accounts here are shown in green, and QAnon supporter accounts in yellow. Yet, in a February 2020 update to this map (below right), the Trump support group (again in green) has shifted visibly further from the explicitly conspiratorial QAnon accounts (shown in red). [Go to the Graphika full report to see the maps.]

Let’s now take the broad look at the extent of the QAnon phenomenon.

In October 2020, Brian Schaffner of Tufts University published a report on a survey of 4,057 American adults. Schaffner’s is one of many surveys, all of which have their limitations. Schaffner’s survey is valuable because he asked participants about their knowledge of QAnon, including their belief in eight conspiracy theories, four of which were associated with QAnon. Respondents’ knowledge of QAnon was low. Schaffner describes some noteworthy findings:

  • Conspiracy belief is still fairly widespread; 41% of Americans had heard about and believed in at least one of the eight conspiracy theories we asked about. About one in five Americans recognized and believed in at least one of the four conspiracy claims that originated from QAnon.

  • After accounting for the fact that most Americans have not heard of QAnon, only 7% have a favorable view of QAnon, and a similar percentage say they can trust QAnon to provide accurate information at least most of the time.

  • Views towards QAnon should not be taken as synonymous with conspiracy belief. The average respondent who viewed QAnon favorably had heard less than half of the four QAnon conspiracies we asked about and, on average, believed only one of the four. Thus, QAnon supporters do not even know about, much less believe, all of the QAnon conspiracies (emphasis added).

  • Similarly, conspiracy belief is not limited to QAnon supporters. In fact, 16% of those who did not rate QAnon favorably recognized and believed at least one of QAnon’s conspiracy claims.

Schaffner’s findings suggest two things. First, most people identified as “friendly” toward QAnon do not have much knowledge about the information circulating within the online community and, therefore, should not be equated with the core group identified by Graphika. Second, Americans tend to be rather credulous about unusual beliefs, at least when they are asked about them in surveys. The four QAnon conspiracies Schaffner asked about were the following (the percentage of all respondents believing each conspiracy appears in parentheses):

  1. A global network tortures and sexually abuses children in Satanic rituals. (22%)

  2. Trump is secretly preparing a mass arrest of government officials and celebrities. (18%)

  3. Celebrities harvest adrenochrome from children’s bodies. (12%) 

  4. Mueller was actually investigating a child sex-trafficking network. (15%)

The non-QAnon conspiracies, with the percentage believing each, were as follows:

  1. The Democratic primary was rigged to keep Bernie Sanders from running. (35%)

  2. The government is trying to cover up the link between vaccines and autism. (22%)

  3. Vaccinations with tracking chips will later be activated by 5G cellular networks. (21%)

  4. The coronavirus is a hoax. (15%)

Gallup surveys of belief in paranormal phenomena also reflect Americans’ openness to unusual beliefs. About 73% of Americans affirm at least one paranormal belief, including ESP (41%), haunted houses (35%), ghosts (32%), telepathy (31%), clairvoyance (26%), astrology (25%), mental communication with the dead (21%), witches (21%), reincarnation (20%), and channeling (9%).

Few of the paranormal and conspiracy beliefs listed can claim much, if any, support from scientific evidence, and some (e.g., astrology) are clearly contrary to science.

Why Do So Many People Accept Dubious Beliefs?

Deficient critical-thinking skills. Those of us in the cultic-studies field have long advocated teaching critical-thinking skills to strengthen resistance to the persuasion tactics cults use. Unfortunately, critical-thinking competency in the general population is not high. The Reboot Foundation found that parents, who strongly support teaching children critical thinking, tend not to realize that they are themselves deficient. A Reboot survey found that

47 percent of them don’t typically plan where they will obtain information while doing research. And around 27 percent use only one source of information while making a decision. . . . one-third of respondents consider Wikipedia, a crowd-sourced website, to be the equivalent of a thoroughly vetted encyclopedia . . . people believe the accuracy of more than a third of what they read on Twitter and Facebook. . . . less than a quarter of respondents actually seek out views that challenge their own . . . 24 percent of respondents say they avoid people with opposing views.

Reboot’s executive summary of the research concludes:

In other words, many people claim they solicit the views of others. But, in practice, they don’t do nearly enough to “stress test” their opinions, despite the wealth of evidence showing that engaging in opposing views is crucial to richer forms of critical thinking.

Critical thinking is undermined by cognitive errors or distortions, which PsychCentral defines as “ways that our mind convinces us of something that isn’t really true.” Common cognitive distortions that undermine rational thinking include but are not limited to filtering, polarized (black-and-white) thinking, overgeneralization, jumping to conclusions, catastrophizing, blaming, emotional reasoning (if it feels true, it is true), and the need to always be right.

Thus, it seems reasonable to hypothesize that people who are intellectually unprepared to challenge or even question information served to them on the Internet may be more likely to be drawn into conspiratorial communities.

Loss of faith in sources of authority. Much of what we think we know we believe because we attribute credibility to sources of authority that impart information. The Centers for Disease Control and Prevention (CDC), for example, says social distancing reduces the probability of contracting Covid-19, and we comply because we deem the CDC to be a reliable source of information.

Traditionally, prestigious news media (e.g., The New York Times) have been viewed as reliable sources of information for the public. In recent decades, however, trust in the mainstream media has declined, especially among Republicans, who, it seems reasonable to hypothesize, may constitute the bulk of QAnon activists and supporters. Even before Donald Trump became president, only 7% of the press identified as Republican, compared to 25.7% in 1971. Moreover, trust in the media, according to Gallup, declined from 72% in 1972 to 32% in 2016, and among Republicans that trust had declined to 14% by 2016.

Republican distrust of the media is not without some foundation. An insightful article from Politico asked, “How did big media miss the Donald Trump swell?” The authors acknowledge that the political homogeneity of the journalism profession led to groupthink and bias. However, they suggest that groupthink is a symptom, not a cause. The dwindling number of newspapers around the country, and especially the rapid growth of Internet publishers, have concentrated journalists on the coasts. By 2016, 52% of journalists (75% of Internet publishers) lived in counties that Clinton won by 30% or more, while an additional 21% of journalists (15% of Internet publishers) lived in counties that Clinton won by less than 30%. The authors conclude:

Resist—if you can—the conservative reflex to absorb this data and conclude that the media deliberately twists the news in favor of Democrats. Instead, take it the way a social scientist would take it: The people who report, edit, produce and publish news can’t help being affected—deeply affected—by the environment around them. 

If overall trust in the media declined from 72% to 32%, where do those who no longer trust the media turn for authoritative information?

A Pew survey suggests that many still turn to the media, but the media that they trust depends upon party affiliation. Certain media outlets are trusted by Democrats and distrusted by Republicans, while for other outlets the reverse holds. The media, then, have come to reflect and, perhaps unwittingly, reinforce the partisan polarization that has grown over recent decades. Nevertheless, distrust of specific media still remains high across the partisan divide. Pew says, “And in what epitomizes this era of polarized news, none of the 30 sources is trusted by more than 50% of all U.S. adults.” Compare that finding to the polls beginning in 1972 that named CBS News anchor Walter Cronkite the “most trusted man in America.”

Data from Pew surveys of trust in government are further evidence of growing public cynicism. In 1958, 75% “of Americans trusted the federal government to do the right thing almost always or most of the time.” By 2019, that number had shrunk to 17%. 

Given such widespread distrust of government and the media, it is not surprising that many people will turn elsewhere for authoritative information. If only 10% of the population did this, the number would be about 30 million Americans. This would be a sizable pool of media-government cynics to which conspiracists could market themselves.
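
The arithmetic behind that figure is straightforward (a sketch; the roughly 330 million U.S. population estimate circa 2020 is my assumption):

  us_population = 330_000_000   # approximate U.S. population circa 2020
  cynic_share = 0.10            # the hypothetical 10% who turn elsewhere

  print(f"{us_population * cynic_share:,.0f}")  # 33,000,000, i.e., roughly 30 million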

Unfortunately, because so many people have deficient critical-thinking skills, the sources to which they turn may be less trustworthy than the sources from which they turn away. This is especially true for those who turn to social media. Washington State University researchers have found a relationship between reliance on social media and proclivity to believe in Covid conspiracy theories. Recall also the Reboot Foundation’s finding that “people believe the accuracy of more than a third of what they read on Twitter and Facebook.” Keep in mind that this is a general finding. Further investigation might reveal that a small yet sizable subset of the population may trust much, most, or nearly all that they read on social media.

Normal human needs. We all want meaning or purpose in our lives. We all want to feel at least a little bit special. And we all want to feel confidence in our view of the world. These are normal human needs. When, however, our customary way of operating in the world isn’t working well, we lose confidence, we may feel inadequate instead of special, and we may doubt things that we once believed. Such psychological disequilibrium may make one vulnerable to a cultic sales pitch and to a Web-based “loosely connected system of conspiracy theories” (i.e., QAnon). 

Psychiatrist Joe Pierre summarizes the limited empirical research on the needs that conspiracy theories may at least partly satisfy:

Some of the psychological quirks that are thought to drive belief in conspiracy theories include need for uniqueness and needs for certainty, closure, and control that are especially salient during times of crisis. Conspiracy theories offer answers to questions about events when explanations are lacking. While those answers consist of dark narratives involving bad actors and secret plots, conspiracy theories capture our attention, offer a kind of reassurance that things happen for a reason, and can make believers feel special that they’re privy to secrets to which the rest of us “sheeple” are blind.

Thus, it appears that there may be predisposing vulnerabilities among QAnon networkers, but scientific research is lacking. Research should be conducted to compare common human needs, attitudes toward authority sources, critical-thinking skills, and openness to unusual beliefs among QAnon networkers, QAnon “friends” (i.e., those with favorable views of QAnon who are not very active on the QAnon message boards), and a general-public control group. Until such research is conducted, all we can do is make speculative extrapolations from areas where we do have at least a modicum of scientific knowledge.

The QAnon Rabbit Hole: Is It a Cult?

The QAnon rabbit hole refers to those people who have so bought into the QAnon system of conspiracy theories that their lives come to revolve around QAnon. Pierre calls these people true believers, and he distinguishes them from fence-sitters, who have not totally bought into the system. For Graphika, the rabbit hole might refer to the core believers making roughly 120 posts per day and perhaps those whose participation is somewhat less intense than that of this core group.

The true-believer group’s dedication to QAnon accounts for suggestions that QAnon is an emerging religion or a cult. The true believers appear to have undergone a conversion experience, but a conversion that is distinct from what we typically observe in cults or noncultic religions. (Keep in mind that this statement is based on anecdotal accounts, not substantial empirical research.)

Traditionally, a religious conversion experience is a fundamental change within the person that occurs in an ethical context; the change is inner generated. In the unethical context of a highly manipulative cult, the conversion often is, at least to a large degree, outer generated or engineered. These manipulated conversions rely heavily on interpersonal influences (e.g., love bombing).

QAnon conversions appear to be different, for they occur in cyberspace and do not rely on interpersonal influences, at least not until QAnon converts meet other true believers in the flesh. (So far as I have been able to determine, we have no idea how often this happens and what percentage of QAnon networkers meet personally—another area calling for research.)

The metaphor of a rabbit hole is useful to understand the changes core QAnon followers may undergo, though the metaphor needs to be modified. Imagine multiple entry holes, each of which divides into two holes, those holes further dividing, and so on, until there are hundreds of branching holes. But each of these hole pathways ultimately empties into one large cavern, the “Cavern of the QAnon True Believer.”

I suggest the branching holes because of an astute observation Walter Kirn made in Harper’s. Kirn, a novelist, followed Q’s posts from late 2017 and viewed them as an online novel. Initially, Q’s plotline sounded like Cold War-era, right-wing conspiracies. Then Kirn recognized Q’s innovation, an innovation that may have opened another niche for unscrupulous manipulators to exploit in the future:

As the posts piled up and Q’s plot thickened, his writing style changed. It went from discursive to interrogative, from concise and direct to gnomic and suggestive. This was the breakthrough, the hook, the innovation, and what convinced me Q was a master, not just a prankster or a kook. He’d discovered a principle of online storytelling that had eluded me all those years ago but now seemed obvious: The audience for Internet narratives doesn’t want to read, it wants to write. It doesn’t want answers provided, it wants to search for them. It doesn’t want to sit and be amused, it wants to be sent on a mission. It wants to do. [emphasis added] ... Q turned his readers into spies and soldiers by issuing coded orders and predictions that required great effort to interpret and tended to remain ambiguous even after lengthy contemplation.

Chasing after and responding to Q’s crumbs enables QAnon contributors to become “digital soldiers,” not mere observers, and to share their thoughts with the Q world. Kirn says: “By leaving more blanks in his stories than he fills in, he activates the portion of the mind that sees faces in clouds and hears melodies in white noise.” This is why there is a profusion of conspiracy theories within the QAnon network and why Rolling Stone’s Dickson is correct in calling QAnon “a loosely connected system of conspiracy theories” rather than a conspiracy theory.

The branches of my proposed rabbit-hole modification represent the QAnon followers responding individually to Q’s vague crumbs and to posts from other QAnon networkers. By participating in Q’s story of good battling evil, followers become part of a quasi-real video game in which their idiosyncratic actions may influence world events—surely a heady experience for somebody who may have been angry, disappointed, frightened, or depressed. 

Is participation in the QAnon game a prelude to or a vicarious substitute for real-life action? Probably both answers will apply to some people, while only one or maybe neither will apply to others. From a scientific standpoint, we have no idea how the percentages will break down for the relationships between online and physical activity—another area calling for research.

QAnon, then, isn’t like the prototypical cult in which a leader’s utterances and dictates are passed down a hierarchy to the members, who are expected to listen attentively and obey. In these cases, power lies with the cult leader, and members often feel powerless. Q, in contrast, invites participation. In essence, he encourages his followers to share the power, to let their imaginations run with whatever paranoid thought comes to mind and not be restrained by normal rationality. The mentality might be something like this: If it feels true, it must be true. If it seems plausible, it must be a fact. A possible connection is evidence. The paranoid minds of the network obsess as they construct increasingly complex narratives around the core and derivative assumptions of the delusional system, ignoring that which contradicts or undermines the system’s themes, and employing the complete armory of cognitive errors to buttress that which may support the system.

QAnon, then, may be construed as a safe space for paranoid speculation, a mindspace in which participants are marinated in a sort of virtual-reality fantasy world in which “seeing connections” provides not only the thrill of a personal ah-ha experience, but also an opportunity to be rewarded by others via likes and shares. Like their counterparts in other areas of social media, QAnon digital soldiers may gain status, at least in their own minds, by acquiring a following. The less that following demands rational thinking to give its rewards, the more the participants can indulge in unmoored speculation.

Once the collective speculation crosses a threshold of bizarreness, the network must close in on itself—it becomes an echo chamber—to avoid the scoffing and criticism that is sure to come from outsiders. As with some peculiarly irrational cult ideologies, cognitive isolation is a survival necessity. Scrutiny dismantles nonsense, so nonsense must avoid scrutiny by building walls around itself. Graphika’s research on the increasing structural autonomy of the QAnon core group appears to support this idea.

The picture I am painting is further complicated by the fact that within the QAnon safe space are pockets of entrepreneurship—marketplaces where Q-compatriots can make money pushing videos, selling T-shirts, or whatever. 

The process that brings people into the QAnon network seems to resemble cult recruitment in some respects, except that the recruiters may be Internet algorithms rather than people, if the following from Wired is a correct generalization:

There are some common pathways reported by people who fall into, and then leave these communities. They usually report that their initial exposure started with a question, and that a search engine took them to content that they found compelling. They engaged with the content and then found more. They joined a few groups, and soon a recommendation engine sent them others. They alienated old friends but made new ones in the groups, chatted regularly about their research, built communities, and eventually recruited other people.

Tech companies’ economically driven goal of increasing clicks may turn regions of the Web into confirmation-bias traps that, psychologically speaking, may have “exit costs” as high as those we see in cults. This is an intriguing notion that awaits empirical investigation. The tech companies may try to reduce the harm to which they may have contributed by censoring certain sites, but adherents will tend to migrate to other sites or platforms. The justifiability of online censorship is another topic outside the focus of this paper.

The modified cult model is not the only way to look at the QAnon conversion process. One might also construe immersion in the QAnon echo chamber as akin to video-game addiction, the characteristics of which are

  • Thinking about gaming all or a lot of the time

  • Feeling bad when you can’t play

  • Needing to spend more and more time playing to feel good

  • Not being able to quit or even play less

  • Not wanting to do other things that you used to like

  • Having problems at work, school, or home because of your gaming

  • Playing despite these problems

  • Lying to people close to you about how much time you spend playing

  • Using gaming to ease bad moods and feelings

Social-media addiction has been studied as a behavioral and neurological phenomenon:

Social media addiction is a behavioral addiction that is defined by being overly concerned about social media, driven by an uncontrollable urge to log on to or use social media, and devoting so much time and effort to social media that it impairs other important life areas.

The Addiction Center lists six questions indicative of possible social media addiction:

  • Does he/she spend a lot of time thinking about social media or planning to use social media?

  • Does he/she feel urges to use social media more and more?

  • Does he/she use social media to forget about personal problems?

  • Does he/she often try to reduce use of social media without success?

  • Does he/she become restless or troubled if unable to use social media?

  • Does he/she use social media so much that it has had a negative impact on his/her job or studies?

Future research may find that video-game addiction or social-media addiction is a more useful explanation for some apparent QAnon conversions, while cultlike cyber entrapment explains more for others. Of course, another explanation, or an integration of two or more of these models, may prove superior. My point in this digression is to emphasize that the same set of behaviors can be looked at from different theoretical perspectives, so one should not be exclusively wedded to a cult model for QAnon. There is simply too much that we do not know.

Helping QAnon Casualties

Press accounts and videos include the stories of a variety of people distressed because of QAnon’s impact on them or a loved one. These stories resemble what we hear from former cult members and their families and friends. A previously “normal” person begins to change (sometimes the change is sudden). This change appears to be related to, if not caused by, events in a group with which the person has become affiliated. At some point, the change may become profound—an altered identity, or a “conversion” that alarms loved ones. Among the changes that generate alarm are (a) turning away from previously valued activities, goals, friends, and family; (b) spending inordinate amounts of time in group activities; (c) troubling personality changes (e.g., out-of-character belligerence); and (d) antagonistic responses to anyone who questions the group, its leader(s), or its teachings.

The stories of former QAnon followers and families show changes similar to those we observe in cult conversions. That is why many have called QAnon a cult (I discussed earlier why I think this conclusion is only partly correct). And that is why cult therapists, who have experience treating people harmed by seemingly outer-generated changes in behavior, thinking, affect, or personality, have useful suggestions for helping former QAnon followers and families or friends concerned about a loved one’s involvement in QAnon.

William Goldberg probably speaks for many therapists in the mental health field when he makes the following observations:

...a direct assault on their “facts,” an approach we might use with other individuals, will not usually be helpful ... I try to bring the client’s unconscious doubts to consciousness. . . When I respectfully raise these questions, which, again, are the unconscious questions that I believe the client has but is repressing, I’m less interested in the answers they give than in the act, for a moment, of having them consider my “confusion.” And that is not different from what we do in therapy all the time, when we offer our clients the possibility of a different interpretation of the world than what they have used all their lives.

Steve Hassan and Steve Eichel also speak for many therapists and exit counselors when they emphasize the need to strengthen relationships and avoid confrontation when trying to help QAnon followers. Rachel Bernstein says that “the first barrier is trying to defuse what is often a charged environment, and turn it into a safe and open forum. But if that happens, the next step is to better understand what motivates that individual to be part of QAnon—which is crucial to bringing them back from it.”

Pierre says that fence-sitters mistrust traditional sources of information, are looking for answers, but haven’t yet lost their capacity for cognitive flexibility and open-mindedness. These people may be reached therapeutically. True believers, in contrast, cling to conspiracy theories with greater conviction as they form a new identity yoked to that of the online community, much as cult conversion can sometimes create a totalistic identity. Pierre adds: “When people’s beliefs become so enmeshed with their identities, giving them up can be viewed as an existential threat akin to death. Needless to say, that's a bad prognostic sign.”

Clinical work in a new area usually begins with case reports and then reports based on multiple cases. John Clark’s seminal paper, “Cults,” for example, describes the author’s examination of more than 60 former cult members. So far as I have been able to determine, nothing comparable exists with regard to QAnon. Articulating such a body of clinical experience is only an early step. That work should be followed by systematic empirical research, such as some of the research recommendations mentioned in this paper.

QAnon Is Not a Cult Theology

An examination of Q drops and QAnon network posts makes clear that the QAnon network is full of bizarre statements. Bellingcat says, “Whenever a Q drop appears, believers around the world eagerly try to interpret its hidden meaning, connecting them to real world events.”

I’ve read things that cult leaders wrote, and some were odd, to say the least. But I’ve never come across anything like Q’s communications revealed in Bellingcat’s dataset of Q statements. Bellingcat researchers split their database into three subsets over time and used a clustering algorithm to demonstrate changes in topic or focus over time (a sketch of the general technique follows this paragraph). Hence, Q’s communications are not gibberish. But their ambiguity—perhaps intentional ambiguity—motivates the QAnon network to decode the messages and explore their implications. The end product is not a philosophy or a theology, as many cult leaders claim to produce. Rather, the QAnon product appears to be a loosely connected system of conspiratorial speculations combined with the usual affirmations, comments, and criticisms one finds in other social media.
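
Bellingcat did not publish the code behind that analysis, so the following is only a minimal sketch of the general technique, assuming Python with scikit-learn (the function name and parameters are mine): vectorize the text of the drops, cluster each time-based subset, and inspect each cluster’s top terms.

  from sklearn.cluster import KMeans
  from sklearn.feature_extraction.text import TfidfVectorizer

  def top_terms_per_cluster(drops, n_clusters=5, n_terms=8):
      """Cluster a list of Q-drop strings; return each cluster's top terms."""
      vectorizer = TfidfVectorizer(stop_words="english", max_features=5000)
      X = vectorizer.fit_transform(drops)
      km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
      terms = vectorizer.get_feature_names_out()
      # Rank the terms by each cluster centroid's TF-IDF weights.
      return [
          [terms[i] for i in centroid.argsort()[::-1][:n_terms]]
          for centroid in km.cluster_centers_
      ]

Running this separately on early, middle, and late subsets of the drops and comparing the resulting term lists would reveal the kind of shift in topic or focus that Bellingcat reports.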

Q, then, is not the leader of a group in the way those terms have been used in the cultic-studies field. I submit that Q is a stimulus, and quite possibly a now-unnecessary stimulus, to an obsessive network of core QAnon followers who, according to Graphika’s data, make about 120 posts per day per person, many more than Q.

What Do We Know and Not Know?

We do know that active participants in the QAnon cybernetwork number at least in the tens of thousands and that, at least until the 2020 election, several million people had a favorable, if wildly uninformed, attitude toward QAnon.

We do not know how many people flirt with QAnon but do not advance beyond flirting.

We do not know what kinds of pathways (emphasis on the plural) QAnon true believers follow to get to the QAnon “cavern” or rabbit hole. We should refrain from assuming that there is a single causal pathway to the rabbit hole.

We do not know the degree to which predisposing factors, such as distrust of the media and government, may have affected QAnon core believers’ descent into the Q world. Nor do we know the degree to which deficient critical thinking or psychological factors may have affected entry into the rabbit hole.

We do not know whether outside actors, e.g., political or intelligence agency operatives or future cyber “entrepreneurs,” may be able to exploit “rabbit hole dynamics” to direct conspiracy networkers toward behaviors that advance monetary, political, or other agendas of the manipulators.

We do know that some QAnon networkers will tolerate discussion and reconsider their Q-involvement, while others will not.

We do not know what percentage of QAnon core believers may have been mentally unbalanced before they became involved with QAnon. 

We do not know what percentage of active QAnon networkers may become psychotic, exhibit other mental pathologies, or experience grave interpersonal dysfunction after entering the “cavern.” At the same time, we do not know what percentage of true believers may retain at least a modest level of functionality, despite the time spent in the QAnon virtual world. 

We do know that thousands of QAnon adherents have been harmed psychologically and/or in their relationships with loved ones.

We do know that some evidence points toward actual or potential violence among QAnon supporters.

We do not know whether the probability of violence within the QAnon network is greater than in other networks, most of which are smaller than QAnon. Nor do we know the degree to which sporadic violence may be a direct causal result of what occurs in the cybernetwork.

We do know that the QAnon network and other conspiracy networks evolve over time, though we do not fully understand what factors determine the pathway that evolution may follow.

We do not know what percentage of the community voluntarily leave the Q world, or why they leave.

We do not know whether QAnon will survive, much less thrive. If my hypothesis is correct that the QAnon network is a safe space for paranoid speculation, the QAnon community may be able to endure without Q and without Trump. Even if its numbers decreased by 50% or more, there would still be tens of thousands of Q-compatriots obsessively ruminating in the QAnon mindspace.

In summary, we know a little, but there is much more that we do not know. To say that humility and scientific research are needed is an understatement.


Note: On the following page are endnotes manually created as text. They should be visible on phones.

End Notes

  1. https://www.merriam-webster.com/dictionary/conspiracy%20theory

  2. Dickson, E. J. (2020, Sept. 23). Former QAnon Followers Explain What Drew Them In—And Got Them Out (para. 3). Rolling Stone.

  3. Itkowitz, C., Stanley-Becker, I., Rozsa, L., & Bade, R. (2020, Aug. 19). Trump praises baseless QAnon conspiracy theory, says he appreciates support of its followers. Washington Post.

  4. Staff. (2021, Jan. 17). Swiss text sleuths unpick mystery of QAnon origins. America Votes News.

  5. 5 facts about the QAnon conspiracy theories. (2020, Nov. 16). FactTank, Pew Research Center.

  6. Tian, E. (2021, Jan. 29). The QAnon Timeline: Four Years, 5,000 Drops and Countless Failed Prophecies. Bellingcat.com.

  7. Smith, M. (2020, Aug.). Interpreting social Qs: Implications of the evolution of QAnon. Graphika Special Report, Introduction, para. 3.

  8. Smith, M. (2020, Aug.). Concern 1, para. 2.

  9. Smith, M. (2020, Aug.). Concern 1, para. 1.

  10. Schaffner, B. (2020, Oct. 5). QAnon and Conspiracy Beliefs, Summary (para. 4–7). Report supported by the Institute for Strategic Dialogue and funded by Luminate.

  11. Shanahan, J. (2021, Mar. 5). Support for QAnon is hard to measure—and polls may overestimate it. The Conversation.

  12. Schaffner, B. (2020, Oct. 5).

  13. Moore, D. (2005, June 16). Three in Four Americans Believe in Paranormal. Gallup News Service.

  14. Stierwalt, S. (2020, June 25). Is Astrology Real? Here’s What Science Says. Scientific American.

  15. Reboot Foundation. (2018, Nov.). The State of Critical Thinking 2018, Executive Summary (excerpts, para. 16–20).

  16. Reboot Foundation. (2018, Nov.), para. 21.

  17. PsychCentral. (n.d.). 15 Common Cognitive Distortions (para. 1).

  18. Gold, H. (2014, May 6). Survey: 7% of reporters identify as Republican. Politico.

  19. Brenan, M. (2019, Sept. 26). Americans’ Trust in Mass Media Edges Down to 41%. Gallup Organization.

  20. Shafer, J., & Doherty, T. (2017, May/June). The Media Bubble Is Worse Than You Think. Politico.

  21. Shafer, J., & Doherty, T. (2017, May/June).

  22. Jurkowitz, M., Mitchell, A., Shearer, E., & Walker, M. (2020, Jan. 24). U.S. Media Polarization and the 2020 Election: A Nation Divided (para. 10). Pew Research Center.

  23. Ellis, B. (2017, Mar. 6). Cronkite Voted Most Trusted Man in America—And That’s the Way it Is. Building Fearless Brands/Friday’s Fearless Brand blog post.

  24. Public Trust in Government: 1958–2019. (2019, April 11). Pew Research Center.

  25. Washington State University. (2020, Dec. 14). Social media use increases belief in COVID-19 misinformation. PhysOrg.

  26. Reboot Foundation. (2018, Nov.), para. 18.

  27. Dickson, E. J. (2020, Sept. 23). Former QAnon Followers Explain What Drew Them In—And Got Them Out (para. 3). Rolling Stone.

  28. Pierre, J. (2020, Aug. 12). The Psychological Needs That QAnon Feeds (para. 5). What to Do When Someone You Love Becomes Obsessed With QAnon, Part 1, Psychology Today.

  29. Moss, C. (2021, Jan. 21). How a New Religion Could Rise From the Ashes of QAnon. The Daily Beast.

  30. See James, W. (1902/1961). The Varieties of Religious Experience. Macmillan.

  31. Langone, M. D. (2002). Cults, Conversion, Science, and Harm. Cultic Studies Review, 1(2), 178–186.

  32. Kirn, W. (2018, June). The Wizard of Q (para. 9, 10). Harper’s Magazine.

  33. Kirn, W. (2018, June), para. 10.

  34. Smith, M. (2020, Aug.). Concern 1, para. 1.

  35. DiResta, R. (2018, Nov. 13). Online Conspiracy Groups Are a Lot Like Cults (para. 6). Wired.

  36. Zablocki, B. (1998). Exit Cost Analysis: A New Approach to the Scientific Study of Brainwashing. Nova Religio: The Journal of Alternative and Emergent Religions, 1(2), 216–249.

  37. Dickson, E. J. (2020, Sept. 23).

  38. Ratini, M. (2019, Mar. 19). Is Video Game Addiction Real? (para. 6). WebMD.

  39. What Is Social Media Addiction? (n.d.). AddictionCenter.com

  40. What Is Social Media Addiction? (n.d.), para. 9.

  41. QAnon videos. Vice.com.

  42. Watt, C. S. (2020, Sept. 23). The QAnon orphans: people who have lost loved ones to conspiracy. The Guardian.

  43. Maverick, T. K. (2020, Aug. 16). I’m dating a conspiracy theorist. But it feels like I’m the one going crazy. The Washington Post.

  44. Goldberg, W. (2021, in press). Conspiracy Theories: Some Observations. ICSA Today.

  45. Hassan, S. (2021, Jan. 15). How to Help People Involved in QAnon.

  46. Schulson, M. (2021, Feb. 24). Can Cult Studies Offer Help With QAnon? The Science Is Thin. Undark.

  47. Karlis, N. (2021, Mar. 14). Cult recovery experts explain how to "deprogram" QAnon adherents. Salon.

  48. Dubrow-Marshall, R. (2010). The influence continuum—the good, the dubious, and the harmful—Evidence and implications for policy and practice in the 21st century. International Journal of Cultic Studies, 1, 1-12.

  49. Pierre, J. (2020, Aug. 12), How Far Down the QAnon Rabbit Hole Did Your Loved One Fall? (para. 17). What to Do When Someone You Love Becomes Obsessed With QAnon, Part 2, Psychology Today.

  50. Clark, J. (1979). Cults. Journal of the American Medical Association, 242, 279–281.

  51. Tian, E. (2021, Jan. 29). The QAnon Timeline: Four Years, 5,000 Drops and Countless Failed Prophecies (para 4). Bellingcat.com.

  52. See https://docs.google.com/spreadsheets/d/11MhW-P-9el9dg_cTjutwtIiQGMfL8jfH3SOaLZSBV2g/edit#gid=1596710080

  53. See forums of families and survivors: https://www.reddit.com/r/QAnonCasualties/ https://www.reddit.com/r/ReQovery/

  54. Winter, J. (2019, Aug. 1). Exclusive: FBI document warns conspiracy theories are a new domestic terrorism threat. Yahoo News.



