The Virality of Conspiracy Theories: Examining the Influence of Memes and Instagram Reels
The proliferation of the Internet and the advent of social media have revolutionized the ways in which information is disseminated, accessed, and consumed. While these technological advancements have empowered individuals by democratizing access to information, they have also produced unintended consequences, notably serving as conduits for the propagation of conspiracy theories (Enders et al. 795). With the increasing integration of social media into our daily lives, understanding its role in shaping public opinion, influencing behaviors, and nurturing particular belief systems has become a subject of pressing importance. The impact of this medium extends beyond mere individual influence, reaching into the social and collective psyche. Within this intricate landscape, the current study aims to explore the multifaceted relationships between social media and conspiracy theories, particularly focusing on the role of Instagram as a medium for the dissemination and acceptance of such theories.
Utilizing Framing Theory as a foundational conceptual framework, this study will investigate how the arrangement and presentation of information within social media platforms can influence public perception and belief. Developed initially by Erving Goffman and expanded further in the context of media studies by scholars like Robert Entman, Framing Theory offers a powerful lens through which to analyze how media platforms, including social media, can serve to frame or re-frame issues, thereby influencing public discourse and perception.
To delineate the mechanisms through which conspiracy theories gain traction on social media, this study will examine three critical components that contribute to the phenomenon: ease of access to information, the veil of anonymity, and the platform's community-building capabilities. The Internet's architecture inherently allows for easy access to vast reservoirs of information, but it is the interactive and participatory nature of social media that has magnified the speed and reach of conspiracy theories. Coupled with the anonymity that these platforms afford, individuals find not only a stage to voice opinions without immediate ramifications but also communities of like-minded individuals who validate and perpetuate these often baseless beliefs.
Focusing on Instagram as a case study, the research will delve into the platform's unique features—such as the Explore page and the use of hashtags—that serve to increase user exposure and engagement. Instagram, with its visually centric interface, stands as a fertile ground for examining how images, memes, and reels contribute to the conspiracy theory ecosystem. The study will dissect how the simplicity, emotional appeal, and ease of distribution of memes make them effective tools for framing complex conspiracy narratives. Concurrently, Instagram reels, with their brevity and visual impact, will be scrutinized as channels that reinforce and validate these framed narratives.
Lastly, the study aims to explore the social interactions—comments, likes, and shares—that take place on Instagram and contribute significantly to the social proof and ensuing acceptance of conspiracy theories. These interactions serve as both symptomatic and constitutive elements of a larger social discourse that grants legitimacy to conspiracy theories.
Theoretical Framework
Framing Theory is an analytical framework used primarily in communication studies, political science, and sociology to understand how media and other discourses shape the organization and interpretation of information, events, or issues. Originating from the works of Erving Goffman in his seminal book, Frame Analysis: An Essay on the Organization of Experience (1974), this theory posits that the way an issue is presented or "framed" significantly influences public perception and understanding.
In the context of media studies, framing refers to the process by which media outlets choose specific aspects of a subject to focus on while excluding others. By doing so, the media engages in a form of agenda-setting, highlighting what they deem to be the most relevant or critical components of the subject matter. The framing mechanism thereby guides the audience's attention toward certain interpretations, evaluations, and decisions about the topic at hand (Entman 51).
Framing Theory offers a valuable conceptual lens for understanding the mechanics behind the virality of conspiracy theories in the digital age. Digital media like memes and Instagram reels have made the application of Framing Theory increasingly relevant. They encapsulate and present information in ways that engage emotional responses, guide cognitive processes and impact social interactions. The framing carried out by these digital products could thus play a significant role in the propagation and acceptance of conspiracy theories.
By incorporating Framing Theory into my theoretical framework, I will dissect the strategic ways in which information is packaged, presented, and interpreted, thereby providing a comprehensive understanding of how and why conspiracy theories gain traction in the digital age.
Social Media as a Hotbed for Conspiracy Theories
Ease of Access
The advent of smartphones, coupled with an ever-expanding network infrastructure, has rendered geographical and temporal constraints virtually obsolete. While this ease of access undoubtedly offers myriad advantages, such as the democratization of information and the enhancement of global connectivity, it also engenders specific challenges, particularly in the realm of conspiracy theories.
The pervasiveness of internet connectivity has engendered a landscape where information—be it factual or fallacious—is readily accessible at the fingertips of a global audience. In previous eras, the spread of conspiracy theories was somewhat limited by barriers such as publication costs, lack of credibility, and the logistical challenges of reaching a wide audience. However, the internet has essentially dismantled these barriers, facilitating the dissemination of a plethora of ideas without rigorous scrutiny or gatekeeping.
In this environment, the principles of supply and demand manifest differently than they do in traditional media. Given that the 'supply' of conspiracy theories can be endlessly generated and readily disseminated online, the 'demand' for such theories can be spontaneously created and perpetually sustained (Ziegele et al. 115). A user interested in a particular narrative can easily locate a multitude of sources corroborating that narrative, irrespective of its factual accuracy. Furthermore, algorithmic mechanisms often create 'filter bubbles,' wherein users are continually exposed to content that aligns with their preexisting beliefs and preferences. Such technological nuances contribute to the entrenchment of conspiracy theories within the public discourse.
Moreover, the absence of a credible authority in the digital landscape often results in a dilution of informational quality. When anyone can publish content, the locus of authority becomes fragmented, thereby making it increasingly challenging for users to discern credible information from conspiracy theories. Consequently, the ease of access becomes a facilitator for the uncritical consumption of information, heightening the risk of public belief in unverified or misleading narratives.
In sum, the ease of access to information in the digital age functions as a double-edged sword. While it empowers individuals by making an unprecedented volume of information available, it also exposes society to the risks associated with the unfiltered proliferation of conspiracy theories. The implications of this phenomenon are far-reaching, affecting not just individual belief systems but also the collective social fabric, thus warranting rigorous academic scrutiny and societal discourse.
Anonymity
The architecture of the Internet inherently provides a layer of anonymity that stands as both an asset and a liability in the public sphere of information dissemination. While anonymity can serve noble purposes such as protection of privacy, freedom of speech, and avoidance of political persecution, it simultaneously creates a fertile ground for the propagation of conspiracy theories and misinformation.
In traditional settings, the dissemination of information is often subject to a series of gatekeeping processes. Journalists, editors, and publishers have professional and ethical guidelines that they are bound to follow, which minimizes the spread of unverified or misleading information to the public. However, the Internet bypasses these institutional filters, as the curtain of anonymity permits individuals to disseminate content without accountability.
Anonymity, especially in online platforms, empowers individuals to vocalize thoughts or share information that they might otherwise be hesitant to present publicly. In the context of conspiracy theories, this means that individuals can postulate theories, disseminate misinformation, and speculate without concrete evidence, free from the risk of public backlash or legal repercussions. The decentralized nature of the Internet means that tracking the origin of such theories or information becomes an immensely complex task. Consequently, this creates an environment where misinformation is not only generated but is also readily propagated and perpetuated.
Furthermore, the anonymous character of the Internet has psychological implications known as the online disinhibition effect. Hollenbaugh and Everett explain that “the anonymity afforded by the Internet has been found to result in decreasing inhibitions and increasing self-disclosures, a condition known as the online disinhibition effect” (284). This phenomenon explains the willingness of individuals to behave more impulsively and take greater risks online than they would in a face-to-face interaction. Thus, anonymity does not merely offer a shield for the dissemination of conspiracy theories but actively encourages behaviors that would be considered socially or ethically unacceptable in other contexts.
In addition to the individual-level effects, collective dynamics also facilitate the snowballing of conspiracy theories. Online platforms often feature forums or groups where like-minded individuals converge in “echo chambers,” examined in the next section. Because members within these groups are often anonymous, the collective identity of these chambers takes precedence, further solidifying extreme viewpoints and making them resistant to contradictory information or logical scrutiny.
The provision of anonymity by the Internet acts as a double-edged sword in the realm of public information. While it grants the freedom for underrepresented voices to be heard, it equally provides a perilous avenue for the dissemination and acceptance of conspiracy theories. The role of anonymity in the propagation of misinformation therefore calls for concerted efforts in both technological design and policy formulation, aimed at mitigating these negative effects while preserving the benefits that anonymity can offer.
Community-building Capabilities
In the realm of digital communication, the capability of social media platforms to foster communities around shared interests, ideologies, or beliefs is unparalleled (Sweet et al. 2). While the facilitation of such communities undeniably has merits—ranging from the promotion of marginalized voices to the formation of support networks—the community-building capabilities of social media platforms also have a darker side, particularly when it comes to the perpetuation of conspiracy theories.
Social media platforms are ingeniously designed to connect individuals based on shared interests and beliefs. Algorithms analyze user data to suggest groups, forums, or pages that align with a user's existing tendencies, thereby enabling the formation of specialized communities (Sweet et al. 3). In the academic literature, these virtual congregations are often described as "echo chambers," wherein members are exposed predominantly to information that aligns with their pre-existing beliefs. The validation from like-minded individuals within these chambers often confers an undue sense of credibility and importance to shared viewpoints, irrespective of their factual basis (Cinelli et al. 5).
The danger inherent in this community-building capability is the solidification and amplification of conspiracy theories. Unlike more traditional forms of media that aim for objectivity and offer multiple perspectives, these echo chambers foster homogeneity of thought and suppress alternative perspectives. When a user posts a conspiracy theory within such a community, the immediate feedback often includes validation and agreement, thereby reinforcing the user's belief in the theory. Over time, members of these communities become increasingly insulated from dissenting viewpoints, and the conspiracy theory gains a foothold.
Moreover, the community structure itself serves to repel criticism from outside sources. Those who attempt to introduce evidence that contradicts the prevailing narrative of the community are often dismissed, ridiculed, or even excluded, thus preserving the internal homogeneity. This leads to a form of collective confirmation bias, where evidence that supports the community's shared beliefs is highlighted, while evidence to the contrary is ignored or discredited.
It is also worth noting that the dynamics of these communities can serve to radicalize their members further. As community members strive for internal recognition and status, there is a tendency toward escalating extremity in the beliefs and theories espoused, taking the original conspiracy theories to even more elaborate and implausible heights. These escalating beliefs can also serve as a means for members to differentiate themselves from even more "radical" peers, anchoring themselves to comparatively mainstream conspiracy theories.
While the community-building capabilities of social media platforms serve various beneficial social functions, they simultaneously create environments that are conducive to the perpetuation of conspiracy theories. Such communities act as echo chambers, reinforcing unverified information and shielding their members from external critique (Cinelli et al. 4). This phenomenon not only strengthens the belief in conspiracy theories among community members but also has broader societal implications, including the polarization of public opinion and erosion of trust in established institutions. Therefore, understanding these dynamics is crucial for any comprehensive analysis of the spread and entrenchment of conspiracy theories in the digital age.
Role of Instagram
The Explore Page
In the contemporary digital landscape, Instagram stands as a critical player in determining public content consumption. The Explore page on Instagram epitomizes the marriage between algorithmic curation and user engagement. It is designed to offer a personalized feed that considers a myriad of variables, including a user's past engagement, the popularity of posts, and the engagement metrics of users who have exhibited similar behavior or interests. While the primary objective of algorithmic curation is to elevate user experience by offering tailored content, the impact is far-reaching and complex.
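To make this ranking logic concrete, consider the following minimal sketch of a generic engagement-weighted recommender. It is an illustrative assumption, not Instagram's proprietary algorithm: the signal names, weights, and scoring function stand in for the kinds of variables described above.

from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    popularity: float            # e.g., normalized like/share count, 0..1
    similar_user_signal: float   # engagement by users with similar histories, 0..1

def explore_score(post: Post, user_affinity: dict[str, float],
                  w_affinity: float = 0.5, w_popularity: float = 0.3,
                  w_similar: float = 0.2) -> float:
    """Weighted sum of the three signals; the weights are illustrative assumptions."""
    affinity = user_affinity.get(post.topic, 0.0)
    return (w_affinity * affinity
            + w_popularity * post.popularity
            + w_similar * post.similar_user_signal)

# A user whose past engagement already leans toward conspiracy-adjacent topics:
user_affinity = {"conspiracy": 0.8, "cooking": 0.2}
candidates = [
    Post("conspiracy", popularity=0.6, similar_user_signal=0.7),
    Post("cooking", popularity=0.9, similar_user_signal=0.3),
]
ranked = sorted(candidates, key=lambda p: explore_score(p, user_affinity), reverse=True)
print([p.topic for p in ranked])  # the conspiracy-themed post ranks first

Even though the cooking post is more popular overall, the user's prior engagement dominates the ranking under these assumed weights, which is precisely the dynamic the filter-bubble critique targets.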
The term "filter bubble" was coined to encapsulate this phenomenon, where algorithms selectively present information that confirms a user's pre-existing ideologies. The echo chamber effect, combined with the filter bubble, can prove to be especially concerning in the context of conspiracy theories (Seargeant and Tagg 2). When a user shows even slight interest in conspiracy-oriented content, the algorithmic curation methods of platforms like Instagram's Explore page can disproportionately surface such topics. This leads to a self-reinforcing cycle of exposure and belief reinforcement, which makes the uncritical acceptance of conspiracy theories more likely.
Researchers have consistently demonstrated the real-world consequences of this algorithmic amplification. The issue transcends academic interest; it is a pressing ethical concern. The ease with which echo chambers and filter bubbles form online calls for an urgent reevaluation of the ethical dimensions of algorithmic curation methods.
As the influence of features like Instagram's Explore page continues to grow, shaping public discourse and facilitating the spread of conspiracy theories, there is a compelling need for further academic scrutiny to explore its ethical implications. Such research could include the design of algorithms that offer a broader spectrum of viewpoints to mitigate the risks associated with the reinforcement of extreme beliefs.
Hashtags
Building on the concept of algorithmic curation and user engagement discussed in the context of Instagram's Explore page, it is important to consider another salient feature that serves a similar purpose: hashtags. Originating from Twitter and now ubiquitous across various social media platforms, hashtags are instrumental for both content discovery and categorization. They offer a complementary layer to algorithmic curation, further shaping the thematic contours of user experience on platforms like Instagram.
Much like algorithmic curation, hashtags operate as thematic markers that serve to aggregate individual posts under unified categories. By facilitating an easily searchable dialogue, hashtags create topical focal points that help to organize social interactions on these platforms. Users can follow specific hashtags so that content tagged with them appears in their feeds. This feature aligns closely with the objectives of algorithmic curation—both are designed to draw users deeper into specific topics or discussions, effectively serving as avenues for increased engagement.
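As a purely illustrative sketch, the aggregation described above can be modeled as a simple inverted index from hashtags to posts, plus a follow list per user; the data structures and function names below are assumptions for exposition, not Instagram's implementation.

from collections import defaultdict

hashtag_index: defaultdict[str, list[str]] = defaultdict(list)    # hashtag -> post ids
followed_hashtags: defaultdict[str, set[str]] = defaultdict(set)  # user -> hashtags

def publish(post_id: str, hashtags: list[str]) -> None:
    """File a post under every hashtag it carries."""
    for tag in hashtags:
        hashtag_index[tag.lower()].append(post_id)

def follow(user: str, hashtag: str) -> None:
    """Subscribe a user to a hashtag."""
    followed_hashtags[user].add(hashtag.lower())

def feed(user: str) -> list[str]:
    """Surface every post filed under a hashtag the user follows."""
    return [post_id
            for tag in followed_hashtags[user]
            for post_id in hashtag_index[tag]]

publish("post_1", ["#Travel"])
publish("post_2", ["#GreatAwakening"])   # a conspiracy-coded tag
follow("user_a", "#greatawakening")
print(feed("user_a"))                    # ['post_2']

Once a user follows a conspiracy-coded tag, every post filed under it reaches them directly; the hashtag converts a loose label into a standing subscription to a narrative.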
However, the role of hashtags takes on a particular significance in the realm of conspiracy theories. While algorithmic features like the Explore page serve to reinforce existing beliefs by offering content that resonates with user engagement, hashtags amplify this effect by fostering communities around these very narratives. As users follow or search for conspiracy-related hashtags, they contribute to and become part of these online communities, deepening both their involvement in these echo chambers and their belief in the theories circulating within them.
Moreover, the trending nature of hashtags can impart a false sense of validity or popularity to conspiracy theories. When users see a conspiracy-related hashtag trending, it can be misconstrued as an indication of a theory's credibility, thus catalyzing its spread. This is akin to the ethical implications surrounding algorithmic curation, raising concerns about the role social media platforms play in the propagation of misinformation.
Hashtags function not just as a tool for categorization but also as a significant agent in the spread and reinforcement of conspiracy theories, operating synergistically with features like the Explore page to potentiate the risks associated with the uncritical acceptance and propagation of these narratives.
Memes and Virality
Originating from Richard Dawkins' theory of cultural evolution, memes have evolved into impactful tools that disseminate complex ideas by combining simplicity with emotional resonance (Ermakov and Ermakov 8). Memes excel as concise vectors of information, encapsulating easily digestible elements of commentary or insights. Their format not only fosters rapid comprehension but also stimulates emotional engagement through undertones that may span humor, irony, or outrage. This synergistic combination has been identified as a key driver for their viral nature. Psychological and sociological research corroborates the propensity of emotionally charged content to be shared, thereby contributing to a meme's widespread distribution via mechanisms like social proof and the bandwagon effect.
The format's efficiency and emotional engagement make memes an effective medium for communicating conspiracy theories. Their brevity allows these complex narratives to be condensed into single images or catchphrases, eliminating the need for exhaustive background knowledge. Moreover, the emotional components inherent in memes enhance the palatability of these theories, thereby fostering their dissemination. In this manner, memes contribute to the increased visibility and perceived validity of conspiracy theories, much like the algorithmic features of platforms like Instagram.
However, these characteristics also introduce ethical complications. The efficacy of memes in information dissemination is accompanied by a risk of spreading misinformation. Their viral nature and emotional resonance enable the rapid propagation of potentially inaccurate yet compelling narratives. This poses the challenge of unregulated content trivializing or even glorifying serious issues, thus fostering an environment susceptible to the unchecked proliferation of conspiracy theories.
Consequently, memes serve as a double-edged sword. While they democratize the flow of information by facilitating the broad dissemination of ideas, they also act as conduits for misinformation. The dynamics underlying meme virality, particularly its intersection with emotional psychology and social mechanisms, call for additional academic research to inform more nuanced public policy considerations.
Examples such as "Pizzagate" and "QAnon" provide compelling illustrations of the power memes hold in disseminating conspiracy theories. These memes have not only achieved virality but have also transformed into overarching narratives, demanding serious public and academic attention due to their real-world implications which will be discussed below.
The "Pizzagate" conspiracy theory emerged during the 2016 U.S. Presidential election, claiming a pizza restaurant in Washington, D.C., was the center of a child sex-trafficking ring involving high-profile politicians like Hillary Clinton. While this theory has been debunked and discredited (Robb 2020), it gained widespread attention, in part due to the virality of memes and social media posts that simplified and emotionalized the narrative. Memes featuring imagery and phrases related to "Pizzagate" proliferated across various platforms, distilling a complex and false narrative into bite-sized, emotionally charged snippets that were rapidly shared and consumed.
Another illustrative example is "QAnon," a conspiracy theory alleging that a secret cabal of Satan-worshipping pedophiles is plotting against former U.S. President Donald Trump. QAnon memes served as a gateway into a broader, more intricate world of associated theories and falsehoods (Roose). Initial memes provided a simplified and emotional entry point, featuring buzzwords and catchphrases designed to pique curiosity and incite emotional reactions such as fear and indignation. These memes were essential in generating interest and recruitment into the broader QAnon community, serving as both an introduction to and an endorsement of the wider conspiracy theory.
What makes cases like "Pizzagate" and "QAnon" particularly noteworthy is their evolution from individual memes into full-fledged narratives with extensive followings. These memes act as seeds that, once planted in the fertile ground of social media, can grow into expansive and often dangerous belief systems. The memes serve as entry points or gateways, simplifying and emotionalizing complicated and often unfounded narratives to make them accessible and appealing.
The influence of such memes extends beyond the digital realm, often inspiring real-world actions that can range from protests to violence. For example, the Pizzagate conspiracy theory led to a man firing a rifle inside the implicated pizza restaurant (Haag and Salam), showcasing the potential for memes to incite dangerous activities based on false premises.
The Emergence of Instagram Reels
Instagram reels have rapidly emerged as a significant vector for the dissemination of conspiracy theories, supplementing pre-existing mediums like memes in a complex ecosystem of digital information. Reels are distinguished by their short duration—typically limited to 60 seconds—and their vertical, mobile-first format, features that facilitate elevated levels of user engagement.
Unlike static, image-based memes, reels offer a multi-sensory experience incorporating visual imagery, sound, and motion. This combination affords heightened emotional engagement, enhancing the capacity of reels to serve as vehicles for the rapid spread of conspiracy theories. Concurrently, the platform's algorithmic functionalities operate to augment the visibility of these reels, thereby reinforcing their viral potential.
The influence of Instagram reels as a medium for conspiracy theory propagation is not a speculative supposition but an observable reality. For instance, reels positing unfounded claims such as "COVID-19 is a hoax" or raising "5G radiation concerns" have attracted considerable public attention. These cases exemplify the platform's innate features that lend themselves to the quick dissemination and perceived legitimization of conspiracy theories.
In the case of reels advocating the baseless assertion that COVID-19 is a hoax, the brevity and emotive potency of the content foster high levels of user engagement (Allington et al. 7). This engagement is further heightened by contentious discussions in the comments section, amplifying the reel's visibility through algorithmic mechanisms and thereby exacerbating public misconceptions about an extant health crisis.
Conversely, reels highlighting purported concerns about 5G radiation exploit prevalent societal anxieties regarding technological advancements. Why are these conspiracy theories important to challenge, and how do they spread so quickly? In their spatial analysis of the linked COVID-19 and 5G conspiracy theories, Flaherty et al. argue:
Urban geographical health data are prone to exhibit these kinds of relations and distributions, we should expect conspiracy theories to develop more, equally damaging correlations using errant data, patternicity, and conjunction logical fallacies (6).
Conspiracy-filled reels also exploit the platform's features—brevity and emotional charge—to gain rapid traction, thus perpetuating misinformation and sowing public distrust.
The efficacy of reels as a medium for conspiracy theory dissemination is not confined to Instagram; their inherently shareable nature enables transmission across multiple social media platforms, from Facebook to Twitter and WhatsApp. This cross-platform dissemination amplifies their reach and serves to normalize and legitimize the underlying conspiracy theories, extending their societal impact.
Social Interactions on Instagram: The Mechanisms of Validation and Acceptance in the Context of Conspiracy Theories
Instagram, as a visually centric social media platform, has evolved into more than a mere showcase of photographs and videos. It has become a platform for social engagement, where features like comments and likes serve as key indicators of audience reception and sentiment. While these features ostensibly promote interaction and discussion, their dynamics also hold significant implications for the propagation and validation of conspiracy theories.
Comments as a Platform for Discussion and Validation
The comment section of an Instagram post provides a space for user engagement that is both social and discursive. Users can offer praise or critique, or expand upon the content of the post. In the context of conspiracy theories, this feature often turns into a forum for further speculation, discussion, and, unfortunately, validation.
Comments supporting the conspiracy theory lend it an air of legitimacy, especially for new users or those who are undecided on the topic. The comments effectively act as micro-endorsements, each one adding a layer of perceived credibility to the theory. Furthermore, the comment section also allows for the sharing of additional 'evidence' or related conspiracy theories, making it a fertile ground for the cross-pollination of conspiratorial ideas (Miller 15).
Likes as Indicators of General Acceptance and Their Misleading Nature
The 'Like' feature, represented by a heart symbol on Instagram, is another critical metric for user engagement. A high number of likes is often interpreted as a sign of general acceptance or approval. However, this seemingly straightforward indicator can be particularly misleading in the context of conspiracy theories.
For one, the act of liking a post is often impulsive and does not necessarily reflect a deep engagement with or understanding of the content. Moreover, algorithms tend to show posts with higher engagement, including likes, to a broader audience. This can create a snowball effect where the high number of likes draws more visibility, which in turn attracts even more likes, thus perpetuating a cycle that lends the post—and by extension, the conspiracy theory—an undeserved air of validity.
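The feedback loop described here can be illustrated with a toy simulation; the engagement rate and visibility multiplier below are assumed values chosen only to show the shape of the dynamic, not measured platform parameters.

def simulate_snowball(initial_reach: int = 1_000,
                      like_rate: float = 0.05,
                      reach_per_like: float = 4.0,
                      rounds: int = 5) -> None:
    """Toy model: each round, a share of viewers like the post, and those
    likes buy additional algorithmic exposure in the next round."""
    reach = initial_reach
    total_likes = 0
    for step in range(1, rounds + 1):
        new_likes = int(reach * like_rate)        # largely impulsive engagement
        total_likes += new_likes
        reach += int(new_likes * reach_per_like)  # engagement widens visibility
        print(f"round {step}: reach={reach:,}, cumulative likes={total_likes:,}")

simulate_snowball()
# round 1: reach=1,200, cumulative likes=50
# round 2: reach=1,440, cumulative likes=110
# ...reach and likes keep compounding without any judgment of the post's accuracy

Under these assumptions the post's audience grows by roughly a fifth each round purely because earlier viewers clicked a heart, which is why a high like count is such a poor proxy for a claim's validity.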
It is crucial to consider the synergy between comments and likes. A post with many likes but few comments may indicate surface-level agreement, whereas a post with both a high number of likes and a high number of supportive comments suggests a deeper level of engagement and acceptance. This dual endorsement can significantly contribute to the theory's overall perception as valid, further embedding it into the collective consciousness.
In the evolving digital ecosystem, the interplay between memes and Instagram reels offers a unique symbiosis that holds significant implications for the dissemination of conspiracy theories. While each medium possesses its own set of characteristics that make it effective in engaging public attention, their intersection amplifies the inherent strengths of both, thereby constituting a potent vehicle for achieving virality.
Within the complex interconnections of social media platforms, memes and reels often function collaboratively to extend their respective reach and engagement levels (Plaisime et al. 5). For instance, a conspiracy theory initially popularized through a meme can be subsequently adapted into an Instagram reel, thereby leveraging the platform's algorithmic advantages to further its reach. Conversely, a trending Instagram reel may give rise to a plethora of associated memes, diversifying its impact across various platforms and demographic sectors. This cross-pollination enhances the propagation potential of conspiracy theories, increasing the probability of their widespread acceptance.
Memes and reels offer complementary methods for distilling and conveying complex narratives. Memes excel at simplifying intricate theories into digestible visual and textual formats. Reels add a dynamic layer to this simplification through their incorporation of movement and audio. The capacity of reels to animate the concepts encapsulated in memes and the ability of memes to condense the visual and auditory complexity of reels contribute to a balanced yet comprehensive portrayal of conspiracy theories.
Both mediums excel in fostering emotional engagement, albeit through distinct channels. Memes often employ humor, irony, or shock value to achieve emotional resonance. Reels, by virtue of their multi-sensory format, offer a heightened level of emotional immersion. When integrated, these formats potentiate the emotional engagement with the content, rendering the underlying conspiracy theory more relatable, memorable, and consequently, more shareable.
A key advantage in the confluence of memes and reels lies in their potential for algorithmic synergy. Both mediums are intrinsically designed for high levels of shareability and interactivity—critical metrics for algorithms governing content dissemination on platforms like Instagram. Enhanced user engagement with both formats triggers algorithmic recognition, potentially accelerating the content's promotion across various features such as the Explore page or associated hashtags (Plaisime et al. 15).
The amalgamation of memes and Instagram reels constitutes a robust, synergistic mechanism for the rapid and expansive spread of conspiracy theories. The union of memes' textual simplicity and emotional resonance with reels' visual dynamism and brevity forms a potent composite tool in the digital dissemination landscape. Given this efficacious collaboration in perpetuating misinformation, the ethical implications warrant meticulous scholarly investigation.
Works Cited
Allington, Daniel, et al. “Health-protective Behaviour, Social Media Usage and Conspiracy Belief During the COVID-19 Public Health Emergency.” Psychological Medicine, vol. 51, no. 10, Cambridge UP, June 2020, pp. 1763–69. https://doi.org/10.1017/s003329172000224x.
Cinelli, Matteo, et al. “The Echo Chamber Effect on Social Media.” Proceedings of the National Academy of Sciences of the United States of America, vol. 118, no. 9, National Academy of Sciences, Feb. 2021, https://doi.org/10.1073/pnas.2023301118.
Enders, Adam M., et al. “The Relationship Between Social Media Use and Beliefs in Conspiracy Theories and Misinformation.” Political Behavior, vol. 45, no. 2, Springer Science+Business Media, July 2021, pp. 781–804. https://doi.org/10.1007/s11109-021-09734-6.
Entman, Robert M. “Framing: Toward Clarification of a Fractured Paradigm.” Journal of Communication, vol. 43, no. 4, Oxford UP, Dec. 1993, pp. 51–58. https://doi.org/10.1111/j.1460-2466.1993.tb01304.x.
Ermakov, Dmitry S., and A. N. Ermakov. “Memetic Approach to Cultural Evolution.” BioSystems, vol. 204, Elsevier BV, June 2021, p. 104378. https://doi.org/10.1016/j.biosystems.2021.104378.
Flaherty, Eoin, et al. “The Conspiracy of Covid-19 and 5G: Spatial Analysis Fallacies in the Age of Data Democratization.” Social Science & Medicine, vol. 293, Elsevier BV, Jan. 2022, p. 114546. https://doi.org/10.1016/j.socscimed.2021.114546.
Goffman, Erving. Frame Analysis: An Essay on the Organization of Experience. Northeastern UP, 1986.
Haag, Matthew, and Maya Salam. “Gunman in ‘Pizzagate’ Shooting Is Sentenced to 4 Years in Prison.” The New York Times, 22 June 2017, www.nytimes.com/2017/06/22/us/pizzagate-attack-sentence.html.
Hollenbaugh, Erin E., and Marcia K. Everett. “The Effects of Anonymity on Self-Disclosure in Blogs: An Application of the Online Disinhibition Effect.” Journal of Computer-Mediated Communication, vol. 18, no. 3, Wiley-Blackwell, Feb. 2013, pp. 283–302. https://doi.org/10.1111/jcc4.12008.
Miller, Daniel. “Characterizing QAnon: Analysis of YouTube Comments Presents New Conclusions About a Popular Conservative Conspiracy.” First Monday, University of Illinois at Chicago, Jan. 2021, https://doi.org/10.5210/fm.v26i2.10168.
Plaisime, Marie V., et al. “Social Media and Teens: A Needs Assessment Exploring the Potential Role of Social Media in Promoting Health.” Social Media and Society, vol. 6, no. 1, SAGE Publishing, Jan. 2020, p. 205630511988602. https://doi.org/10.1177/2056305119886025.
Robb, Amanda. “Anatomy of a Fake News Scandal.” Rolling Stone, 20 July 2020, www.rollingstone.com/feature/anatomy-of-a-fake-news-scandal-125877.
Roose, Kevin. “What Is QAnon, the Viral Pro-Trump Conspiracy Theory?” The New York Times, 3 Sept. 2021, www.nytimes.com/article/what-is-qanon.html.
Seargeant, Philip, and Caroline Tagg. “Social Media and the Future of Open Debate: A User-oriented Approach to Facebook’s Filter Bubble Conundrum.” Discourse, Context and Media, vol. 27, Elsevier BV, Mar. 2019, pp. 41–48. https://doi.org/10.1016/j.dcm.2018.03.005.
Sweet, Kayla S., et al. “Community Building and Knowledge Sharing by Individuals With Disabilities Using Social Media.” Journal of Computer Assisted Learning, vol. 36, no. 1, Wiley-Blackwell, July 2019, pp. 1–11. https://doi.org/10.1111/jcal.12377.
Ziegele, Marc, et al. “Deprived, Radical, Alternatively Informed.” European Journal of Health Communication, vol. 3, no. 2, Sept. 2022, pp. 97–130. https://doi.org/10.47368/ejhc.2022.205.