COVID-19 Conspiracy Tribes Across Instagram, TikTok, Telegram and YouTube

Team Members

Instagram: Rebekka Eick, Eirini Malliaraki, Róisín Moloney, Bryan Steffen, Fabio Votta

TikTok: Selin Ashaghimina, Alla Rybina, Carlos Osorio, Fabio Votta

Telegram: Agustin Ferrari Braun, Carlos Osorio, Daniel de Zeeuw

YouTube: Christoffer Bagger, Daniel Jurg, Eleni Maragkou, Tatiana Smirnova


Summary of Key Findings

This research project sought to show how several popular conspiracy theories (involving China, 5G, Bill Gates, QAnon, flat earth, and the deep state) spread across social media, focusing on four platforms of analysis: Instagram, TikTok, Telegram, and YouTube. Using the concept of a “conspiracy tribe” as a heuristic for gauging how online communities produce and consume conspiratorial content in a variety of ever-shifting social media constellations, we wanted to know whether the four platforms saw an increase of engagement with these theories and how this engagement changed over time, in terms of the user base and types of conspiratorial content. Moreover, we sought to test our hypothesis that previously distinct conspiracy theories started to coalesce and converge during the pandemic. Across all four platforms, we found that engagement with the aforementioned conspiracy theories remained relatively low while the virus was still contained to China (January/February), but started to increase in March and peaked in April/May, when COVID-19 became a global health and economic crisis (first in Italy, then in other European countries, the US and elsewhere).

Each platform analysis also yielded specific findings. For Instagram, we found that in January and February COVID-19-related hashtags were present in much lower volume than in March and April, when they also started to act as a linchpin bringing disparate conspiracy theories together. On TikTok, conspiracy theories appear to spread through individual accounts rather than hashtag labelling, and far-right and Trumpist conspiracy theory accounts dominate among the content creators. For Telegram, we found that different conspiracy theories dominate the discussion at different moments (China initially, later 5G and Bill Gates), but also that content is often engaged with in a spirit of mockery rather than true belief. Finally, for YouTube, hosting practices and co-appearances confer legitimacy on creators and increase their visibility, forming a network of influence. During our analysis of two principal nodes in the network we found that creators engage with various established conspiracy theories which converge around the coronavirus from January onward. This engagement, however, is suspect, as these creators often adopt promotional techniques common among brand influencers, leveraging their audiences for profit.

1. Introduction

On 9 January 2020, the World Health Organization (WHO) put out a statement regarding a cluster of pneumonia cases in Wuhan, China, which led to the preliminary identification of a novel coronavirus. Unlike MERS-CoV, first identified in Saudi Arabia in 2012 (WHO Newsroom), or SARS-CoV, which emerged in 2002 in the Guangdong province of southern China, passed from animals to humans, and caused more than 8,000 cases (Who.int), COVID-19 has wreaked havoc on the entire world. At the time of writing (15 July 2020), the world has seen 13.3 million cases of the virus (The Lancet), with the Americas being the current epicenter. As a result, people in countries hit by the virus have had to adhere to various health measures such as frequent hand washing, wearing a mask in public, temperature checks, and social distancing.

In analogy with the biological domain as the material “medium” of societal exchange, society is also shaped by information about itself, arguably more so than ever before. Borrowing from medicine and genetics, digital media are often imagined in terms of their “viral relationality” or “memetic” properties (Sampson). Following through on this (somewhat lopsided) analogy, we could say that information about COVID-19 spreads “like” the virus itself, and can in turn affect the spread of the actual virus: scepticism regarding COVID-19 health effects has been found to influence willingness to comply with health measures (Bursztyn et al.). In what is a highly complex and volatile global media ecology, information competes for attention, and sheer quantity may offer a simulacrum of veracity, especially when it involves so-called conspiracy theories.

Competing on the information market (to deploy another analogy), a substantial part of the COVID-19 info universe indeed consists of conspiracy theories, whose simple and (sometimes) appealing narrations of events are well-attuned to an attention-scarce social media economy. Multiple COVID-19 conspiracy theories have emerged as collectively narrated vernacular explanations for the origin and transmission of, as well as cures for, the virus. In a recent article in The Atlantic, Uscinski et al. speak of a “Coronavirus conspiracy boom”, arguing that “COVID-19 has created a perfect storm for conspiracy theorists”. These range from pointing the finger at global elites, the pharmaceutical industry and vaccination programs, new technologies, and the deep state. Coronavirus is, alternatively, a Chinese bioweapon designed in a Wuhan lab; a distraction from the worldwide introduction of 5G technology, which is thought to cause all kinds of illnesses; or a planned effort by a global cabal headed by Bill Gates. Moreover, it seems that several pre-existing and previously unrelated conspiracy theories have started to cross-pollinate and coalesce around the COVID-19 crisis, e.g. combining anti-vaxx rhetoric with New World Order and anti-5G discourse. One could say COVID-19 potentially acts as a fertile “host” for conspiracy theories to symbiotically (or parasitically) latch on to. These conspiracy narratives pose obvious new challenges for social media platforms grappling with the rapid and large-scale dissemination of dis- and misinformation in their networks.

We might think of what could be called “alternative belief communities” as forming “conspiracy tribes”, each adhering to its own conspiratorial world view and “version of reality”. Such tribes function as informatic petri dishes for alternative knowledge cultures that challenge the prevailing consensus, furthering what has been referred to in the context of QAnon as “the emergence of the unreal” (Zuckerman). A tribe is conventionally defined as “a social division in a traditional society consisting of families or communities linked by social, economic, religious, or blood ties, with a common culture and dialect” (Virgin). This stands in contrast to modern forms of socio-economic organization that consist of loosely affiliated networks of individuals and secular institutions. In its pejorative modern sense, the term “tribal” signifies “blind following” (e.g. of a leader or religious fetish) in a way that undermines individual and rational decision making. Thus understood, the “networked individualism” characteristic of social media seems anathema to tribal behavior. The time of the tribe, in modernity, seems irreversibly lost.

In some strands of media theory, however, what was then often referred to as “electronic media” supposedly brought about a return of the tribal mindset, one based on high-technological socio-cultural formations and practices that could thereby be justifiably deemed “tribal”. Marshall McLuhan's notion of the “global village” in The Gutenberg Galaxy (1962) comes to mind. So does Walter Ong, who argued in Orality and Literacy: The Technologizing of the Word (1982) that electronic media create forms of “secondary orality” characterized by its “participatory mystique, its fostering of a communal sense, its concentration on the present moment, and even its use of formulas” (cited in Bounegru, who describes the emergence of social media and microblogging as “re-tribalizing” our cultures). From a more sociological perspective, in The Time of the Tribes Michel Maffesoli similarly argued that society has disaggregated from mass culture into “postmodern”, “pseudo” or “neo” tribes.

The DMI Summer School project set out to map various COVID-19 conspiracy tribes, to see what theories they adhere to and how they are connected. Based on our own expertise and available data we looked at four platforms these tribes were known to inhabit: Instagram, TikTok, Telegram, and YouTube. Do the analyses of conspiratorial engagement on these platforms support the notion of a “conspiracy boom” around COVID-19 as is commonly believed?

2. Initial Data Sets

On all four platforms, we collected data from the agreed-upon timeframe of 1 January until 1 June 2020. As the World Health Organization made its announcement of a cluster of pneumonia cases in Wuhan on 9 January, we took the beginning of that month as the start of our observation period. We designated January and February as representative of pre-lockdown times in the English-speaking countries and Europe we chose as our object of analysis: although there was a confirmed case of COVID-19 in Washington on 20 January, no lockdown measures had yet been put into place in the US or Europe. March and April represent the lockdown period, when various public health measures were introduced to curb the spread of the virus, such as frequent handwashing, the wearing of masks and social distancing. Finally, May and June represent the easing of restrictions in the United States.

Instagram

In order to retrieve the Instagram data we used the Python library instaloader (https://instaloader.github.io/). The Instagram data collection started with 294,191 posts that were obtained by querying 82 COVID-19 conspiracy-related hashtags. To enrich this dataset, the 60 most active users in the sample were selected and their entire post history was also retrieved. The combined final dataset includes 397,129 posts from both conspiracy-related hashtags and users frequently using these hashtags. From this dataset we derived the 600 top shared hashtags and top active users that we considered for categorisation per conspiracy tribe (see Methodology).
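
For illustration, a minimal sketch of this collection step with instaloader might look as follows; the hashtag names and the per-hashtag cap are placeholders, not our actual query list.

```python
from itertools import islice

import instaloader

# A minimal sketch, assuming a recent instaloader version; hashtag names
# and the per-hashtag cap are illustrative, not our actual query list.
L = instaloader.Instaloader(download_pictures=False, download_videos=False)

hashtags = ["plandemic", "filmyourhospital"]  # stand-ins for the 82 hashtags

posts = []
for tag in hashtags:
    hashtag = instaloader.Hashtag.from_name(L.context, tag)
    for post in islice(hashtag.get_posts(), 500):  # cap per hashtag
        posts.append({
            "shortcode": post.shortcode,
            "owner": post.owner_username,
            "date": post.date_utc,
            "hashtags": post.caption_hashtags,
            "likes": post.likes,
        })
# The same loop over instaloader.Profile.from_username(...) retrieves the
# full post histories of the 60 most active users for the enrichment step.
```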

TikTok

The initial TikTok dataset, consisting of 27,870 videos, is based on 106 COVID-19 conspiracy-related hashtags. From this initial dataset, three derived sub-datasets were created for further analysis of content creator engagement. The first derived dataset listed the 200 most frequently appearing content creators within our dataset, and the second only the conspiracy theory accounts among them (78 in total). The third derived dataset listed the 100 most frequently used hashtags from our initial dataset. Lastly, to gain a better understanding of viewer engagement with COVID-19 related content, we composed a dataset with the 200 most liked videos labelled with particularly relevant COVID-19 conspiracy hashtags (see Methodology).

Telegram

The Telegram dataset contains all the posts published by fifty different public channels (see annex) from January to June 2020. Those channels were selected through an ethnographic analysis conducted on the platform’s conspiratorial milieux earlier this year by Agustin Ferrari Braun. A number of variables were considered when selecting them, including:
  • High following
  • Coverage of news and demonstrated interest in the coronavirus crisis
  • Intra-platform referencing (ex: their posts were re-posted by other channels, they were quoted as authoritative sources etc.)
  • Public notoriety outside of the platform
Although a wide variety of political positions were represented in the dataset, ranging from alt-health to open neo-nazism, the majority of channels leaned towards the far-right of the political spectrum, as it seemed to be both more present on the platform and more prone to disseminating conspiracy theories. Overall, we analysed more than 55,000 posts. Due to time constraints and the technical limitations of the 4CAT tool, we decided to focus exclusively on text, and not to include the images and memes shared by the channels.
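
As a sketch of how such public-channel posts can be collected, one could use a library like Telethon; we in fact worked with 4CAT, so this illustrates the general approach rather than our exact pipeline, and the API credentials and channel name are placeholders.

```python
import asyncio
from datetime import datetime, timezone

from telethon import TelegramClient

API_ID, API_HASH = 12345, "api_hash"   # hypothetical credentials
CHANNELS = ["some_public_channel"]     # stand-ins for the fifty channels

async def collect():
    rows = []
    async with TelegramClient("session", API_ID, API_HASH) as client:
        for channel in CHANNELS:
            # Newest first, starting before 1 June 2020 ...
            async for msg in client.iter_messages(
                channel, offset_date=datetime(2020, 6, 1, tzinfo=timezone.utc)
            ):
                # ... and stop once we pass the start of the timeframe.
                if msg.date < datetime(2020, 1, 1, tzinfo=timezone.utc):
                    break
                rows.append({"channel": channel, "date": msg.date,
                             "views": msg.views, "text": msg.text})
    return rows

rows = asyncio.run(collect())
```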

YouTube

The YouTube dataset contains video information and metadata from 75 conspiracy YouTubers with over 10,000 subscribers. We arrived at this list after manually collecting data via a snowball approach: beginning with a curated query list, adding channels to our list, and then looking at the creators featured on those channels and adding their channels in turn. The final, curated list contains channels with a significant subscriber count and posting consistency. Video and channel metadata, including descriptions, keywords, and view and subscriber counts, were also collected in order to gain insights into engagement with issues, as well as hosting practices.
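
As an illustration of this kind of metadata collection, the YouTube Data API v3 (which tools like YouTube Data Tools build on) exposes the relevant channel fields; the API key and channel ID below are placeholders.

```python
from googleapiclient.discovery import build

# Hypothetical API key and channel IDs; the real list held 75 channels.
youtube = build("youtube", "v3", developerKey="YOUR_API_KEY")
channel_ids = ["UCxxxxxxxxxxxxxxxxxxxxxx"]

response = youtube.channels().list(
    part="snippet,statistics,brandingSettings",
    id=",".join(channel_ids),  # the endpoint accepts up to 50 IDs per call
).execute()

for item in response["items"]:
    print(item["snippet"]["title"],
          item["statistics"].get("subscriberCount"),
          item["brandingSettings"]["channel"].get("keywords"))
```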

3. Research Questions

As an encompassing frame for more platform-specific research approaches, we developed the following three general research questions:

  1. How do COVID-19 conspiracy theories form and spread across different platforms, and how are these theories connected with each other?
  2. Are there significant differences in the ways these theories are engaged with across platforms and within different conspiracy tribes?
  3. Is there a growth and/or convergence of conspiracy tribes/theories around COVID-19?
Concerning the second and third research questions, our hypothesis was that there are significant differences in how COVID-19 is engaged with on different platforms, and that previously separate conspiracy tribes and theories (e.g. anti-vaxxers, NWO, anti-5G, Bill Gates, China) indeed start to grow and converge around COVID-19 discussions online.

Besides general research questions, we also formulated various platform-specific questions:
  • For TikTok we were particularly interested in the status of the conspiracy content creators and spreaders (conspiracy theorists vs. ordinary users) as well as the preferred mode of circulation used (hashtags vs. algorithmic spread through individual accounts).
  • For YouTube we wanted to know to what extent conspiracy theory YouTubers operate as a network of mutually beneficial influencers. Moreover, we wanted to know the impact of hosting practices in terms of engagement, as well as what were the most prominent concerns among these conspiracy influencers. Our initial hypotheses here were that co-appearances increase visibility and legitimacy among conspiracy influencers on YouTube and that previously distinct conspiracy tribes and theories converge around COVID-19 discussion on the platform.

4. Methodology

All platforms

In order to compare results across the different platforms, a general coding scheme was agreed upon. We distinguished between two types of labels: Conspiracy tribes and conspiracy theories (see Table 1 below). This coding scheme allowed us to analyze how tribes can engage with multiple theories on the respective platforms. However, whenever a label didn’t match a significant part of the dataset, it was left out of consideration; just as, conversely, when a significant part of the dataset required a label that was not part of the general list, it was added on a per platform basis (e.g. “China” as a conspiracy theory label on Telegram, “anarchist” as a conspiracy tribe label on TikTok). As for our analyses, we leaned on Rogers’ (2019) digital methods approach that repurposes platform-specific affordances, such as hashtag engagement and the distinguishing of different actors, to analyse the development of social issues on digital platforms.

| Conspiracy Tribes | COVID-19 Conspiracy Theories |
| --- | --- |
| Alt-health | 5G |
| Trumpists | QAnon |
| Far-right | Bioweapon |
| Religious-fundamentalists | Bill Gates |
| Alt-media | NWO/illuminati |
| Other | Flat earth |
|  | Deep state |
|  | Other |

Table 1: common coding scheme
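
To illustrate how such a scheme can be operationalised, the following sketch maps hashtags to theory labels via keyword lists; the lists shown are illustrative stand-ins for our full codebook, not the codebook itself.

```python
# Illustrative keyword lists only; our full codebook covered far more tags.
THEORY_KEYWORDS = {
    "5G": ["5g", "stop5g"],
    "QAnon": ["qanon", "wwg1wga", "greatawakening"],
    "Bill Gates": ["billgates", "gatesfoundation"],
    "Deep state": ["deepstate"],
}

def code_hashtag(tag: str) -> str:
    """Return the first matching conspiracy theory label, else 'Other'."""
    tag = tag.lower().lstrip("#")
    for label, keywords in THEORY_KEYWORDS.items():
        if any(kw in tag for kw in keywords):
            return label
    return "Other"

assert code_hashtag("#WWG1WGA") == "QAnon"
```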

Instagram

For Instagram we deployed a mixed-methods approach, combining qualitative coding techniques (using the coding scheme above) with quantitative, algorithmic image analysis. Elizabeth Berman (2017) highlights the importance of mixed methods: in “the exploratory sequential mixed methods design”, qualitative analysis is conducted first, followed by quantitative data collection and analysis, and finally by the “integration or linking of data from the two separate strands of data” (1).

Bruns and Burgess (2011), cited in Highfield and Leaver (2015), outline how hashtags facilitate the flourishing of particular communities, allowing them to consolidate and come together, “forming and responding very quickly in relation to a particular event or topical issue”. This was in keeping with our analysis of how tribes manifest on these platforms.

For Instagram we coded 397,129 posts from conspiracy-related hashtags and from users associated with these hashtags.

The next step involved analysing the data we had collected and making hashtag mention network visualisations displaying the pertinent information. We also used Gephi to produce cocoon diagrams displaying the number of hashtags associated with coded conspiracy categories. The final step was to interpret our findings, which are outlined in the subsequent section.

During the qualitative coding process of hashtags, it proved quite difficult to separate tribes from theories. This is due to the affordances of Instagram's design: as there are no features for isolated groups, or rather, as any kind of “interpersonal and topical connectivity” is established through hashtags, group affiliation as well as the subjects of posts are both expressed through hashtags, which explains why hashtags can be assigned to conspiracy theories and/or tribes alike. We also observed the phenomenon of hashtag stuffing, enabled by Instagram's design, which denotes spam-centric behaviour: the use of irrelevant hashtags in order to reach a bigger audience (Wayne, 2020; Rodriguez, Social Captain; Anderson, 2019; Rogers, 2019).

Network graphs:

In order to gauge the convergence of conspiracy theories in our Instagram dataset, we chose a network visualization approach. We created hashtag mention networks that show the relationship between users and the hashtags they mention. If users start combining hashtags related to corona with those of their particular branch of conspiracy, this could be seen as taking advantage of the corona crisis to promote conspiratorial content. Furthermore, if users over time start sharing hashtags associated with different conspiracy theories, we could read this as evidence that conspiracies from distinct tribes are converging.
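
A minimal sketch of how such a user–hashtag network can be built, assuming (user, hashtag) pairs extracted from the coded posts; networkx is used here, with an export to GEXF for layout and styling in Gephi.

```python
import networkx as nx

# Illustrative (user, hashtag) pairs extracted from the coded posts.
mentions = [("user_a", "#plandemic"), ("user_a", "#qanon"),
            ("user_b", "#plandemic")]

G = nx.Graph()
for user, tag in mentions:
    G.add_node(user, kind="user")
    G.add_node(tag, kind="hashtag")
    if G.has_edge(user, tag):
        G[user][tag]["weight"] += 1   # repeated mentions strengthen the tie
    else:
        G.add_edge(user, tag, weight=1)

# Node size by degree, as in our visualizations; .gexf files open in Gephi.
nx.set_node_attributes(G, dict(G.degree()), "degree")
nx.write_gexf(G, "hashtag_mentions.gexf")
```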

From all the data available we decided to code 600 hashtags. We also included 800 users and an image analysis of 1,000 posts.

TikTok

Like most social media platforms and apps, TikTok can be considered a “tool for empowering participatory culture” (Jenkins), drawing together and facilitating multi-sided relationships between different user groups (i.e. various types of content creators and viewers). Following Rieder et al.’s (2018) approach, we aimed to include both sides of this relational structure in our content engagement analysis – the content creators and viewers of COVID-19 conspiracy related videos on TikTok.

After a qualitative study of the platform and a close look at trending hashtags, we created our initial dataset with the TikTok scraper “tiktokr” (GitHub TikTokr), using a compiled list of COVID-19 related hashtags (Appendix 1). Due to the app's API restrictions, it is unclear how the video samples were selected (e.g. whether based on recency or popularity), and the dataset does not contain all videos labelled with the respective hashtags. However, the retrieved data resembles the exact output an ordinary user would receive when searching for the hashtags. This initial dataset provided us with the user identification, video URL, publishing date, video duration, user name, sounds and hashtags used, as well as engagement metrics such as like, comment, share and follower counts.

In the next step, we created two derived datasets: one with the top 200 users within our initial dataset and one listing only the conspiracy theory (CT) accounts among them. Considering our hypothesis that COVID-19 conspiracy hashtags are used opportunistically to promote unrelated content published by non-conspiracy-theory-related users, it would have been counterproductive to take the follower count as the metric for compiling a top-user dataset, as popular accounts might be all the more likely to use “trending” COVID-19 conspiracy hashtags for their own agenda. Instead, publishing frequency was taken as the sorting metric.
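
Assuming the scraped videos sit in a pandas DataFrame, this frequency-based selection reduces to a simple group-and-count; the data below is illustrative.

```python
import pandas as pd

# Illustrative stand-in for the scraped dataset of 27,870 videos.
videos = pd.DataFrame({"username": ["a", "b", "a", "c", "a", "b"]})

# Rank creators by publishing frequency rather than follower count.
top_users = (videos.groupby("username")
                   .size()
                   .sort_values(ascending=False)
                   .head(200))
```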

To get an impression of which types of users create content labelled with COVID-19 conspiracy related hashtags, we color coded the top 200 users with the following content creator type labels: “conspiracy theorist”, “influencer”, “celebrity”, “ordinary user” (i.e. users who post random personal videos and cannot be considered influencers or celebrities), “political affiliation” (i.e. political groups/parties, politicians), “religious affiliation” (i.e. religious groups/leaders in general), “private” (i.e. accounts that were set to “private” at the moment of analysis), “health professional”, “news outlet”, “deleted user” (i.e. users that could no longer be found at the moment of analysis), and “other”. During our analysis, however, we quickly noticed that the categories “religious affiliation” and “news outlet” were not present in our dataset. Similarly, we color coded the CT accounts with the respective tribe affiliation labels from our common coding scheme and created a graph showing the CT accounts' volume output over time.

Following the user analysis, we created another derived dataset with the 100 most frequently used hashtags within our dataset and color coded them using the conspiracy theory labels from our common coding scheme. To support our findings from this analysis, we also created a network visualization that illustrates how the individual hashtags connect with one another.
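
A sketch of how such a co-occurrence network can be derived, assuming each video's hashtags are available as a list; edge weights count how often two hashtags are used on the same video.

```python
from itertools import combinations

import networkx as nx

# Illustrative hashtag lists, one per video.
videos_hashtags = [["#qanon", "#wwg1wga"], ["#qanon", "#covid19"]]

G = nx.Graph()
for tags in videos_hashtags:
    for a, b in combinations(sorted(set(tags)), 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1  # weight counts shared videos
        else:
            G.add_edge(a, b, weight=1)

nx.write_gexf(G, "hashtag_cooccurrence.gexf")  # for layout in Gephi
```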

Finally, to ensure that all sides entangled ‘in the context of specific sociocultural issues’ (Rieder et al. 52) were taken into consideration, a new dataset was created to gather insights into viewer engagement, containing the 100 most liked videos labelled with hashtags centred around three specific, trending COVID-19 related issues (anti-vaccination, Bill Gates and the Plandemic movie). The dataset was created using the hashtags #antivaxxers #vaccinateurkids #idonotconsent #vaccinesharm #plandemic #plannedemic and #planneddemic. Using the platform-specific user-type labels, as well as the overarching tribal affiliation labels, this dataset, too, was color coded and the results visualised.

Furthermore, it must be mentioned that the selection of items analysed per sub-dataset is based on the researchers’ understanding of a representative sample within the scope of this research project. Moreover, our analysis was based on a mixed-methods approach, making use of both quantitative and qualitative research methods.

Telegram

Much like the other platforms, research on Telegram was done through mixed methodologies, combining qualitative and quantitative approaches within the wider frame of Digital Methods put forward by Richard Rogers (2019). Following Rogers’ call to analyse specific digital objects generated by the affordances of online devices, we focused our attention on Telegram’s public groups, as a platform-specific form of expression.

Telegram is a messaging app whose main focus is encrypted communications, either one-on-one or through private groups with several individual users. However, in 2015, the platform introduced public channels, enabling one-to-many communication. This function allows for a “private sociality” (Rogers 2020), where users come to the app to engage in private conversations, with socialising with strangers being a secondary aspect of the experience, thus inverting the traditional mechanisms of social networks. This particular user experience places public channels in an interesting and unique position: they are a means to communicate with thousands of individual users, while not playing a central role in the platform.

Telegram’s specific affordances, geared towards the generation of private sociality, favour an ethnographic engagement with its objects, which was undertaken prior to the Summer School by one of the team members (see Initial Data Sets). Our first collective engagement with the data was qualitative: generating a list of most-viewed posts per channel and eliminating all posts unrelated to COVID-19. Through this process we developed a relatively clear understanding of each channel's political leanings and of the COVID-related content that generated the most engagement. We then moved to a more quantitative analysis, querying the overall dataset for the terms we associated with different conspiracy theories and examining their recurrence over time. Finally, we combined both approaches to see which COVID-related posts were generating the most engagement and what the political leanings of their authors were.
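
A minimal sketch of this quantitative pass, assuming the posts are loaded into a pandas DataFrame; the term lists and example rows are illustrative.

```python
import pandas as pd

# Illustrative rows standing in for the ~55,000 collected posts.
posts = pd.DataFrame({
    "date": pd.to_datetime(["2020-02-03", "2020-04-14"]),
    "text": ["China hid the outbreak", "5G towers cause it"],
})

terms = {"china": ["china", "wuhan"], "5g": ["5g"], "gates": ["bill gates"]}

for label, variants in terms.items():
    pattern = "|".join(variants)  # simple case-insensitive substring match
    hits = posts[posts["text"].str.contains(pattern, case=False, na=False)]
    weekly = hits.set_index("date").resample("W").size()
    print(label, weekly.to_dict())
```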

YouTube

This subproject builds on theorizations of ‘networks of opinions’ and repurposes Rebecca Lewis' (2018) methodology of mapping co-appearances in her research on the Alternative Influence Network of the alt-right, translating these insights to the rising popularity and dissemination of conspiracy theories on YouTube. Situated within the broader phenomenon of online conspiracy tribes that form different hubs of influence, this project moreover classifies these influencer practices according to a variety of tribal themes.

Beginning with a curated list of COVID-19-related queries across platforms, a list of 22 conspiracy influencers was produced; of these, 20 had YouTube channels. We then visited these channels to determine whom they hosted and visited each guest's own channel, where one existed. The outcome was a list of 160 channels, not all of which were active or uploaded content related to COVID-19. Due to practical limitations and time constraints, this is not a complete list of all co-appearances; for example, new guest appearances have occurred since, and several channels in our initial dataset no longer exist. The method illustrated here provides a template for future research in this direction. Afterward, qualitative analysis of the content of these channels was performed. Additionally, channel metadata, such as keywords, descriptions, subscriber counts, and locations, were collected via YouTube Data Tools (Rieder, 2015). Finally, a shorter, curated list emerged, containing 75 channels with subscriber counts ranging between 10,000 (Jenny Constantine) and 4.5 million (StevenCrowder). The cut-off point was chosen to eliminate creators with negligible reach and inconsistent posting habits. Channels with non-COVID-19 content were catalogued but not included in the final network.
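
The co-appearance mapping itself reduces to a directed graph of host-to-guest pairs; the following sketch, with pairs drawn from the guests discussed in the Findings section, shows how principal nodes surface through out-degree.

```python
import networkx as nx

# Illustrative host -> guest pairs recorded during the snowball walk.
appearances = [("Dustin Nemos", "Robert David Steele"),
               ("Jason Goodman", "Field McConnell"),
               ("Dustin Nemos", "Dave Janda")]

G = nx.DiGraph()
G.add_edges_from(appearances)

# Channels that host the most creators surface as principal nodes.
print(sorted(G.out_degree(), key=lambda kv: kv[1], reverse=True)[:2])
```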

Based on the agreed-upon coding scheme, these 75 channels (see Appendix) were categorized into conspiracy tribes through an analysis of their keywords and descriptions, as well as their content if necessary. We also obtained video metadata from the 25 most subscribed channels and performed qualitative analysis of titles and descriptions. After visualizing the network of conspiracy creators, we determined the two most prominent nodes: Dustin Krieger (known as Dustin Nemos) and Jason Goodman. Using our coding scheme, we categorized their videos based on the different COVID-19-related conspiracy theories, as well as other common themes, in order to gauge their evolution over time, the convergence of established conspiracies around the coronavirus, as well as the effect of co-appearances on these creators' visibility.

5. Findings

Instagram

Professor Karen Douglas, quoted in Ruby Lott-Lavigna's Vice article, explains that people become more interested in conspiracy theories in times of crisis, when unmet needs become more pronounced as people are confronted with fear of the unknown, especially given how little was known about this virus. Influencers, it was found, may turn to conspiracy theories as a coping mechanism to deal with these overwhelming feelings and unmet needs (Lott-Lavigna, Vice). They may also find that including conspiracy-related content in their Instagram accounts brings traffic to their pages, and so may be using the pandemic as an opportunistic means to gain followers.

It was also observed that those who believe in conspiracy theories are more likely to consume their news from unregulated social media, including influencers, whom they see as reliable and trustworthy, as opposed to regulated, trustworthy sources (Easton, BBC). Strikingly, Instagram is likely to overtake Twitter as a source of news in the UK (Wood, CityAm). Our image analysis supports this: the photos we analysed were dominated by text rendered in image form. During lockdown, Instagram became a seemingly reliable source of news for young people who respond positively to visual imagery, and their reliance on the platform's visual imagery overtook the trust they put in traditional (print) media (Wood).

Throughout our investigation on Instagram we found notable indications that the different conspiracy tribes, as outlined in our methodology, are converging in their use of conspiracy hashtags (Figure 1).

Figure 1: Number of hashtags associated with coded conspiracy categories from Feb-June 2020 (min radius: 2, max radius: 18)

Hashtag Analysis

In our visualisations of January and February (Figure 2), we observed in our network of users and their hashtag mentions that QAnon (Tuters, OILab, 2020), deep state and Trump related hashtags have a strong presence and are located very centrally.

QAnon has been described as “conspiracy without theory”, a term coined by political scientists Nancy Rosenblum and Russell Muirhead (Cline, The Conversation). It gestures towards some sort of conspiracy without ever backing it up with reliable, trustworthy information, basing its claims on baseless accusations against the Obama administration. Although QAnon has been around since 2017, it is nothing new and comes from the same fringes as other conspiracy theories (Marc-André Argentino, The Conversation). Argentino observed through extensive research on QAnon movements that the majority of QAnon adherents are Republican. Perhaps as a result, it receives considerable mainstream attention and is no longer a conspiracy that exists on the fringes (Argentino). Here we can see relations to the phenomenon of normiefication: the entrance of conspiracy theories into mainstream discourse and media.

The network graphs below illustrate the hashtag mention networks for Jan-Feb, March-April and May-June. Each network shows the hashtags mentioned by users in our sample. The colours represent our labels for hashtags, and the grey nodes signify users mentioning hashtags in their posts. The nodes and labels are scaled by degree (the bigger the node and title, the more connections it has to other nodes). These network visualizations are useful for understanding the way different conspiracy narratives, embodied in hashtags, are spun up by individual accounts.

Figure 2: https://imgur.com/a/kvhpJs0 Hashtag mention networks January and February.

Figure 3: https://imgur.com/w1P74mZ Hashtag mention network March and April.

In posts from March and April (Figure 3), we observed that hashtags related to COVID-19 became a kind of linchpin that brought previously disparate conspiracy theories together. While COVID-19-related hashtags served this linchpin function for all of the conspiracy theories we studied, the effect was particularly pronounced for Deep State and Bill Gates conspiracy theories. This can be observed in our network visualization for March and April hashtags. Goldenrod circles representing COVID-19 hashtags cluster in the center of the visualization, while purple circles representing Deep State-related hashtags and dark green circles representing Bill Gates-related hashtags are among the most common in the immediate vicinity. During these months, we observed that the conspiracy theory clusters and hashtags shifted around the COVID-19 hashtags themselves, standing in contrast to the visualization of the two prior months, where they appeared as scattered outliers in the dataset. Also notable and central to our findings was the presence of QAnon hashtags.

Figure 4: https://imgur.com/a/CMeqUPH Hashtag mention networks May and June.

During May and June (Figure 4), we can observe that activity and the density of connectedness increased dramatically compared with the previous two months. The interconnectivity seems to have increased in particular around conspiracy-related hashtags, and while COVID-19 related hashtags are still very present and central in the network, they are no longer necessarily the focus of the hashtag discourse. Other conspiracy-related hashtags, such as those related to the deep state, Trump and QAnon (all interconnected), peak in June. That COVID-19 related hashtags remain present and central within the network may indicate that the peak of the pandemic was also the period when the most newcomers entered the “hate multiverse” (Bell and Maxmen), leading to a subsequent increase in conspiracy-related discourse.

On Instagram, we observed three notable types of engagement with conspiracy theory hashtags.

1. Ironic, trolling use

2. Genuine believer use

3. Opportunistic, attention-hacking, commercial use

The third type of engagement could perhaps be an effect of the social-mediatization of the web: the professionalization of influencers, etc. We can observe users in this category using the pandemic opportunistically to gain traffic to their accounts. Whether or not the information they share is accurate, several sources note that women “are uniquely well-positioned to open people's minds to dubious and false information” (Cook, Huffpost; Greenspan, Insider; Flora, Glossy) due to their prominent position as influencers in the social media sphere. People put a great amount of trust in these influencers, with whom they feel a strong personal relationship: vectors of “parasocial interaction” who attain a “microcelebrity” status (Cook) that implies reliability.

Bell also points out that the spread of misinformation on Instagram is mainly facilitated by memes and visual imagery presented by these influencers, which makes it difficult for the platform to detect misinformation. This sentiment is echoed by Rebecca Lewis in Lorenz's article, as she notes that on Instagram, “memes pages and humor (are) a really effective way to introduce people to extremist content,” [...] “It's easy, on Instagram, to attach certain hashtags to certain memes and get high visibility” (Lorenz). We observed that the tribes we had devised shared similar visual tactics.

As the network distance increases in our visualisations, non-COVID-19 conspiracy theories show up in separate clusters. We can also observe a shift in hashtag discourse as the months of our timeframe progress.

During January and February, COVID-19 hashtags were spread out and had a low presence. During March and April, COVID-19 hashtags became central to the narrative around conspiracy theories. Finally, during May and June, hashtag discourse shifted away from COVID-19 hashtags, and conspiracy theories themselves became the center of conversation, with overall activity increasing as time went on.

Visual Analysis

We scraped the photos (Figure 5) associated with the 2,000 most liked Instagram posts with conspiratorial content from February until June. Almost half of this content was unavailable, either because it was taken down or because the link to the photos had expired, which suggests possible deplatforming by Instagram. Of the remaining thousand Instagram photos, almost half consisted of screenshots from Twitter, used to present evidence for various claims or as probes for further discussion and/or ridicule. Other visual motifs include user-generated memes and the use of iconic faces such as those of politicians.

Figure 5: 2,000 most liked Instagram posts February-June with conspiratorial content - ImageSorter

TikTok

TikTok is an entertainment-oriented video-hosting platform distinguished by its immersive user experience and stimulation mechanisms (Zhou). Videos on TikTok range from 15 to 60 seconds. One of the platform's distinctive features is the possibility for users to choose the background music of the videos they post. Originating in the app “Musical.ly”, founded in 2014, the Chinese video-sharing social networking service owned by ByteDance (Tognini) has since evolved into the most downloaded social media app, with over two billion global installs (Singh). In June 2018, TikTok announced that its number of monthly active users worldwide had reached 500 million (Jon).

It has been reported that knowledge sharing is one of the most popular categories of content on TikTok (CBNData). Apart from its increasing relevance for ordinary users and advertisers alike (Sehl), the platform also presents itself as a fruitful environment for the spread of alternative news and misinformation, as content moderation is hampered by the use of obfuscation techniques, such as the concealment of conspiracy theories behind trending pop songs and filters (Dickson). These difficulties in content moderation, paired with the platform's predominantly young user base, nearly 70 percent of whom are between 13 and 24 years old (Sehl), and seemingly knowledge-hungry target group (Dellinger), result in a particularly dangerous space for the mainstream adoption of COVID-19 conspiracy theories (Dickson) and the establishment of different conspiracy tribes.

The spread of misinformation about COVID-19 on TikTok has so far received very limited attention from the research community. To name a few of the existing studies: Weimann and Masri (2020) conducted a study of the far-right's use of TikTok based on a systematic content analysis of videos posted in early 2020. Waghre and Seth (2020) analysed the responses of digital platforms to the information disorders around COVID-19 in India. A group of scholars from the Reuters Institute for the Study of Journalism identified some of the main types, sources, and claims of COVID-19 misinformation by analysing a sample of 225 pieces of misinformation published in English between January and the end of March 2020 (Brennen, Simon, Howard & Nielsen, 2020). While TikTok was part of the codebook, the study does not provide any distinctive results concerning this particular platform.

From a new media research perspective, then, it is imperative to gain a better understanding of through whom, and via which modes of reproduction, COVID-19 conspiracies spread on TikTok, and which conspiracy tribes, if any, exist on the platform.

Our study on conspiracy theories about COVID-19 on TikTok focused on the top users publishing content labelled with COVID-19 conspiracy-related hashtags and the most frequently used hashtags within that dataset. Both the content creator and viewer engagement were analysed.

User analysis

As Figure 6 shows, the analysis of the top users within our dataset revealed that the shares of conspiracy theorists (45%) and ordinary users (35%) publishing TikTok videos labelled with COVID-19 related hashtags are of a similar magnitude. Other user groups, such as health professionals, politically affiliated accounts, celebrities or influencers, appear comparatively rarely. An equally small share of users that could not clearly be categorized with any of our user labels was classified as “other”.

Figure 6: TikTok user analysis results

Conspiracy theory tribes among conspiracy user accounts on TikTok

By taking a closer look at the accounts of the conspiracy theorists within our dataset, we discovered that the most prominent conspiracy theory tribes are Trumpists (27%) and far-right users (27%), followed by alternative media accounts (14%). The qualitative analysis and close reading/watching of the videos published by these conspiracy-related accounts showed that most of their content was not focused on COVID-19 per se; rather, they started posting COVID-19 conspiracy videos after the outbreak of the virus. Furthermore, Figure 8 illustrates the development of these accounts' volume output over time and clearly shows a stark increase in postings from the beginning of the COVID-19 crisis in January 2020 to the end point of our analysis in July 2020.

Figure 7: conspiracy theory accounts analysis results

Figure 8: Volume output of conspiracy theory accounts over time

Analysis of most frequently used hashtags

The analysis of the 100 most frequently used hashtags within our dataset showed that the largest share of hashtags (40%) was not related to specific conspiracy theories, in addition to nearly 15% of hashtags that referred to general COVID-19 content. However, the second largest share of top hashtags was categorized as QAnon-related (28%), while less than 10% of the most frequently used hashtags are affiliated with Bill Gates or “NWO” conspiracy theories. Surprisingly, even though 5G conspiracy hashtags appeared to be trending at the time of our initial qualitative study of the app, they accounted for only one percent of our top hashtags (as did “deep state” conspiracy theory hashtags). These findings are coherent with the results of the network analysis of the most frequently used hashtags. As Figure 10 shows, while a cluster of QAnon conspiracy theory related hashtags can be observed, hashtags affiliated with other conspiracy theories and general COVID-19 content appear rather unconnected within our dataset.

Figure 9: conspiracy theory affiliation of 100 most frequently used hashtags in dataset

Figure 10: Network analysis of most frequently used hashtags

Viewer engagement analysis

The analysis of the most liked videos within our dataset showed that 90 of the 100 most engaged-with videos on TikTok were published by dedicated conspiracy theory accounts. Moreover, a closer look at the individual users showed that all ten non-conspiracy videos, labelled as “influencer”, were published by the same account. Similarly, among the conspiracy theory accounts, we noted a recurring set of content creators. Other user categories, such as health professionals, celebrities or politically affiliated users, did not appear among the 100 top liked videos. These findings stand in stark contrast to the results of our user analysis, which present a more diverse distribution of user categories, as well as an almost equal share of conspiracy theory accounts and ordinary users within our dataset.

Figure 11: recurring accounts among the content creators of the most liked videos (red: conspiracy account, blue: influencer)

As for the tribal affiliation of the conspiracy theory accounts within the 100 top liked videos, the largest share of accounts is affiliated with the far-right (32.2%), followed by religious fundamentalists (22.2%), alt-health (18.9%) and Trumpists (13.3%). A comparatively small number of users were labelled as “anarchist” and “other” (6.7% each). Except for the absence of alt-media accounts among the most liked videos, these findings are similar to the results of the top user analysis.

Figure 12: tribal affiliation of conspiracy theory users publishing most liked videos

Telegram

Figure 13: Posts containing specific conspiracy terms in relation to their engagement

Figure 14: Posts combining two or more conspiracy terms in relation to their engagement

Telegram research was particularly interesting insofar as it provided evidence contradicting one of our initial hypotheses. As we can see in Figure 14, the number of posts combining multiple conspiracy theories remains relatively stable over the entire period, indicating that there was no convergence effect; rather, conspiracy theories related to the COVID-19 crisis were presented together from the outset on this platform.

This initial finding is contextualised by the way in which Telegram channels interacted with the outside world. The evolution of the posts clearly mirrors that of the virus: content and engagement begin to grow in February, when the epidemic took hold in Asia, and reach their peak in early March as it moved to Europe and the United States. The channels react to events in real time, including to the spread of conspiracy theories, as we can see in the multiplication of posts related to 5G in mid-April. However, engagement does not mean approval. Figure 15 shows the most popular post related to 5G conspiracy theories, but the author does not seem to really believe in them. Instead, he seems delighted by the way in which people reacted to the pandemic, without expressing any thoughts regarding the veracity of the conspiracy.

Figure 15: Most engaged with post related to the 5G conspiracy theory

Moreover, some conspiracy theories that were popular on other platforms are openly mocked on Telegram. For instance, the QAnon conspiracy theory, which has a large audience on Instagram, TikTok and YouTube, appears only 17 times in our Telegram dataset. None of those 17 posts shows any sign of belief in the conspiracy theory, and most of them openly mock it, as we can see in Figure 16.

Figure 16: Post mocking QAnon conspiracy theories

On the other hand, the conspiracy theories that did find some purchase on Telegram were associated with more conventional actors, such as China. Discussions of the geopolitical implications of the virus and of national reactions dominate the conversation. While these discussions were tainted with racist, xenophobic and conspiratorial perspectives, the most engaged-with posts were news pieces covering international reactions to the virus (Figure 17). Specifically conspiratorial posts gathered fewer readers and seemed less interesting to users than the news coverage.

Figure 17: Most engaged-with post related to COVID-19, describing diplomatic tensions between China and Iran.

All in all, the Telegram findings indicate a platform-specific engagement with COVID-19 conspiracy theories that stands in stark contrast to that of the other platforms. The public channels combined conspiracy theories from the outset and throughout the crisis, and engaged with many of them from an outsider's perspective. The conspiracy theories that did feature on Telegram were less inclined to provide overarching narratives such as QAnon, and rather tried to explain certain aspects of current affairs.

YouTube

Figure 18: The COVID-19 Conspiracy Influence Network. This is an exploratory look into the network of conspiracy dissemination and influence on YouTube.

Our research points to the convergence of tribes around the theme of COVID-19 in what we call the COVID-19 Conspiracy Influence Network, which links creators active in the dissemination of ‘conventional’ conspiracies (e.g. 9/11, deep state) and Trumpist theories (e.g. QAnon), as well as alternative medicine aficionados and religious fundamentalists.

From our visualization, it becomes clear that Dustin Nemos and Jason Goodman are the principal nodes in this network, meaning that they host the most creators. As this research is still in an exploratory phase, other channels in our dataset may also be powerful connectors in this network; in this report we focus on an analysis of these two. Both channels fall into the ‘alt-media’ category, offering a platform to a wide variety of viewpoints, and therefore provide a space of convergence for different conspiracy tribes.

Nemos describes himself on his YouTube channel as ‘a freedom-maximalist, Voluntaryist, Autodidact Polymath, Husband, Father, Entrepreneur, Farmer, Trend Watcher, Avid Researcher and hobbyist Economist, holistic researcher, Philosopher, And Political Talking Head’ (Krieger, n.d.) and defines his goal as being ‘a good dad, be a good person, be free, and destroy evil with logic, evidence, reason, and compassion’ (ibid.). Known as a conspiracy theorist and author of a bestselling book about QAnon, he recently attempted to capitalize on the pandemic by selling colloidal silver as a supposed treatment for COVID-19 through RedPill Living, an online shop on ecommerce platform Shopify (Hananoki, 2020). In fact, it is not uncommon for the influencers in our network to maintain some type of online shop or to collaborate with brands and offer promotional codes to their audience, both of which are prevalent practices among brand influencers (Abidin, 2016b).

Jason Goodman, on the other hand, hosts a show called Crowdsource the Truth on his YouTube channel, which he describes as a ‘community sponsored news source’ (Goodman, n.d.). This tendency towards claiming to offer an alternative source for news and political analysis is prevalent in Lewis' Alternative Influence Network and generally appears to be the trend among many of the content creators on our list, who brand themselves as alternative media and antidotes to state propaganda. A number of influencers in our list maintain one or multiple backup channels and are active on alternative platforms popular with the alt-right, such as Gab, Bitchute, or Telegram. These are practices which they view as necessary in the face of the supposed censorship and suppression they experience from mainstream platforms and the liberal establishment.

Figure 19: Topical evolution of Dustin Nemos’ channel. Size based on video view count. Visualization made with RAWGraphs.

Figure 20: Topical evolution of Jason Goodman's channel. Size based on video view count. Visualization made with RAWGraphs.

Additionally, we found that already established conspiracy theories, for example those regarding 5G or the deep state, have consolidated around COVID-19, and that this engagement has provided a surge of viewership and visibility for these channels. The practice of featuring other creators is quite significant, as it produces a kind of networked visibility for both the host and the guest, which is especially pertinent in the case of extreme and marginal figures.

Figure 21: Engagement with Dustin Nemos with guest appearances highlighted.

Figure 22: Engagement with Jason Goodman with guest appearances highlighted.

In Figures 21 and 22 we zoom in on the engagement and hosting practices of these two influencers, Dustin Nemos and Jason Goodman, in order to underline three points. Dustin Nemos' channel is highly connected to other conspiracy theory YouTubers through co-appearances and frequently features thought leaders and fringe celebrities: individuals like Robert Kiyosaki, author of the self-help book Rich Dad, Poor Dad, and Judy Mikovits of Plandemic hoax fame. These are people with no known explicit connections to conspiracy theory networks on YouTube and who do not owe their popularity or platform to digital media.

We can hypothesize that the main purpose of inviting them is to build credibility and increase the visibility of videos. Among Nemos' guests: Robert David Steele, a former CIA case officer and proponent of open-source intelligence (51.4k subscribers on YouTube), and Dr. Dave Janda (Operation Freedom on YouTube), a retired orthopedic surgeon with 157k subscribers and 49k Twitter followers. Among Goodman's guests: Field McConnell, a career military and airline pilot and 9/11 conspiracy theorist (13.1k Twitter followers); Quinn Michaels, a QAnon blogger and AI specialist whose YouTube videos are distributed through different channels and have up to 80k views; Kevin Shipp, a former CIA officer turned whistleblower (54.3k YouTube subscribers and 164.4k Twitter followers); Dan Hanley, a former United Airlines B-777 captain and 9/11 conspiracy theorist; and Larry Nichols, an American political commentator/grifter known for alleging various conspiracies involving Bill Clinton (10.2k YouTube subscribers).

Most of these guests have an established presence on various social media platforms and have cultivated an audience, for example via publications and media appearances. This means that a certain audience is already formed around each guest, which can develop into potential viewership for the channel.

6. Discussion

Instagram

Data and visualizations seem to indicate that there has been a convergence of conspiracy tribes around COVID-19-related hashtags and discourse on Instagram. Judging from the shifts from little, unrelated presence, to convergence and envelopment of COVID-19 hashtags, to the formation of a dense cluster that has recentered itself around conspiracy theory hashtags and away from COVID-19 hashtags, one could hypothesize that conspiracy tribes opportunistically instrumentalized the COVID-19 pandemic, related events and socio-cultural tensions to expand their own network. The notable increase in activity and conspiracy-related but non-COVID-19-related hashtag use that we observed in our data also supports that hypothesis.

Based on previous research on normiefication (Briones et al., OILab, 2019; Greenspan, Insider), the entrance of these theories into mainstream media, the vernacular and the surface-level web population, as observed in the data collected for this research, can be interpreted as further proof that conspiracy theories have entered public awareness and discourse.

Another concept to consider is the Overton window, as related to deep vernacular conspiracy theories (Peeters, OILab 2020). As the subject of research has been the use of conspiracy theory related hashtags by Instagram users, the data and visualizations indicate not only that these theories have penetrated mainstream awareness and vernacular (normiefication), but also, as the increase in hashtag use and particularly the overlap of the QAnon and Trump related hashtag clusters seem to show, that the use of these concepts and their associated language has been accepted to a certain degree into the mainstream, which would amount to a widening of the Overton window.

Instagram has been described as a “hotbed” (Bell, Engadget) for anti-vaccination conspiracy theories. This information gains traction through the algorithms and search recommendations used by the platform. The platform largely fails to fact-check its content and does not ban anti-vaccination content (Bell), despite strides made by other platforms, e.g. Facebook. However, there is evidence that fact-checking on these platforms could be making fake news and misinformation worse (Gilbert, Vice). The platform recommends many anti-vaccination accounts, which consequently sow fear among its users. These accounts, of course, do not proffer trustworthy and reputable information: many promote an alleged connection between vaccines and autism, or support the conspiracy theory that Bill Gates had a hand in the pandemic (Bell). These accounts also recommend content concerning other conspiracy theories, which supports our hypothesis about the convergence of different tribes.

Closely related to the anti-vaccination conspiracy theories, we observed a notable presence in our research of the 5G hashtag and related conspiracy hashtags (Moonshot). These link 5G towers to the cause of COVID-19, and to elite figures such as Bill Gates and George Soros (Deccan Chronicle). Whitney Phillips, in the same article, describes such elite figures as “abstract boogeym(e)n” (Deccan) to whom blame is assigned in these unprecedented times. This link consequently decreases the amount of trust people have in vaccinations, to the point where people do not trust vaccines to protect their children (Deccan). Gates has been the scapegoat of many conspiracy theories over the years; the trend has simply received renewed vigour during the pandemic.

For future research, assigning multiple coding categories to individual hashtags could yield more nuanced insights into the dynamics of conspiracy tribes, as we found many hashtags to be related to multiple theories and/or tribes; a sketch of what this could look like follows below.
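A minimal sketch of such multi-label coding, where the codebook entries and category names are hypothetical:

```python
# Hypothetical multi-label codebook: each hashtag maps to a set of categories,
# so overlaps between theories/tribes can be counted rather than lost.
from collections import Counter
from itertools import combinations

codebook = {
    "plandemic":    {"covid", "deep_state"},
    "billgates":    {"covid", "anti_vax"},
    "qanon":        {"qanon"},
    "adrenochrome": {"qanon", "deep_state"},
}

# How often do two categories attach to the same hashtag? A simple proxy for
# the entanglement of conspiracy tribes.
overlap = Counter()
for categories in codebook.values():
    for pair in combinations(sorted(categories), 2):
        overlap[pair] += 1

print(overlap.most_common())
```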

TikTok

To make sense of how COVID-19 conspiracies spread on TikTok, and ultimately to answer the research questions posed within this project, we first illustrate the platform’s recommendation system, which informed the formation of our initial hypothesis.

By the platform’s own account, TikTok distributes content through a recommendation system that suggests content to viewers based on their preferences and previous interactions with the app. Recommendations rely on a number of factors, including user interactions (i.e. liked or shared content, followed accounts), video information (i.e. captions, sounds and hashtags), as well as basic device and account settings. Furthermore, content is spread through the so-called “For You” page, which aims to “invite new users to select categories of interest […] to help tailor recommendations to their preferences” (TikTok Newsroom). Content that initially appears on an unpersonalised “For You” page is selected for its popularity and “trend-factor”. Thus, popular videos with a high retention rate (i.e. videos that are watched all the way through by a large number of viewers) spread exponentially on the platform (Feldman). As “getting on a lot of users’ For You pages is the fastest way to get a follower base” (ibid.), users try to growth-hack their way onto the “For You” page by labelling their content with trending sounds and hashtags. The popularity of hashtags, in particular, can easily be checked through a quick look at their current view counts in the app.

Additionally, the platform explicitly prohibits the publishing of misinformation in its Community Guidelines, actively calls for the reporting of intentionally deceptive content (Community Guidelines), and partners with third-party fact-checking organisations to remove misinformation (Strapagiel). Therefore, to circumvent the removal of actual COVID-19 conspiracy content, users “hide” COVID-19 conspiracy theories behind trending pop songs, original sounds and filters, rather than labelling them with the respective hashtags (Dickson). This still allows their content to spread virally and be recommended to a large number of viewers, while slipping through the cracks of the content moderation system.
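TikTok’s actual ranking model is proprietary; purely as an illustration of the recommendation logic described above, a toy scoring function with hypothetical weights might look as follows.

```python
# Toy sketch only: TikTok's real ranking model is proprietary. The weights and
# formula here are hypothetical, illustrating that retention (videos watched to
# the end) and engagement both push a video onto more "For You" pages.
from dataclasses import dataclass

@dataclass
class Video:
    likes: int
    shares: int
    views: int
    completed_views: int  # views watched all the way through

def for_you_score(video: Video) -> float:
    """Hypothetical trend score combining retention and engagement."""
    if video.views == 0:
        return 0.0
    retention = video.completed_views / video.views              # high retention -> viral spread
    engagement = (video.likes + 2 * video.shares) / video.views  # shares weighted above likes
    return 0.7 * retention + 0.3 * engagement

print(for_you_score(Video(likes=900, shares=150, views=10_000, completed_views=8_000)))
```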

Considering this, one could assume that actual conspiracy content is made unrecognizable at first sight through obfuscation techniques and spreads through individual accounts rather than hashtags, while COVID-19-conspiracy-related hashtags are used opportunistically by ordinary users to promote their own unrelated content. This would result in the majority of videos labelled with COVID-19-conspiracy-related hashtags being unrelated to specific conspiracy theories, and in ordinary users dominating the space. A closer look at the findings of our user and hashtag analyses reveals that this hypothesis can, at least partially, be confirmed.

Even though the results of our user analysis show a dominance of conspiracy theory accounts within our dataset, an almost equal share of accounts belongs to ordinary users. This indicates that such users do, indeed, use COVID-19-conspiracy-related hashtags for their “trend-factor”, labelling their own unrelated content to reach a broader audience and gain a larger follower base, rather than being interested in publishing COVID-19 conspiracy content on their accounts.

This phenomenon can even be observed among the conspiracy theory accounts within our dataset, as our analysis revealed that very few accounts specifically focus on COVID-19 conspiracy theories; most only started producing such content after the outbreak of the crisis. The fact that the largest share of CT accounts in our dataset is affiliated with the far right and Trumpists, rather than conspiratorial tribes arguably more strongly associated with COVID-19 conspiracies, such as alt-media or alt-health accounts, further speaks for an opportunistic use of COVID-19-related hashtags among TikTok content creators.

Moreover, our hashtag analysis showed that the most frequently used hashtags within our dataset are unrelated to specific conspiracy theories, which indicates that hashtags are not the dominant mode of distribution on TikTok; actual COVID-19 conspiracy content spreads instead through individual accounts and original sounds. Considering the platform’s stance towards misinformation and the moderation loopholes users have discovered, the claim can be made that actual conspiracy theorists, or users intending to spread COVID-19 misinformation, truly “hide” their content to avoid censorship by the platform.
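A minimal sketch of the frequency analysis underlying this finding, assuming a list of scraped videos each carrying its hashtags (the video records and field names are hypothetical):

```python
# Count hashtag frequencies across scraped videos and flag conspiracy-coded tags.
# The video records are invented; CONSPIRACY_TAGS would be drawn from Appendix 1.
from collections import Counter

videos = [
    {"id": "a", "hashtags": ["fyp", "foryou", "plandemic"]},
    {"id": "b", "hashtags": ["fyp", "dance"]},
    {"id": "c", "hashtags": ["plandemic", "billgates", "fyp"]},
]

CONSPIRACY_TAGS = {"plandemic", "billgates", "scamdemic"}

counts = Counter(tag for video in videos for tag in video["hashtags"])
for tag, n in counts.most_common():
    marker = "CT" if tag in CONSPIRACY_TAGS else "--"
    print(f"{marker} #{tag}: {n}")
```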

Yet, the results of our viewer engagement analysis indicate that viewers are highly interested in videos published by actual conspiracy theorists and find such content regardless of its hashtag labelling. However, users do not seem to turn to TikTok as a source of alternative news, but rather show more interest in propagandistic content on COVID-19 published by the far right, Trumpists, and religious fundamentalists. In light of this finding, a further analysis of viewer engagement with COVID-19 conspiracy videos, for example through a close reading of comment sections, could be an interesting starting point for further research, as it would help answer whether viewers actively search for this kind of content or whether the platform’s mainstream character simply absorbs them into the world of conspiratorial tribes without their awareness.

Telegram

The platform-specific findings for Telegram stand in stark contrast to those of the other platforms. The data paints a complex picture in which Telegram users appear hyper-aware of both the political and social dimensions of the virus and of the conspiracy theories formed to explain them. While the public channels that we analysed were reluctant to partake in many of the most popular conspiracy theories, they still broadcasted some of the most extreme positions. Our findings indicate that the Telegram audience is more knowledgeable about conspiracy theories, and considerably more radicalised, than the public on other platforms.

In order to understand this divergence, we must first understand how Telegram differs from the other social media platforms in this research. For starters, Telegram is considerably smaller, with 400 million users (Telegram 2020), as opposed to TikTok’s 800 million, Instagram’s 1 billion, and YouTube’s 2 billion (Clement 2020). Privacy and security are at the core of Telegram’s brand, and users who join this network tend to be looking for very specific affordances that provide them with a sense of control (Rogers 2020). These affordances are at the core of the platform’s “private sociality” mentioned in the methodology, which is an inversion of the traditional social media logic present in the other three platforms (idem). Last but not least, Telegram has an extremely loose approach to content moderation, allowing most media to be uploaded and even hosted on the platform. The combination of these factors creates a welcoming environment for some of the most extreme views online. By privileging instant messages with specific users over the creation of an online self broadcasted to the community, Telegram has become a privileged space for exchanges between groups of people who have little to no interest in cultivating a public profile (Shehabat et al. 2017).

Telegram’s emphasis on privacy places the platform at the far end of the radicalisation path. Those who end up in these public channels, and in the private groups that reference them, tend to be embedded in radical movements already and do not need to be convinced (Ebner 2019). While most conspiracy theorists online seek to spread their positions and gain followers, the Telegram public is already convinced and has a defined ideological position. It is instructive to compare our results to Robert Evans and Jason Wilson’s journalistic investigation of the Boogaloo movement (2020). Evans and Wilson examine the evolution of the movement across platforms, starting from 4chan and including Facebook groups and websites associated with the militia movement. Their investigation brought them to some of the Telegram public channels included in our research, and they concluded that these spaces were the most radical of all, where the Boogaloo Bois intersected with fascism and open neo-nazism.

A similar process seems to be at play in relation to COVID-19 conspiracy theories, insofar as they are co-opted and embedded in ideological systems of belief that transcend individual theories. For instance, a neo-nazi channel might spread the conspiracy theory that the virus is a bioweapon created in China and spread to the West, but rather than present the theory and let it run, it will weaponise it to expose the supposed decadence of the West and an imminent race war. Other ideologically similar groups might ditch this approach altogether and treat the theory as nonsense, arguing instead that the virus is a climate catastrophe that brings to the fore the contradictions of neoliberalism. Regardless of whether they agree with a given theory, these discussions tend to be considerably more sophisticated than the conversations surrounding conspiracy theories on other platforms.

Telegram as a platform displays a social logic contrary to that of most social media: private conversations are privileged and public interactions are a secondary feature. This design encourages the use of the app by closed and ideologically cohesive groups in which the large majority of members are already convinced of certain ideological positions. The logic at play here differs considerably from that of mainstream social media, where conspiracy theorists aim to persuade a large audience and radical perspectives cannot expect to find widespread agreement without engaging in a process of debate.

YouTube

Since 2018, YouTube has become the target of much criticism for providing a platform for extreme political influencers, earning the title of ‘one of the most powerful radicalizing instruments of the 21st century’ (Tufekci, 2018). On July 1st, 2020, after years of public backlash and deliberation, the platform concluded its 48-hour purge, removing major ‘problematic’ figures such as David Duke, Richard Spencer, and Stefan Molyneux (Newton & Shiffer, 2020). As Lewis has pointed out in her work on the Alternative Influence Network on YouTube: ‘Influence is not created in a vacuum—it occurs within, and propagates through, social networks. Part of the way influencers build followings is by becoming nodes around which other networks of opinions and influencers cluster. One of the most effective ways to network on YouTube is by referencing and including other people in video content’ (Lewis, 2018, p. 5).

The alt-right and conspiracy theory space on YouTube is a contingent one, with deplatformed influencers often migrating to fringe platforms like Gab, BitChute, and Telegram (Rogers, 2020). However, this ‘deplatforming’ is by no means final, as the presence of these creators, and therefore their ideas, can still be found in the channels of those who feature them. As Lewis argues, such influencers are visible not only on their own channels, but also through guest appearances on other channels. For instance, Stefan Molyneux can still be found on YouTube in conversation with influencers such as Steven Crowder. In fact, this practice of hosting not only confers legitimacy on creators, but also increases their visibility within the network. These guest appearances remain available to vast audiences and provide a pathway to Molyneux’s website and books, allowing him to continue enticing and persuading audiences. Thus, while a creator’s account has been terminated, their influence on the platform has not. The question that follows is whether YouTube would, or should, go as far as to remove all videos of influencers hosting such problematic figures from its platform.

While Lewis has been critiqued for assuming a passive audience that is uncritical of the information presented to it (see Holt, 2019), it does seem that formulating an argument, and possible policy, for the removal of extreme political content from YouTube requires a broader understanding of the influence dynamics on the platform and of how hosting practices can help radicalize moderate viewers. One of the most effective ways to network on YouTube is by referencing and including other people; when moderate influencers host or are hosted by extreme figures, the latter not only increase their visibility but are also arguably legitimized in the eyes of the public (Lewis, 2018). While the term influencer is often associated with feminized and apolitical forms of digital labor and has often been disregarded as a conduit of political influence, hypervisible online celebrities do in fact contribute to the dissemination of political ideas (Abidin, 2016a; Lewis, 2018).

Hosting practices help consolidate the visibility of these conspiracy channels and enable their mutual legitimization in a network of co-appearances. Through these co-appearances, links between different conspiracy theories and tribes are created, forming a complex web that enmeshes groups as diverse as QAnon supporters and Trump followers with proponents of alternative medicine and spiritualism. As COVID-19 has been mobilized by diverse actors, we can hypothesize that it serves as an attention-hacking technique to increase visibility and help disseminate, or ‘normify’, deeply entrenched conspiracy theories via engagement with a topical issue, thus potentially helping shift the Overton window to the right (see Peeters et al., 2020). However, as these creators often capitalize on their viewership in ways similar to brand influencers, it is also possible that the elevated visibility is desirable simply because it is financially lucrative. This puts into question the sincerity of engagement with fringe and conspiratorial content more broadly.

New entries are introduced to the network and new links are formed as creators collaborate with each other; some of the individuals in our dataset are particularly prolific and frequently feature other creators on their channels, while others have since removed their content or had their channels deleted. The COVID-19 conspiracy landscape is therefore always in flux; we acknowledge that our work, much like Lewis’s, cannot capture it in its entirety. Rather, this research should be viewed as an initial, exploratory glimpse into what we call the COVID-19 Conspiracy Influence Network, and more work should be conducted in this direction to arrive at a more complete approximation of it.
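To illustrate how such a co-appearance network can be assembled, and why hosting confers visibility, consider the sketch below. Apart from the Crowder/Molyneux pairing mentioned above, the host-guest pairs are hypothetical placeholders, and the method shown is a simplification of network analyses like Lewis’s, not our exact procedure.

```python
# A sketch of a co-appearance network: a directed edge points from the hosting
# channel to the featured guest. Only the Crowder/Molyneux pairing is documented
# in the text; the other pairs are hypothetical.
import networkx as nx

appearances = [
    ("StevenCrowder", "Stefan Molyneux"),  # documented example from the text
    ("HostChannelA", "Stefan Molyneux"),   # hypothetical
    ("HostChannelA", "GuestB"),            # hypothetical
    ("HostChannelC", "GuestB"),            # hypothetical
]

G = nx.DiGraph()
G.add_edges_from(appearances)

# Frequently hosted figures gain visibility across audiences; in-degree is a
# crude proxy for that borrowed legitimacy.
for node, indegree in sorted(G.in_degree(), key=lambda pair: -pair[1]):
    print(node, indegree)
```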

All in all, these content creators appear to have established an alternative media ecology for the dissemination of reactionary ideology, premised on conventional influencer virtues (relatability, authenticity, and accountability) on the one hand, and a countercultural, anti-establishment social and political identity on the other.

7. Conclusion

COVID-19 has become the perfect facilitator of conspiracy theories (Quattrociocchi in Maxmen and Ball, Nature). With social media used now more than ever, online platforms have become the marketplace for rumours to spread (Maxmen and Ball; Chandra and Pal, 2019). The pandemic has become part of a “hate multiverse” (Maxmen and Ball) which fuels other conspiracy theories and “focus(es) an initially rather diverse and incoherent set of messages into a few dominant narratives” (Maxmen and Ball).

One feature that makes these networks interesting to analyse is their capacity to draw in outside users through what Johnson and his team call “wormhole” links: shortcuts from a network engaged with quite different issues. The hate multiverse therefore “acts like a global funnel that can suck individuals from a mainstream cluster on a platform that invests significant resources in moderation, into less moderated platforms like 4Chan or Telegram” (Maxmen and Ball). As a result, Johnson says, racist views are starting to appear in anti-vaccine communities too. “The rise of fear and misinformation around COVID-19 has allowed promoters of malicious matter and hate to engage with mainstream audiences around a common topic of interest, and potentially push them toward hateful views,” his team says in the paper. This “global funnel” (Maxmen and Ball) effect has turned social media platforms into a vector for hate and “an entry point into the internet’s darkest corners” (Lorenz, The Atlantic).

This research project aimed to understand how COVID-19 conspiracy theories spread across Instagram, TikTok, Telegram, and YouTube. We found that engagement with such theories across platforms remained relatively low while the virus was still contained to China (January/February), but started to increase in March and peaked in April/May, when COVID-19 became a global health and economic crisis. The findings thus seem to confirm the aforementioned notion of a “conspiracy boom” around the corona crisis. The hypothesis of a convergence of different conspiracy tribes and the theories they adhere to was harder to prove, although some platform-specific findings do point in that direction.

8. References

Abidin, C. (2016a). ‘Aren’t These Just Young, Rich Women Doing Vain Things Online?’: Influencer Selfies as Subversive Frivolity.’ Social Media + Society 2(2), doi:10.1177/2056305116641342.

Abidin, Crystal. Please Subscribe!: Influencers, Social Media, and the Commodification of Everyday Life. PhD dissertation, The University of Western Australia, 2016. UWA Research Repository, < https://api.research-repository.uwa.edu.au/portalfiles/portal/9781681/Abidin_Crystal_2016.pdf >

Anderson, Dwayne. Instagram Follower Magnet Training Guide. United States: Estalontech, 2019.

Argentino, Marc-André. “QAnon conspiracy theory followers step out of the shadows and may be headed to Congress.” The Conversation. July 8 2020. Accessed July 9 2020. < https://theconversation.com/qanon-conspiracy-theory-followers-step-out-of-the-shadows-and-may-be-headed-to-congress-141581 >

Bell, Karissa. “How Instagram’s anti-vaxxers fuel coronavirus conspiracy theories.” Engadget. 15 May 2020. Accessed 9 July 2020. < https://www.engadget.com/instagram-anti-vaxxers-coronavirus-conspiracy-theories-173021562.html >

Berman, Elizabeth A. "An Exploratory Sequential Mixed Methods Approach to Understanding Researchers’ Data Management Practices at UVM: Integrated Findings to Develop Research Data Services." Journal of eScience Librarianship 6(1), 2017. Pp. 1-24.

Brennen, J. Scott, et al. "Types, sources, and claims of Covid-19 misinformation." Reuters Institute (2020).

Bounegru, Liliana. “Secondary Orality in Microblogging.” Masters of Media. University of Amsterdam. October 13 2008. Accessed 15 July 2020. <http://mastersofmedia.hum.uva.nl/2008/10/13/secondary-orality-in-microblogging/>.

Bucher, Taina, and Anne Helmond. The SAGE Handbook of Social Media. The United Kingdom: SAGE UK Incorporated, 2017.

Bordalo, Pedro, et al. Older People are Less Pessimistic about the Health Risks of Covid-19. No. 27494. National Bureau of Economic Research, 2020. https://www.nber.org/papers/w27417

CBNData. 2017 短视频行业大数据洞察 [2017 Short Video Industry Big Data Insights]. CBNData 专业报告 [CBNData Professional Report], 2017.

Chandra, Priyank, and Joyojeet Pal. "Rumors and Collective Sensemaking: Managing Ambiguity in an Informal Marketplace." Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. 2019. Pp. 1-12.

Cline, Andrew. “How to understand Obamagate - Donald Trump’s latest conspiracy theory.” The Conversation. 26 May 2020. Accessed 13 July 2020. < https://theconversation.com/how-to-understand-obamagate-donald-trumps-latest-conspiracy-theory-138987 >

Clement, Jessica. “Most popular social networks worldwide as of April 2020, ranked by number of active users.” Statista. 2020. Accessed 13 July 2020. < https://www.statista.com/statistics/272014/global-social-networks-ranked-by-number-of-users/ >

Community Guidelines. TikTok. Accessed 16 July 2020. < https://www.tiktok.com/community-guidelines >

Cook, Jesslyn. “A Wave of Radicalized Influencers is Mainstreaming COVID-19 Conspiracy Theories.” Huffpost. 19 May 2020. Accessed 9 July 2020. < https://www.huffpost.com/entry/radicalized-influencers-coronavirus-conspiracy-theories_n_5ec3fc10c5b6297713ce66ff >

Covid19.WHO.int. “WHO Coronavirus Disease (COVID-19) Dashboard.” Updated daily. World Health Organization. Accessed 15 July 2020. < https://covid19.who.int/ >

danah boyd & Kate Crawford. Critical Questions for Big Data. Information, Communication & Society, 15:5, 2012, 662-679.

Dellinger, A. “Conspiracy theories are finding a hungry audience on TikTok.” Mic. Accessed 16 July 2020. < https://www.mic.com/p/conspiracy-theories-are-finding-a-hungry-audience-on-tiktok-23620372 >

Dickson, E. “On TikTok, COVID-19 Conspiracy Theories Flourish Amid Viral Dances.” Rolling Stone. Accessed 16 July 2020. < https://www.rollingstone.com/culture/culture-features/tiktok-conspiracy-theories-bill-gates-microchip-vaccine-996394/ >

Dong, Ensheng, et al. “An interactive web-based dashboard to track COVID-19 in real time.” The Lancet Infectious Diseases 20, no. 5 (2020): 533-534.

Easton, Mark. “Coronavirus: Social media 'spreading virus conspiracy theories'.” BBC. 18 June 2020. Accessed 9 July 2020. < https://www.bbc.com/news/uk-53085640 >

Ecdc.Europa.eu. “COVID-19 situation update worldwide, as of 14 July 2020.” 2020. European Centre for Disease Prevention and Control. Accessed 15 July 2020. < https://www.ecdc.europa.eu/en/geographical-distribution-2019-ncov-cases >

Ebner, Julia. Going Dark: The Secret Social Lives of Extremists. London: Bloomsbury, 2019.

Evans, Robert and Jason Wilson. “The Boogaloo Movement Is Not What You Think.” Bellingcat. May 27 2020. Accessed 15 July 2020. < https://www.bellingcat.com/news/2020/05/27/the-boogaloo-movement-is-not-what-you-think/ >

Farrar, Sarah. “The importance of tribal leadership.” Virgin. 5 September 2017. Accessed 15 July 2020. < https://www.virgin.com/entrepreneur/importance-tribal-leadership >

Feldman, B. “Unraveling the Mystery of the TikTok ‘For You’ Page” New York Magazine Intelligencer. New York Magazine. Accessed 16 July 2020. < https://nymag.com/intelligencer/2019/11/how-to-get-on-the-tiktok-for-you-page.html >

Flora, Liz. “‘I love you my beautiful #QAnon!’: When lifestyle influencers also peddle conspiracy theories.” Glossy. 24 June 2020. Accessed 9 July 2020. < https://www.glossy.co/beauty/i-love-you-my-beautiful-qanon-when-lifestyle-influencers-also-peddle-conspiracy-theories >

Gilbert, David. “It’s Official: Facebook’s Fact-Checking Is Making Its Fake News Problem Even Worse.” Vice. 5 March 2020. Accessed 10 July 2020. < https://www.vice.com/en_us/article/epgqxa/its-official-facebooks-fact-checking-is-making-its-fake-news-problem-even-worse >

Goodman, Jason. Description [YouTube channel], n.d. Accessed 9 July 2020. https://www.youtube.com/user/JasonGoodmancrowdsourcethetruth

Greenspan, Rachel E. “Lifestyle influencers are using COVID-19 to spread QAnon conspiracy theories: 'I truly believe I owe it to my audience to be more for them during this turning point in our culture'.” Insider. 15 May 2020. Accessed 9 July 2020. < https://www.insider.com/lifestyle-influencers-using-covid-19-to-spread-qanon-conspiracy-theory-2020-5 >

Hananoki, Eric. “A QAnon grifter was selling colloidal silver as a supposed coronavirus treatment and cure.” Media Matters for America. 8 April 2020. Accessed 10 July 2020. < https://www.mediamatters.org/coronavirus-covid-19/qanon-grifter-dustin-nemos-was-selling-colloidal-silver-supposed-coronavirus >

Hayes, Rebecca A. et al. "One Click, Many Meanings: Interpreting Paralinguistic Digital Affordances In Social Media". Journal Of Broadcasting & Electronic Media, vol 60, no. 1, 2016, pp. 171- 187.

Guinaudeau, Benjamin. tiktokr [Software]. GitHub. < https://github.com/benjaminguinaudeau/tiktokr >

Holt, Kristoffer. Right-Wing Alternative Media. London: Routledge, 2019.

Jenkins, H. Convergence Culture: Where Old and New Media Collide. New York: New York University Press, 2006.

Russell, Jon. “China’s ByteDance leapfrogs Uber to become world’s most valuable startup.” TechCrunch. 26 October 2018. Accessed 13 July 2020. < https://techcrunch.com/2018/10/26/chinas-bytedance-leapfrogs-uber-to-becomesworlds-most-valuable-startup/ >

Krieger, D. Description [YouTube channel], n.d. Accessed 9 July 2020. < https://www.youtube.com/channel/UCQ7VgW7XgJQjDEPnOR-Q0Qw >

Lewis, Rebecca. Alternative Influence: Broadcasting the Reactionary Right on YouTube. Data & Society. 2018. Accessed 9 July 2020. < https://datasociety.net/wp-content/uploads/2018/09/DS_Alternative_Influence.pdf >

Lorenz, Taylor. “Instagram is the Internet’s New Home for Hate.” The Atlantic. 21 March 2019. Accessed 9 July 2020. < https://www.theatlantic.com/technology/archive/2019/03/instagram-is-the-internets-new-home-for-hate/585382/ >

Lott-Lavigna, Ruby. “Inside the Worrying World of Coronavirus Truther Facebook Groups.” Vice. 15 May 2020. Accessed 9 July 2020. < https://www.vice.com/en_uk/article/m7jjp3/coronavirus-conspiracy-theories-5g-lockdown >

Maffesoli, Michel. The time of the tribes: The decline of individualism in mass society. London: Sage Publications Ltd, 1995.

McLuhan, Marshall. The Gutenberg Galaxy. New York: Signet, 1969.

Maxmen, Amy and Philip Ball. “The epic battle against coronavirus misinformation and conspiracy theories.” Nature. May 27 2020. Accessed July 9 2020. < https://www.nature.com/articles/d41586-020-01452-z >

Moonshot. “COVID-19: the #5GCoronavirus conspiracy on Instagram and Twitter.” April 29 2020. Accessed 9 July 2020. < http://moonshotcve.com/covid-19-the-5gcoronavirus-conspiracy-on-instagram-and-twitter/ >

N.P. “Belief in 5G COVID-19 conspiracy theories linked to violence, reveals Northumbria study.” Northumbria.ac.uk. 21 June 2020. Accessed 9 July 2020. < https://www.northumbria.ac.uk/about-us/news-events/news/belief-in-5g-covid-19-conspiracy-theories-linked-to-violence-reveals-northumbria-study/ >

N.P. “Conspiracy theories thriving online accuse Bill Gates of starting the virus outbreak.” Deccan Chronicle. 18 May 2020. Accessed 9 July 2020. < https://www.deccanchronicle.com/technology/in-other-news/180520/conspiracy-theories-thriving-online-accuse-bill-gates-of-starting-the.html >

Tuters, Marc. “The Birth of QAnon: On How 4Chan Invents a Conspiracy Theory.” OILab.eu. Accessed 9 July 2020. < https://oilab.eu/the-birth-of-qanon-on-how-4chan-invents-a-conspiracy-theory/ >

Peeters, Stijn. “Normiefication of extreme speech and the widening of the Overton window.” OILab.eu. 15 May 2020. < https://oilab.eu/normiefication-of-extreme-speech-and-the-widening-of-the-overton-window/ >

Ong, Walter. Orality and Literacy: The Technologizing of the Word. London and New York: Taylor and Francis Group, 1982.

Rieder, Bernhard. YouTube Data Tools (Version 1.10) [Software]. 2015. < https://tools.digitalmethods.net/netvizz/youtube/ >

Rieder, Bernhard, Ariadna Matamoros-Fernández, and Òscar Coromina. “From ranking algorithms to ‘ranking cultures’: Investigating the modulation of visibility in YouTube search results.” Convergence 24.1 (2018): 50-68.

Rodriguez, Mario. “10 Hashtag Mistakes Every Marketer Should Avoid on Instagram.” Social Captain. 30 November 2018. Accessed 9 July 2020. < https://socialcaptain.com/blog/instagram-hashtag-mistakes/ >

Rogers, Richard. Doing Digital Methods. London, California, New Delhi, Singapore: Sage, 2019.

Rogers, Richard. “Deplatforming: Following Extreme Internet Celebrities to Telegram and Alternative Social Media.” European Journal of Communication 35, no. 3 (June 2020): 213–29. doi:10.1177/0267323120922066

Briones et al. “Understanding Normiefication: A Cross-Platform Analysis of QAnon.” 2019. Digital Methods Initiative. < http://salhagen.nl/dmi19/normiefication >

Sampson, Tony D. Virality: Contagion theory in the age of networks. Minneapolis and London: University of Minnesota Press, 2012.

Sehl, K. “20 Important Tiktok Stats Marketers Need to Know in 2020” Hootsuite. Accessed 16 July 2020. https://blog.hootsuite.com/tiktok-stats/

Shehabat, Ahmad, Teodor Mitew, and Yahia Alzoubi. "Encrypted Jihad: Investigating the Role of Telegram App in Lone Wolf Attacks in the West." Journal of Strategic Security 10, no. 3 (2017): 27-53. Accessed July 16, 2020. www.jstor.org/stable/26466833.

Singh, M. “TikTok tops 2 billion downloads.” TechCrunch. April 28 2020. Accessed 16 July 2020. < https://techcrunch.com/2020/04/29/tiktok-tops-2-billion-downloads/ >

Strapagiel, L. “COVID-19 Conspiracy Theorists Have Found A New Home On TikTok.” BuzzFeed News. May 27 2020. Accessed 16 July 2020. < https://www.buzzfeednews.com/article/laurenstrapagiel/pandemic-conspiracy-theorists-disinformation-tiktok >

Telegram. 2020. 400 Million Users, 20,000 Stickers, Quizzes 2.0 and €400K for Creators of Educational Tests. Accessed July 16, 2020. <https://telegram.org/blog/400-million?ln=r>

TikTok Newsroom. “How TikTok recommends videos #ForYou.” TikTok. June 18 2020. Accessed 16 July 2020. < https://newsroom.tiktok.com/en-us/how-tiktok-recommends-videos-for-you >

Tognini, G. “Meet Zhang Yiming, The Chinese Billionaire Behind TikTok.” Forbes. November 4 2019. Accessed 16 July 2020. < https://www.forbes.com/sites/giacomotognini/2019/11/04/meet-zhang-yiming-the-chinese-billionaire-behind-tiktok/#88fe81475b3e >

Tufekci, Zeynep. “YouTube, the Great Radicalizer.” The New York Times. March 10 2018. Accessed 15 July 2020. < https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html >

Uscinski, Joseph E., and Adam M. Enders. “The Coronavirus Conspiracy Boom.” The Atlantic, April 30 2020. Accessed 11 July 2020. < https://www.theatlantic.com/health/archive/2020/04/what-can-coronavirus-tell-us-about-conspiracy-theories/610894/ >

Waghre, P., & Seth, R. (2020). Analysing Digital Platforms’ Responses to COVID-19 Information Disorder.

Wayne, Raymond. Hashtag Stories Strategy To Instagram Ads Success. United States: Estalontech, 2020.

Weimann, Gabriel, and Natalie Masri. "Research Note: Spreading Hate on TikTok." Studies in Conflict & Terrorism (2020): 1-14.

WHO.int. “Coronavirus.” 2020. World Health Organization. Accessed 15 July 2020. < https://www.who.int/health-topics/coronavirus#tab=tab_1 >

WHO.int. “Middle East respiratory syndrome coronavirus (MERS-CoV).” 2020. World Health Organization. Accessed 15 July 2020. < https://www.who.int/news-room/fact-sheets/detail/middle-east-respiratory-syndrome-coronavirus-(mers-cov) >

WHO.int. “Naming the coronavirus disease (COVID-19) and the virus that causes it.” 2020. World Health Organization. Accessed 15 July 2020. < https://www.who.int/emergencies/diseases/novel-coronavirus-2019/technical-guidance/naming-the-coronavirus-disease-(covid-2019)-and-the-virus-that-causes-it >

WHO.int. “SARS (Severe Acute Respiratory Syndrome).” 2020. World Health Organization. Accessed 15 July 2020. < https://www.who.int/ith/diseases/sars/en/ >

WHO.int. “WHO Statement regarding cluster of pneumonia cases in Wuhan, China”. 2020. World Health Organization. Accessed 15 July 2020. < https://www.who.int/china/news/detail/09-01-2020-who-statement-regarding-cluster-of-pneumonia-cases-in-wuhan-china >

Wood, Poppy. “Instagram likely to 'overtake Twitter' as a source of news.” CityAm. 16 June 2020. Accessed 9 July 2020. < https://www.cityam.com/instagram-likely-to-overtake-twitter-as-a-source-of-news/ >

Zhou, Q. (2019). Understanding User Behaviors of Creative Practice on Short Video Sharing Platforms–A Case Study of TikTok and Bilibili (Doctoral dissertation, University of Cincinnati).

Zuckerman, Ethan. "QAnon and the Emergence of the Unreal." Journal of Design and Science 6 (2019): 1-15.

Appendix 1: List of TikTok hashtags

#plandemic, #plannedemic, #billgatesisevil, #scamdemic, #arrestbillgates, #firefauci, #event201, #id2020, #plandemic2020, #coronalies, #saynotobillgates, #coronafake, #fakevirus, #stopbillgates, #qarmy, #adrenochrome, #pizzagate, #qanon, #nwo, #fearmongering, #illuminati, #coronavirus, #viruscorona, #panran, #vaccinate, #covidconspiracy, #5gtower, #protecthealthworkers, #pandemic, #panicdemic, #vaccinateurkids, #medicalfreedom, #vaccinesharm, #vaccineswork, #antivaxxers, #covid19test, #testcovid19, #covid19testing, #billgates, #stop5grollout, #stop5g, #stop5gflorida, #stop5gglobal, #freedomkeepers, #stop5gaustralia, #stop5guk, #stop5gitalia, #stop5gcalifornia, #stop5ginternational, #stop5gtowers, #stop5gespaña, #stop5gbarcelona, #stop5gusa, #stop5gcentralcoast, #stop5gpennsylvania, #stop5gtoday, #stop5ghawaii, #stop5gitaly, #stop5gworldwide, #stop5geverywhere, #governmentlies, #fuckbillgates, #informedconsent, #vaccineinjury, #markofthebeast, #parentalrights, #readtheinsert, #projectbluebeam, #mindcontrol, #cabal, #healthfreedom, #medicalrights, #wakeupsheeple, #reopenusa, #populationcontrol, #idonotconsent, #believemothers, #givegatesnochance, #medicalfreedomofchoice, #gibgateskeinechance, #widerstand2020, #protruth, #fucknwo, #attilahildmann, #medicalexemption, #coronalüge, #bodoschiffmann, #vaccinationchoice, #davidicke, #wedonotconsent, #betweenmeandmydoctor, #andrenochrome, #truther, #outofshadows, #hollyweirdisevil, #hisnamewassethrich, #georgesoros, #wherewegoonewegoall, #wearethenewsnow, #filmyourhospital, #qanonarmy, #thestormisuponus, #darktolight, #weareq, #nonewnormal, #rfidchip

Appendix 2: List of YouTube influencers and channels

Name | Channel | Subscribers
Steven Crowder | StevenCrowder | 4520000
Robert Kiyosaki | The Rich Dad Channel | 1320000
Stefan Molyneux | Stefan Molyneux | 928000
X22Report | | 800000
SGT Report | | 600000
Nathan Rich | Nathan Rich | 475000
Joseph Mercola | Mercola | 363000
Dave Hayes | prayingmedic | 358000
Tom Fitton | Judicial Watch | 352000
Steven Greer | Dr. Steven Greer | 309000
Steve Allen | Think About It | 292000
Jeff Berwick | The Dollar Vigilante | 288000
Lynnette Diamond Hardaway and Rochelle Silk Richardson | Diamond and Silk - The Viewers View | 274000
JustInformed Talk | | 266000
Dana Ashlie | Dana Ashlie | 246000
Max Igan | thecrowhouse | 235000
Jordan Sather | Destroying the Illusion | 228000
Thomas | TRUreporting | 198000
Del Bigtree | The Highwire with Del Bigtree | 189000
Liz Wheeler | Tipping Point With Liz Wheeler on OAN | 179000
Dave Janda | Operation Freedom | 156000
Adam aka ‘Marf’, Dex James | Marfoogle News | 156000
Gregory Mannarino | Gregory Mannarino | 130000
Sarah Westall | Sarah Westall | 116000
Jason Goodman | Jason Goodman | 111000
Dustin Nemos | Dustin Nemos | 109000
Linda Paris | McAllisterTV | 104000
George Webb | George Webb | 102000
Rick Rene | Blessed To Teach | 95100
Lynette Zang | TM TRADING INC. | 92200
Jim Bakker | The Jim Bakker Show | 87000
Rodney Howard-Browne | Rodney Howard-Browne | 86700
Michael Tellinger | Michael Tellinger | 80700
Mike Cernovich | Mike Cernovich | 80500
Mark Passio | Mark Passio | 77300
Dylan Wheeler | Educating Liberals | 68200
DeAnna Lorraine | DeAnna Lorraine | 65900
E. Michael Jones | E. Michael Jones | 60500
Kevin Shipp | Kevin Shipp | 52400
The Supernatural - God is NOT dead | | 52300
Tiffany FitzHenry | Tiffany FitzHenry | 48500
Chris and Sheree Geo | Beyond The Veil | 47400
Robert David Steele | Robert David Steele | 45200
Rick Joyner | MorningStar Ministries | 45100
Larry Klayman | Freedom Watch | 44300
Andrew Kaufman | Andrew Kaufman | 44300
Jerome Corsi | Jerome Corsi | 43200
Titus Frost | Titus Frost | 43000
Adam Riva | Dauntless Dialogue | 42300
Craig Sawyer | Craig Sawyer | 41800
Chandler Crump | Chandler Crump | 38900
Sacha Stone (founder) | New Earth Project | 38900
Pete Santilli | The Pete Santilli Show | 37900
James Gilliland | ECETI Stargate Official YouTube Channel | 36400
Victurus Libertas VLTV | | 35400
PortalToAscension | | 35400
John Cullen | John E Hoover | 29200
Dr. Duke Pesta | FreedomProject Media | 25100
David Whitehead | | 24900
Shepard Ambellas | Shepard Ambellas | 21000
Cyrus A. Parsa | The AI Organization | 20100
Michael Salla | Michael Salla | 20100
Ann Vandersteel | Steel Truth | 20000
Dan Duval | BRIDE Ministries International | 18800
Mr. Burgandy | | 18500
Michael Oddane | WakeUpGlobe SE | 16600
Andrew Bartzis | Galactic Historian | 16400
Lisa Harrison | lisamharrison | 15700
American Freedom Radio | | 15600
Zach Bush | ZachBushMD | 15000
Alex Newman | The New American Video | 14900
Joe Hoft | The Gateway Pundit | 11900
Ronald Bernard | Ronald Bernard | 11400
Jenny Constantine | Jenny Constantine | 10500

Appendix 3: List of Telegram Channels

Name of the channel | URL | Subscribers
Paul Joseph Watson | https://t.me/pjwnews | 23062
Sargon of Akkad | https://t.me/sargon_of_akkad | 21770
MILO | https://t.me/MiloOfficial | 18017
Nicholas J. Fuentes | https://t.me/nickjfuentes1 | 15256
Vincent James | https://t.me/RealVincentJames | 10658
LAURA LOOMER | https://t.me/loomeredofficial | 9841
Bellum Acta - Intel, Urgent News and Archives | COVID Crisis Edition | https://t.me/BellumActaNews | 9822
Gavin McInnes | https://t.me/RealGavinMcInnes | 8856
INFOWARS.COM 🚫🚫🚫 | https://t.me/infowarslive | 8770
Jack Dawkins | https://t.me/JackDawkins | 8284
David Avocado Wolfe | https://t.me/davidavocadowolfe | 7756
/pol/ news | https://t.me/politically_incorrect | 7357
Racism Inc. | https://t.me/WhiteIsRight | 6575
Police Frequency | https://t.me/police_frequency | 6542
🇨🇳 Wuhan Virus Updates 🦇 | https://t.me/wuhappening | 6373
Michelle Malkin | https://t.me/MichelleMalkin | 6332
Faith Goldy | https://t.me/FaithGoldy | 6228
GALLIA DAILY | 🇫🇷 IN 🇬🇧 | https://t.me/GalliaDaily | 5572
Nick Monroe | https://t.me/nickmon1112 | 5322
Defend Europa News | https://t.me/defendevropa | 4490
Franssen | https://t.me/stevenfranssen | 4422
Infowars Alex Jones | https://t.me/infowarsalexjones | 3622
Boogaloo Intel Drop📡 | https://t.me/boogaloointel | 3507
Nogals Redpills | https://t.me/nogals_redpills | 3379
🅱️eady-Anglos-Я-Us Propaganda Storage Facility | https://t.me/beadymanor | 3134
The White Space | https://t.me/thewhitespace | 3132
/CIG/ Telegram | Counter Intelligence Global | https://t.me/CIG_telegram | 3111
Corona Chan News COVID-19 🦠 #RentStrike | https://t.me/CoronaChanNews | 2975
There Is No Political Solution | https://t.me/TINPS | 2909
Uncle Paul | https://t.me/Uncle_Paul_GTKRWN | 2381
Tales of a Libertarian | https://t.me/toalibertarian | 2375
Survive Now! | https://t.me/SurviveNow | 2308
TruthGraphs | https://t.me/TruthGraphs | 2135
JaydaFransen.Online | https://t.me/JaydaFransen | 2133
Third Position Army | https://t.me/TheThirdPosition | 2059
The Reality Report | https://t.me/therealityreport | 2007
RedPill Video Archive | https://t.me/RedPillVideoArchive | 1964
Europe Lives | https://t.me/EuropeLives | 1954
Agents of Truth | https://t.me/agentsoftruth | 1919
On the Offensive | https://t.me/otohugh | 1564
Vinnie Sullivan | https://t.me/vinniesullivan | 1523
The Absolute State Of Britain | https://t.me/AbsoluteBritain | 1359
🍤 The Ochs Report 🦠 | https://t.me/TheOchsReport | 1298
Kang Kato's Kwarantine Korner | https://t.me/KatoMemesmith | 1238
European Tribalism - Blood & Soil, European culture, survival | https://t.me/EuropeanTribalism | 1237
Nick Griffin | https://t.me/NickGriffin | 1034
Lee Garrett Channel | https://t.me/leegarrett | 747
STOP 5G AND CORRUPTION CHANNEL | https://t.me/stop5gchannel | 682
MARK STEELE CHANNEL | https://t.me/marksteele5g | 573
