Dark Telegram

Team Members

Facilitators: Silvia Semenzin, Lucia Bainotti
Team members: Richard Rogers, Bernhard Rieder, Emilija Jokubauskaite, Matteo Banal, Nelly Marina Elizalde, Leonardo Sanna, Horacio Sivori, Yitong Tang


Summary of Key Findings

Our research aimed to develop a methodological approach to studying Telegram. Starting from a general discussion of the ethical issues involved in studying Telegram groups and channels, we delved into the analysis of public groups and channels pertaining to both the far right and the far left. We built a methodological protocol for the analysis of Telegram that will hopefully be useful for future research on messaging platforms.

Following the above-mentioned methodological protocol, we also focused on the role of Telegram in hosting extreme voices that are increasingly being banned from mainstream social networks, arguing that Telegram’s imagined affordances encourage the creation of groups and channels for the networking of extreme groups.

1. Introduction

Digital research on messaging apps has only just started to expand. While much attention has been paid to social media platforms, research has focused less on apps such as WhatsApp, Telegram or Discord. Among such messaging apps, Telegram appears to be a particularly interesting case study because of its architecture and affordances. Indeed, differently from other messaging apps such as WhatsApp, Telegram stores all information in a built-in (distributed, encrypted) cloud backup. Moreover, it allows for the creation of public channels with an unlimited number of subscribers, chat groups of up to 200,000 users, and even secret chats that provide the perception of increased privacy. According to its creator, these features make Telegram more secure than WhatsApp, and a suitable platform for building large online communities more protected from online surveillance. While this might represent an advantage for minorities and activists, Telegram has also become a preferred platform for spreading violent and illicit content (e.g. revenge porn) and coordinating extremist movements (e.g. jihadism).

So far, work on Telegram has mainly focused on jihadism. Mazzoni (2018) has conducted broad research on jihadi channels and groups, producing categorizations based on the type of distributed content, group size and aims. As for the alt-right, Mazzoni (2019) has briefly described three possible categories of channels, i.e. image channels, news channels and discussion channels. Other works focus on the role played by Telegram in amplifying the diffusion of non-consensual intimate images (Semenzin and Bainotti, forthcoming), arguing that the platform’s features can be considered gendered affordances (Schwartz and Neff, 2018), as they suggest and reproduce gendered structures and inequalities. From a methodological point of view, the above-mentioned research usually relies on qualitative methodologies, such as approaches inspired by digital ethnography (Semenzin and Bainotti, forthcoming).

Telegram’s increasing popularity, together with the peculiarity of its affordances, raises important questions that need to be further addressed. On the one hand, at a theoretical level, much work is needed to better understand how the platform’s affordances allow and shape users’ behaviours, perceptions and imaginaries. On the other hand, it is important to better understand how to study Telegram methodologically and what ethical issues such analysis implies.

In this context, the present project aims to develop a Telegram research protocol, exploring different methodological possibilities and showcasing a case study of fringe political subcultures (left and right). The project therefore asks how it is possible to study Telegram by following the medium and developing a research protocol grounded in Digital Methods. Secondly, it aims to understand how Telegram’s affordances suggest possible uses to fringe political actors. The interactive nature of the messaging platform, together with the anonymity and security promised by its system of encryption, offers an interesting perspective on how the imaginaries about the alt-right and the far-left, and the different publics revolving around them, take shape. Lastly, we address how Telegram relates to processes of deplatforming on mainstream platforms.

As mentioned before, in attempting to build a research protocol it is necessary to take into consideration some ethical issues related to the platform’s structure (Figure 1.), and specifically how to negotiate access to, and analyse, public and private channels and chats.

Figure 1.

2. Initial Data Sets

Following Mazzoni’s (2019) research on Telegram and the alt-right, we created a first sample of 20 channels for the alt-right and 30 for the far-left.

One of the challenges of the project was to create a corpus of data starting from Telegram channels and chat groups. Since we wanted to test the feasibility of entering private groups, we did not have a pre-built dataset, but created it during the group work. The data collection procedure started by negotiating access to alt-right and far-left Telegram channels and chats already known from the literature, and then following the medium.

3. Research Questions

1. How can we study Telegram following the medium and developing a research protocol grounded in Digital Methods?

2. How do Telegram’s affordances suggest different uses of the platform to fringe political actors?

3. How does Telegram relate to deplatforming on mainstream platforms?

4. Methodology

One of the main aims of the project is to elaborate a research protocol useful for the analysis of Telegram by following the medium. From a methodological point of view, we identified four main issues that needed to be addressed: how to address ethical concerns; how to delimit the object of study; how to collect data; and how to analyse it.

Given the ethical issues mentioned above, in this first phase of the analysis we focused on public channels. We approached these online public spaces through covert ethnography: we did not reveal ourselves as researchers, but simply observed users’ activities in a non-intrusive way to avoid the risk of losing or altering the object of study.

The first main issue in analysing Telegram is how to discover illicit or extremist spaces on the platform, which is not always easy given the interplay between the slippery nature of the topic itself and Telegram’s affordances. In order to delimit the object of study and consequently perform a coherent data collection, we developed two possible approaches, as Figure 2. shows:
  1. A cross-platform (or external) approach:

  • query other platforms of interest for t.me links. In so doing, it is possible to delimit the object of study and gather data by looking at entry points provided by other platforms. Given the nature of the present research (fringe political actors), we relied on 4chan (4chan/pol/) and Reddit (/r/The_Donald).
  2. An in-platform (or internal) approach:

  • create a list of topic-related Telegram channels (and/or groups), relying on a previous qualitative and in-depth analysis of the literature and on keyword searches in the Telegram search bar, following a snowball sampling logic
  • further extend the list of possible channels by using an ad hoc crawler
  • extract data by using an ad hoc Python scraper
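The crawling step above can be sketched in Python. This is a minimal illustration of the snowball logic only, assuming channel messages have already been retrieved as plain text; the regexes and channel names are our assumptions, not the team's actual crawler code:

```python
import re

# t.me links and @mentions are the two ways channels surface in messages
T_ME_LINK = re.compile(r"(?:https?://)?t\.me/([A-Za-z0-9_]+)")
AT_MENTION = re.compile(r"@([A-Za-z0-9_]{5,})")

def snowball_step(messages, seen):
    """Return channel names newly discovered in a batch of messages."""
    found = set()
    for text in messages:
        found.update(name.lower() for name in T_ME_LINK.findall(text))
        found.update(name.lower() for name in AT_MENTION.findall(text))
    found.discard("joinchat")  # invite-link prefix, not a channel name
    return found - seen

messages = [
    "check https://t.me/examplechannel for updates",
    "also on @another_channel and t.me/examplechannel",
]
print(snowball_step(messages, seen={"examplechannel"}))  # → {'another_channel'}
```

Each iteration feeds the newly found names back into the scraper until no new channels appear (or a depth limit is reached).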
We then performed different analyses according to the data obtained and the platform’s affordances, specifically:
  1. URL analysis
  2. Network and cluster analysis
  3. Automated text analysis
  4. Qualitative in-depth analysis of Telegram Channels

Figure 2.

5. Findings

5.1. URLs analysis: a cross-platform perspective

The first step of our research is based on a cross-platform (or external) approach: we started the data collection and analysis by looking at platforms of interest, in our case 4chan and Reddit, and querying them for Telegram links. Specifically, we queried t.me/$ in 4CAT for 4chan/pol and for r/The_Donald on Reddit, to ascertain linkages between the board, the subreddit and Telegram. This first map allows us to see how Telegram is used on 4chan/pol and r/The_Donald, two established venues for the diffusion of alt-right content. As shown in Figure 3., the majority of the links to Telegram were found on 4chan/pol, and very few on the subreddit /r/The_Donald. Only 4 links were shared by both.
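The link-harvesting step can be approximated in a few lines of Python. This is a sketch, assuming the 4CAT exports have been reduced to lists of post bodies; the channel names are invented for illustration:

```python
import re

T_ME = re.compile(r"https?://t\.me/\S+")

def telegram_links(posts):
    """Collect every t.me URL appearing in a list of post bodies."""
    links = set()
    for body in posts:
        links.update(T_ME.findall(body))
    return links

def channel_of(link):
    """First path segment of a t.me URL, i.e. the channel/group name."""
    return link.split("/")[3].lower()

pol_posts = ["join https://t.me/chan_a now", "see https://t.me/chan_b"]
td_posts = ["https://t.me/chan_b/123 is wild"]

# Channels linked from both venues (the overlap shown in Figure 3.)
shared = ({channel_of(l) for l in telegram_links(pol_posts)}
          & {channel_of(l) for l in telegram_links(td_posts)})
print(shared)  # → {'chan_b'}
```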

We then delved deeper by looking at each link to see the different types of content it refers to, with the aim of spotting differences and similarities between 4chan and Reddit. The links found on 4chan/pol are much more varied in their content: they lead mostly to Telegram channels, but also to groups, bots and messages. On the contrary, the links found on The_Donald are mostly directed to specific messages (Figure 4.).

Finally, we can see the degree of privacy of the content shared. The majority of links on both platforms were public (green, see Figure 5.) rather than private (red). Although Telegram is considered and depicted as encrypted and secure (and even more secure than WhatsApp), this result suggests that alt-right communities tend to use it as a public space, taking advantage of its accessibility and of the possibility to create large groups and communities and to popularise content.
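The content-type and privacy coding behind Figures 4 and 5 can be illustrated with a small classifier. The rules below are our reconstruction of Telegram's URL conventions (joinchat invite links lead to private chats; t.me/&lt;name&gt;/&lt;id&gt; points to a message), not the team's actual script:

```python
def classify(link):
    """Heuristically code a t.me link as (kind, privacy)."""
    path = link.split("t.me/", 1)[1].rstrip("/")
    parts = path.split("/")
    if parts[0] == "joinchat":            # invite links lead to private chats
        return ("group", "private")
    if len(parts) > 1 and parts[1].isdigit():
        return ("message", "public")      # t.me/<channel>/<message-id>
    if parts[0].lower().endswith("bot"):
        return ("bot", "public")
    return ("channel_or_group", "public")

print(classify("https://t.me/joinchat/AbCdEf"))  # → ('group', 'private')
print(classify("https://t.me/somechannel/42"))   # → ('message', 'public')
print(classify("https://t.me/examplebot"))       # → ('bot', 'public')
```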

Figure 3.

Figure 4.

Figure 5.

5.2. Mapping far-left and alt-right Telegram channels: an in-platform perspective

The second approach we followed for the analysis of fringe political actors on Telegram is an in-platform (or internal) one. We developed a few tools to perform the data collection and analysis of Telegram data: one to retrieve messages from Telegram channels and groups, another to crawl a snowballed list of channels to see what they link to and what other channels they mention. The lists of seed channels, for both the far-left and the alt-right, were constructed through qualitative and ethnographic research, starting from channels already known in the literature and following a snowball sampling approach. In our case study, as reported in Figure 6., crawling resulted in a network showing seeds of left and right Telegram channels (red and blue), the domains being linked (green) and the channels being mentioned (pink). Further analysis is needed to make these findings more robust, but one observation is that while the left has more diverse channels, some very big, some smaller, the right’s channels are more interconnected, linking to one another.
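A minimal sketch of how such a crawl can be turned into the typed edge list behind a network like Figure 6. The tokenisation and node-typing rules here are simplified assumptions (the real crawler is not documented in this report); the resulting edge list could be loaded into a tool such as Gephi:

```python
from urllib.parse import urlparse

def channel_edges(channel, messages):
    """Typed edges from a seed channel to linked domains and
    mentioned channels (the green and pink nodes of Figure 6)."""
    edges = []
    for text in messages:
        for token in text.split():
            if token.startswith("http"):
                domain = urlparse(token).netloc
                if domain == "t.me":
                    # a t.me link is a mention of another channel
                    edges.append((channel, token.split("/")[3], "mention"))
                else:
                    edges.append((channel, domain, "link"))
            elif token.startswith("@"):
                edges.append((channel, token.lstrip("@"), "mention"))
    return edges

edges = channel_edges("seed_chan", ["read https://example.org/a and @other_chan"])
print(edges)  # → [('seed_chan', 'example.org', 'link'), ('seed_chan', 'other_chan', 'mention')]
```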

Figure 6.

After mapping out the various channels through the crawler, we delved deeper into the content of 4 alt-right channels, chosen according to the number of subscribers: “Alt-right shitlords”; “CIGtelegram”; “HansTerrorwave”; “Multiculturalism”. The conversations posted between September 2016 and July 2019 were collected using an ad hoc tool. In this phase of the research we first aimed to understand which sources of information were used within each channel, by looking at the links from Telegram to other platforms. The results are reported in Figure 7. As we can see, Twitter (699 mentions) is the most used content provider, followed by YouTube (453). This result shows that the main sources of information used by these Telegram communities are so-called “mainstream platforms”, while what we called “fringe platforms” are mentioned less often (4chan/pol/, 70; BitChute, 5; Gab.ai, 4). Archive.(various) was the third most used content provider (435), which may indicate that content in these communities is quite ephemeral.
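The content-provider ranking of Figure 7 amounts to counting link domains. A hedged sketch follows; the normalisation rules, including collapsing the many archive.* mirrors into one provider, are our assumptions:

```python
from collections import Counter
from urllib.parse import urlparse

def provider_counts(urls):
    """Rank content providers by linked domain, as in Figure 7."""
    def norm(domain):
        domain = domain.lower()
        if domain.startswith("www."):
            domain = domain[4:]
        # Collapse the many archive.* mirrors into one provider
        return "archive.(various)" if domain.startswith("archive.") else domain
    return Counter(norm(urlparse(u).netloc) for u in urls)

counts = provider_counts([
    "https://twitter.com/some_user/status/1",
    "https://www.youtube.com/watch?v=abc",
    "https://archive.is/xyz",
    "https://twitter.com/other_user",
])
print(counts.most_common(1))  # → [('twitter.com', 2)]
```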

Figure 7.

5.3. Alt-Right spaces and their characteristics: is Telegram a safe space?

A last (but not least) step in analysing dark Telegram is text analysis. At this stage of the research, we performed an exploratory analysis of the content of Telegram channels, on both the quantitative and the qualitative side. In this part of the analysis we focused again on four of the most important alt-right channels (“Alt-right shitlords”; “CIGtelegram”; “HansTerrorwave”; “Multiculturalism”). We first performed an automated text analysis. Specifically, a Word2Vec model was computed for an exploratory analysis of the semantic space of the Telegram channels, in order to understand how language is used in these spaces. As an example, Figure 8. shows how the word “nigger” is used. As we can see, one interesting result is that such a word is not used for hate speech purposes.

Figure 8.

Jason Davies word trees have been used for an exploratory qualitative analysis. In this case we can see how Facebook is mentioned in these channels. Firstly, there is a minority of links related to content (mostly alt-right related). Secondly, in the vast majority of cases Facebook is heavily criticized and portrayed as a platform that does not grant freedom of speech and, more generally, as working against far-right interests (Figure 9.).
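Jason Davies' word tree is an interactive visualisation, but its raw input is essentially keyword-in-context lines, which can be extracted with a few lines of Python. The messages below are invented examples mirroring the kind of Facebook criticism found in the channels:

```python
def kwic(messages, keyword, window=3):
    """Keyword-in-context lines: the raw material a word tree visualises."""
    lines = []
    kw = keyword.lower()
    for text in messages:
        tokens = text.split()
        for i, tok in enumerate(tokens):
            if tok.lower().strip(".,!?") == kw:
                left = " ".join(tokens[max(0, i - window):i])
                right = " ".join(tokens[i + 1:i + 1 + window])
                lines.append((left, tok, right))
    return lines

msgs = ["They banned him from Facebook again",
        "Facebook does not grant free speech"]
for left, hit, right in kwic(msgs, "facebook"):
    print(f"{left} [{hit}] {right}")
```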

Figure 9.

As a second step, we performed an in-depth qualitative analysis aimed at understanding the main themes discussed in the four channels under investigation. In this phase, specific attention was paid to Telegram’s features as imagined affordances (Nagy, Neff, 2015), in order to understand whether and to what extent Telegram is conceived as a “safe place” by its users. Some of the main themes emerging from the analysis are the following (see Figure 10.):

  • Telegram is indeed considered a “safe place” by users. The sense of anonymity and security provided by the platform’s affordances (as well as by the narrations revolving around such affordances) plays an important role in shaping users’ participation in Telegram groups and channels. Specifically, users feel freer to express their content without restrictions, and consider Telegram a safer platform compared to mainstream ones.

  • Telegram seems to be an answer to the problem of deplatforming. Deplatforming processes are deployed as a means to construct a cohesive community of users around new and different platforms such as Telegram.

  • Given the risks of deplatforming, Telegram is also described as a “backup platform”.

Figure 10.

5.4. Deplatforming: Following extremists to an alternative social media ecosystem (by Richard Rogers)

Deplatforming, or the removal of one’s account on social media for breaking platform rules, has recently been on the rise, gaining attention as an antidote to the so-called toxicity of online communities and the mainstreaming of extreme speech, but also stirring a discussion about the ‘liberal bias’ of US tech giants implementing the bans (Bilton, 2019). In the past few years, Facebook, Instagram, YouTube, Twitter and other platforms have all suspended and removed a variety of individuals and groups, comprising, according to one accounting, ‘white nationalists’, ‘anti-semites’, ‘alt-right’ adherents, ‘neo-nazis’, ‘hate groups’ and others (Kraus, 2018). Many of those who have been deplatformed are on the far right of the ideological spectrum, and certain of them could be described as social media celebrities, such as Milo Yiannopoulos and Alex Jones, whose removals have had a significant impact on their visibility, the maintenance of their fan bases and the flow of their income streams. Yiannopoulos has claimed to have been bankrupted by deplatforming (Beauchamp, 2018). Jones has seen the view counts and seemingly the impact of his posts decline.

Deplatformings have been widely reported in the tech news and beyond (Martineau, 2019). When Yiannopoulos, Jones, Laura Loomer and Paul Joseph Watson were removed from Facebook and Instagram in 2019 for being ‘dangerous individuals’ engaged or involved in ‘organised hate’ and/or ‘organized violence’ (Facebook, 2019), it drew widespread reaction, including the story of how Facebook announced the ban some hours prior to its implementation, allowing the deplatformed to post notices on their pages, redirecting their audience to other platforms (Martineau, 2019). Laura Loomer, for one, announced her Telegram channel; Alex Jones pointed to his websites. The migration from mainstream to alternative social media platforms was underway.

At the same time, protests from these individuals and their followers have been staged: Loomer, the ‘white nationalist’ also banned from Twitter for a ‘racist attack’ on a Muslim U.S. congresswoman, handcuffed herself to the front door of the corporation’s office in New York City, livestreaming her plight and her views on the suppression of ‘conservative’ viewpoints on a supporter’s Periscope account (itself a Twitter service). Having been banned, other users have switched to platforms friendly to their politics, such as Gab, a Twitter alternative (which upvotes and downvotes posts like Reddit). It has become known as a ‘haven for white supremacists’ and for its defense of free speech (Zannettou et al., 2018; Ohlheiser and Shapira, 2018). It also positions itself as distinct from the “left-leaning Big Social monopoly” (Coaston, 2018).

When deplatformed social media celebrities migrate to alternative platforms, these sites are given a boost through media attention and increases in user counts. Milo Yiannopoulos initially turned to Gab after his account was removed on Twitter (Benson, 2016). Around the same time Alex Jones joined it “with great fanfare” (Ohlheiser, 2016). Indeed, when Twitter conducted a so-called ‘purge’ of extremists in 2016, Gab gained tens of thousands of users in a short time. It is continually described as a favored platform of expression for extremists, including the shooter in the Pittsburgh synagogue in 2018 who announced his intended acts there (Nguyen, 2018). Morbidly, such events immediately drive traffic to the platform. Gab drew over a million hits, after it became known that a mass shooter posted his manifesto there (Coasten, 2018).

Mainstream social media, however, drives more traffic to extremist content than other web sources or alternative social media platforms, at least in the case when Alex Jones was banned from Facebook and YouTube, as mentioned above. His InfoWars posts, now only available on his websites (and a sprinkling of alternative social media platforms, as we come to), saw a decline in traffic by one half (Nicas, 2018).

Effectiveness of deplatforming

After the shooting, Gab itself was ‘deplatformed’ from the web, removed from domain service, GoDaddy, its hosting provider, Joyent, as well as payment processors PayPal and Stripe, but it returned shortly thereafter, finding alt-right-friendly alternatives (Martineau, 2018). Epik, the company that now hosts Gab.com’s domain, wrote a blog post defending its decision, with the subtitle, ‘deplatforming is digital censorship’ (Epik, 2018). Cloudflare, the web infrastructure and security company (that protects against denial of service attacks) also continued to ‘protect’ Gab, with its argument that no web infrastructure company should have such great editorial power. After the mass shooting in El Paso in 2019, Cloudflare, however, would remove its protection of 8chan, where the shooter posted his manifesto explaining his acts. The company wrote, “in taking this action we’ve solved our own problem, but we haven’t solved the Internet’s” (Kelly, 2019).

There has been some scholarly attention paid to the effectiveness of shutting down particularly offensive online communities, such as the subreddits r/fatpeoplehate and r/coontown, banned by Reddit in 2015 for violating its harassment policies. It was found that the shutdowns worked, in that a proportion of offending users appeared to leave the platform (for Voat, an alternative to Reddit), and the subreddits that inherited the migrating users did not see a significant increase in extreme speech (Chandrasekharan et al., 2017). Indeed, the closing of those communities was beneficial for Reddit, but less research has been performed on the effectiveness of such bans for the health of social media or the internet at large.

Similar to Cloudflare’s viewpoint, the Reddit study’s authors reported that not only did Reddit make these users “someone else’s problem”, but also perhaps pushed them to “darker corners of the internet” (Chandrasekharan et al., 2017).

Telegram as ‘dark corner of the internet’

Apart from Gab and perhaps Voat (to which deplatformed Pizzagate, incel and QAnon subreddit users have migrated), Telegram is another of those dark corners (Wikipedia, 2019). It is an instant messaging app, founded in 2013 by the same internet entrepreneurs who launched VKontakte, the social media platform popular in Russia. Telegram has a reputation, whether or not well-founded, for highly secure messaging, having notoriously been listed by ISIS as ‘safe’ and having championed privacy from its founding, which coincided with the US state spying revelations by Edward Snowden. Indeed, the founders started Telegram so communications could not be monitored by governments, including the Russian authorities, who pursued the founder until he fled the country (Cook, 2018). The Russian state later accused Telegram of enabling terrorists because it would not turn over users’ encrypted messages, leading to a ban of the application in Russia. The founders and their programming team are themselves exemplary privacy-enablers, for they require secure communication, and have moved from location to location to elude what the founder calls ‘unnecessary influence’ (Thornhill, 2015).

‘Private sociality’

How does Telegram appeal to its users, including those who have been deplatformed for violating speech rules? Telegram not only has the reputation but also the affordances that would be attractive to those seeking something similar to ‘social privacy’, or the capacity to retain control over what is known about oneself whilst still participating (and becoming popular) on a social media platform (Raynes–Goldie, 2010). On social media platforms such as Facebook, the user is public-facing at the outset, and subsequently makes deft use of aliases, privacy settings as well as account and timeline grooming. That is how social privacy is performed.

Telegram, however, is something of a hybrid system, and in contradistinction to Facebook, it leads with protected messaging, and follows with the social. That is, it is in the first place a messaging app (and VOIP service), where one has an account, and can message others and join groups, first private ones (by default) but also public ones. It also has some elements of social media, where one may create a channel (public by default) and have others subscribe to it.

The apt deployment of Telegram would seem to conceptually invert social privacy. The app first offers protected communication, appealing to a private user, rather than to the public-facing user, seeking publicity (De Zeeuw and Tuters, forthcoming). ‘Private sociality’ may be a term that captures operating in private chats and private group chats both for the private user (not necessarily seeking publicity) as well as the masked user (who may seek attention). One can operate in private mode and still participate. For example, on Telegram, to join a group or subscribe to a channel, the user need not enter a telephone number.

Second, apart from only private spaces in which to organize, recruit, chat and so forth, the Telegram user may still seek publicity. Here the use of a channel is significant for it allows building a following. Similar to YouTube, one can broadcast to subscribers.

Telegram thereby could be said to reconcile dual desires of protection and publicity by offering private messaging and broadcasting. It thereby appears to go some way towards resolving the ‘online extremists’ dilemma’, a variation on the ‘terrorist’s dilemma’, which concerns balancing “operational security and public outreach” (Shapiro, 2013; Clifford and Powell, 2019).

Telegram, as mentioned above, has been associated with terrorism. In a description of its use in this context, the features of the hybrid application become clearer: “Telegram’s public-facing ‘channels’ and private messaging ‘chats’ make it a ‘dual-use’ weapon for extremist groups like ISIS, al-Qaeda and Hamas” (Counter-extremism project, 2017). The groups broadcast to followers on channels and recruit and organize through one-to-one or small group chats.

The Deplatformed and Telegram

For the deplatformed, Telegram’s reputation may be appealing. It affords ‘protected speech’ by being permissive of extreme speech. It also offers to back up channels and group content, thereby allaying threats of deletion, a key concern.

Telegram also offers means to build a following, and broadcast to large numbers of users (as on YouTube and Twitter). Like other messaging apps including WhatsApp, Telegram has groups, though it does not limit their size as much. Groups can have up to 200,000 users (compared to 256 for WhatsApp), and channels can have an unlimited number of subscribers. Telegram also enables large clusters of groups, in that one can forward a message to an unlimited number of groups, as opposed to the smaller number on WhatsApp, a restriction that received attention in the wake of misinformation campaigns around elections in India (and Brazil). (On WhatsApp India has a special limit of five forwards, compared to twenty globally.)

In all, Telegram can compete for the deplatformed users by offering the kinds of features sought by those seeking protected speech as well as a following.

This study empirically examines the extent to which the platform is considered and used as suggested. It examines the content of a select number of channels self-identified as alt right, asking how the platform is discussed (or imagined) as well as used as a ‘protected space’ as well as a ‘publicity space’. To what extent is Telegram employed for protection and/or for publicity broadcasting, so to speak?

It also generally discusses the effects of deplatforming for extremists, mainstream social media platforms as well as researchers. As mentioned, Telegram has affordances that appeal to the deplatformed seeking protected speech as well as publicity or a following. But from the point of view of (extreme) Telegram users, is Telegram an effective alternative to mainstream platforms? Which mainstream social media platforms are still relevant? Even if they deplatform users, which ones remain vital and which fade from interest? This is one way to address the question of the effects of deplatforming for single as well as multiple (mainstream) social media platforms.

Deplatforming effects and researcher migration

It was found that Facebook and Instagram are routinely critiqued as sites that do not grant freedom of speech and whose use is generally not in the interests of extreme actors, whereas Twitter and YouTube remain of interest, in the sense that their content is linked to (rather than the platforms being discussed only in critical terms).

This study also examines Telegram as a destination for the deplatformed, asking what it has to offer not only to extremists, but also to researchers, who in a rather different sense are also being deplatformed, having seen their access to data from mainstream platforms diminished or removed (Bruns et al., 2018).

For researchers, Telegram is not ‘locked’. Apart from certain rate limiting, the platform allows widespread probing. The extreme voices may thus be studied, where questions may be posed concerning (at least) two widespread views that have been circulated about the effects of channeling extreme voices into an alternative set of platforms friendly to them. Does the content become only more and more extreme? Is it contained within these spaces, rather than mainstreaming? There is also the question of the thinning of audiences, both in gross terms but also over time.

The channels under study, it was found, are also gateways to an alternative social media ecosystem, comprised of alternatives to each type of mainstream social media platform – from social networking, video uploading, content rating, payment processing and more. The question of the robustness and longevity of the ecosystem would go some way towards answering the question of whether the extreme voices shrivel or thrive despite deplatforming.

6. Conclusion

By analysing Telegram, we tried to develop a new research protocol and provided a case study to exemplify the different steps we attempted. We wanted to test whether it was possible to use a digital methods approach to study Telegram, and we found that different analyses, both qualitative and quantitative, can be applied to the platform. We discovered that the main advantage of using a digital methods approach for Telegram is that it makes it possible to reap the benefits of a cross-platform analysis when delimiting the object of study and collecting data. In a later stage, we asked what behaviours are allowed by the platform’s affordances and how these might affect users’ activity. We found that, as expected, there are signs that users perceive the affordances as allowing them to feel safer and freer to express their content, which we described through the lens of ‘imagined affordances’ (Nagy, Neff, 2015). Lastly, focusing on our case study, we also found that the alt-right seems to use Telegram for networking more than the left does.

For future research on Telegram, some of these questions might be addressed:
  • Different types of data open up different types of analysis: how can the analysis be extended to visual content such as memes and GIFs?

  • How can we make sense of still-overlooked Telegram features, such as secret chats and self-destructing messages? How do they shape users’ imagined affordances?

7. References

  • Bruns, Axel, Anja Bechmann, Jean Burgess, Andrew Chadwick, Lynn Schofield Clark, William H Dutton, Charles M Ess, Anatoliy Gruzd, Susan Halford … (2018) ‘Facebook shuts the gate after the horse has bolted, and hurts real research in the process’, Internet Policy Review, 25 April.

  • Beauchamp, Zack (2018) ‘Milo Yiannopoulos’s collapse shows that no-platforming can work’, Vox, 5 December.

  • Benson, Thor (2016) ‘Inside the “Twitter for racists”: Gab – the site where Milo Yiannopoulos goes to troll now’, Salon, 5 November.

  • Bilton, Nick (2019) ‘The Downfall of Alex Jones Shows how the Internet can be Saved’, Vanity Fair, 5 April.

  • Chandrasekharan, Eshwar, Umashanthi Pavalanathan, Anirudh Srinivasan, Adam Glynn, Jacob Eisenstein and Eric Gilbert (2017) ‘You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech’, Proceedings of the ACM on Human-Computer Interaction, 1(2), art. 31.

  • Clifford, Bennett and Helen Christy Powell (2019) ‘De-platforming and the Online Extremist’s Dilemma’, Lawfare blog, 6 June, https://www.lawfareblog.com/de-platforming-and-online-extremists-dilemma.

  • Coaston, Jane (2018) ‘Gab, the social media platform favored by the alleged Pittsburgh shooter, explained’, Vox, 29 October.

  • Cook, James (2018) ‘The incredible life of Pavel Durov — “Russia’s Mark Zuckerberg” who is raising $2 billion for his messaging app’, Business Insider, 8 February.

  • Counter-extremism project (2017) ‘Terrorists on Telegram’, New York: Counter-extremism project, May, https://www.counterextremism.com/sites/default/files/Terrorists%20on%20Telegram_052417.pdf

  • De Zeeuw, Daniël and Marc Tuters (forthcoming) ‘Teh Internet is Serious Business: On the Deep Vernacular Web and its Discontents’, Cultural Politics, forthcoming.

  • Epik (2018) ‘Why Epik welcomed Gab.com’, Epik blog, Sammamish, WA: Epik.

  • Facebook (2019) Community Standards, Dangerous Individuals and Organizations, https://www.facebook.com/communitystandards/dangerous_individuals_organizations, Facebook: Menlo Park.

  • Kelly, Makena (2019) ‘Cloudflare to revoke 8chan’s service, opening the fringe website up for DDoS attacks’, The Verge, 4 August.

  • Kraus, Rachel (2018) ‘2018 was the year we (sort of) cleaned up the internet’, Mashable, 26 December.

  • Martineau, Paris (2018) ‘How Gab, the Right-wing Social Media Site, Got Back Online’, Wired, 11 May.

  • Martineau, Paris (2019) ‘Facebook Bans Alex Jones, Other Extremists - But not as Planned’, Wired, 2 May.

  • Mazzoni, V. (2019) ‘Far right extremism on Telegram’.

  • Mazzoni, V. (2018) ‘Exploring the jihadi Telegram world: a brief overview’.

  • Nguyen, Tina (2018) ‘Gab’s Demise is just the Beginning of a Horrific New Era of Far-right Extremism’, Vanity Fair, 29 October.

  • Nicas, Jack (2018) ‘Alex Jones Said Bans Would Strengthen Him. He Was Wrong’, New York Times, 4 September.

  • Ohlheiser, Abby (2016) ‘Banned from Twitter? This site promises you can say whatever you want’, Washington Post, 29 November.

  • Ohlheiser, Abby and Ian Shapira (2018) ‘Gab, the white supremacist sanctuary linked to the Pittsburgh suspect, goes offline (for now)’, Washington Post, 29 October.

  • Raynes–Goldie, Kate (2010) ‘Aliases, creeping, and wall cleaning: Understanding privacy in the age of Facebook’, First Monday, 15(1-4), January.

  • Shapiro, Jacob N. (2013) The Terrorist’s Dilemma: Managing Violent Covert Organizations, Princeton: Princeton University Press.

  • Shehabat, Ahmad, Teodor Mitew and Yahia Alzoubi (2017) ‘Encrypted Jihad: Investigating the Role of Telegram App in Lone Wolf Attacks in the West’, Journal of Strategic Security, 10(3): 27–53.

  • Yayla, A. and Speckhard, A. (2017) ‘Telegram: The Mighty Application that ISIS Loves’.

  • Thornhill, John (2015) ‘Lunch with the FT: Pavel Durov’, Financial Times, 3 July.

  • Wikipedia contributors (2019) ‘Voat’, Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Voat&oldid=906229670, 16 August.

  • Zannettou, Savvas, Haewoon Kwak, Emiliano De Cristofaro, Gianluca Stringhini, Barry Bradlyn, Michael Sirivianos and Jeremy Blackburn (2018) ‘What is Gab? A Bastion of Free Speech or an Alt-Right Echo Chamber?’ WWW ‘18 Conference Companion, New York: ACM, https://doi.org/10.1145/3184558. 3191531.

Topic revision: r2 - 03 Sep 2019, LuciaBainotti