Within the Terms and Conditions

What is allowed to be online, stay online and spread online? And who decides what is harmful, and when that content should be removed? This expanded documentary project by Caroline Sinders, commissioned by The Photographers’ Gallery in 2021, explores harmful content, conspiracy theories, and hate videos on YouTube. The project features 91 videos across 9 different categories, uploaded from as early as 2011 and as recently as March 2021. These videos were scraped from 1 January to 31 March, 2021, and were revisited on 10 May, 2021 to see what had been removed. Of the 91 videos, 8 have been removed; their content was primarily anti-vaccine or white supremacist in nature. Some of the removed videos had been on YouTube since as early as 2017. All of the categories in this project have relevance and ties to 2021, like COVID-19 conspiracy theories or extremist content, including older videos resurfacing and gaining popularity to support conspiracy theories about Bill Gates, or harassment campaigns.

there's a cacophony of sound
loading viral videos
a dogmatic rhetoric designed to
soothe the most privileged.
name it for what it is:
covid conspiracy theories
beliefs
ideas
truisms
as if all of our statements are fact

Category: Conspiracy Theories on COVID-19

7 videos uploaded from 2011 to 2020. One of the videos was originally posted in English, but it was removed and then reuploaded with the title changed to be in Spanish. None of these videos have been removed.

name it for what it is:
Anti-Vax
"do your research!"
an empty rallying cry from
someone in their room, on a
phone, recording (conspiracy)
theories to justify their fears.

Category: Anti-Vax

5 videos uploaded from 2020 to 2021. 3 videos have now been removed.

this weaponisation of 'fear' renames
and whitewashes white supremacy
for a more palatable term.
name it for what it is:
anti, anti, anti: anti big tech,
anti islam, and anti immigration

Category: Anti, Anti, Anti: Anti Big-Technology, Anti-Islam, Anti-Immigration, Anti-Corona

11 videos uploaded from 2018 to 2021. None of these videos have been removed.

name it for what it is:
conservative conspiracy theories

Category: Conservative Conspiracy Theories

6 videos uploaded from 2019 to 2021. None of these videos have been removed.

name it for what it is:
power and privilege
name it for what it is:
misinformation

Category: General Misinformation

7 videos uploaded from 2020 to 2021. 2 videos are now private and 1 video has been removed.

name it for what it is:
white supremacy
name it for what it is:
power and privilege
not all harm is the same

Category: White Supremacy

11 videos uploaded from 2017 to 2021. 1 video is now private and 3 videos have been removed.

behind          beyond truth.
within the terms and conditions,
name it for what it is:
racialised disinformation
and while the platform's policies
account for free speech,
how       does it
account for harm?

Category: Racialised Disinformation

9 videos uploaded from 2020 to 2021. None of these videos have been removed.

name it for what it is:
gender based violence and harassment
who decides what is harmful?

Category: Gender Based Violence and Harassment

20 videos uploaded from 2014 to 2021. 1 video has been removed.

name it for what it is:
TERFs and anti-trans rhetoric
you are what you watch.
watch what they say.

Category: TERFs and anti-trans rhetoric

15 videos uploaded from 2016 to 2021. None have been removed.

‘Within the Terms and Conditions’ is part of a new form of practice of mine that I describe as ‘expanded documentary.’ I’m both an artist and a researcher; I’ve been researching online harassment, digital violence, and human rights in technology since 2013, and I’ve been making art for even longer. I’m interested in the ephemera of people, current events and communities, and this includes digital content. Memes, screen grabs, and ripped YouTube videos may seem like unusual subjects for photojournalism and contemporary art, partly because the Web is already such a flat and image-focused space. However, this work lends itself well to expanded documentary; it’s a bit like taking a camera into the different corners of the Internet and documenting what is there. As an artist, I take the web seriously as a place where people live and events unfold, because online life is real life, too. The phrase “it’s just the Internet, it’s not real life” minimises digital harm, but it is also a phrase I heard repeated many times in my research, from online harassment videos to reactionary videos posted by white supremacists after the failed coup at the United States Capitol on 6 January, 2021.

Contributors and Thanks
Artist and Researcher: Caroline Sinders 
Video Fabrication Support: Róisín Berg 
Web Development: Black Shuck Cooperative
And many thanks to The Photographers’ Gallery curators who helped make this project a reality: Jon Uriarte, Sam Mercer and Ioanna Zouli, as well as the many researchers, journalists and activists working to study and document misinformation, disinformation, hate speech and extremism.

Turtles All The Way Down

From January to March 2021, I watched hundreds of hours of video on YouTube and selected 91 videos to categorise for ‘Within the Terms and Conditions’, commissioned by The Photographers’ Gallery. There are nine different categories of harm: conspiracy theories on COVID-19; anti-vax; ‘anti anti anti’: videos on anti-immigration, anti-Islam, anti-COVID-19, and anti-big technology; general conservative conspiracy theories; general misinformation; the far-right and white supremacy; racialised disinformation; gender based violence and harassment; and TERFs and anti-trans rhetoric.

These nine categories all reflect conversations, topics and concerns I saw expressed in 2020 and 2021, whether in response to the media, breaking news, or other global phenomena.

YouTube, Facebook, Instagram, Twitter, Discord and other major platforms have a problem with hate speech, misinformation, disinformation, conspiracy theories, and other harmful content, including harassment campaigns and targeted harassment videos against individuals. And these problems have been persistent. Since as early as 2019, YouTube has said that it has been removing white supremacist content. However, a study conducted by the Anti-Defamation League found that YouTube’s recommendation algorithms are still pushing and sharing white supremacist content; the ADL’s study found that 9% of participants viewed content from an extremist channel. Other platforms have had issues with extremist content and communities as well. NPR reported in April 2021 that, over the past year, Discord had removed over 2,000 white supremacist servers. But far-right groups have been on Discord for years; the journalist collective Unicorn Riot has been infiltrating neo-Nazi and white supremacist Discord servers since 2016. Discord has made announcements since 2018 about removing white supremacist servers, and yet the servers and content keep being created and uploaded.

Name It For What It Is: Harm. Name It For What It Is: White Supremacy
The varying kinds of harm documented in the project are not going away; they are permeating and bleeding across all platforms. Showing the muck, the harm, and the violence of platforms, and the content created and shared on them, may be a strange choice for an art piece. However, the messiness, the harm, the algorithms and their outputs, including the downstream effects those platforms have offline in society, should be documented in some way. They should be critiqued and commented on, but also archived and acknowledged. These videos, and their ability to be created and shared, have been very much part of the media landscape since the mid-2000s, and will most likely continue to exist for the next few decades.

Methodology
All of the videos in ‘Within the Terms and Conditions’ were scraped from YouTube from 1 January to 31 March, 2021. Some videos were uploaded as far back as 2011, like a video with Serbian subtitles hypothesising medical conspiracy theories about Bill Gates, while others were as new as March 2021.
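The essay does not specify the tooling behind this scraping; as a rough illustration only, here is a minimal sketch of how a single video and its metadata might be archived with the open-source yt-dlp Python library. The library choice, output paths and placeholder video ID are my assumptions, not the project’s actual pipeline.

```python
# A rough sketch, not the project's actual pipeline: archiving one YouTube
# video plus its metadata (title, upload date, channel) with yt-dlp.
from yt_dlp import YoutubeDL

ydl_opts = {
    "outtmpl": "archive/%(id)s.%(ext)s",  # save files under the video ID
    "writeinfojson": True,                # also keep a .info.json metadata file
}

with YoutubeDL(ydl_opts) as ydl:
    # "VIDEO_ID" is a placeholder, not one of the 91 videos in the project.
    info = ydl.extract_info("https://www.youtube.com/watch?v=VIDEO_ID", download=True)
    print(info["title"], info.get("upload_date"))
```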

At the beginning of the project, I would occasionally check the URLs to see if videos had been removed. In late January, I noticed that two videos uploaded by a German doctor denying COVID-19 had been removed but then reuploaded with Spanish and German titles. His videos and titles are usually in English, which is, I imagine, why they spread further and are more visible to the algorithm and to wider audiences. I checked all of the project’s URLs one final time on 10 May, 2021, and saw that 8 videos had been removed: 3 were white supremacy videos from one extremist creator who had been uploading videos since as early as 2017, and the other 5 were all anti-vax videos.
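The essay does not say how these checks were made; as a hedged sketch, one way to re-check a list of video URLs is YouTube’s public oEmbed endpoint, which returns HTTP 200 for videos that are still up and an error status for videos that have been removed or made private. The placeholder URLs below are my own, not the project’s.

```python
# A minimal sketch for re-checking whether videos are still available,
# using YouTube's public oEmbed endpoint. Exact error codes for removed
# vs. private videos can vary, so anything non-200 is treated as "gone".
import requests

urls = [
    "https://www.youtube.com/watch?v=VIDEO_ID_1",  # placeholder IDs
    "https://www.youtube.com/watch?v=VIDEO_ID_2",
]

for url in urls:
    resp = requests.get(
        "https://www.youtube.com/oembed",
        params={"url": url, "format": "json"},
        timeout=10,
    )
    status = "still up" if resp.status_code == 200 else f"unavailable ({resp.status_code})"
    print(f"{url}: {status}")
```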

All of the themes come from different academic articles and activist reporting I read about YouTube; for example, recent research by Harvard researchers Joan Donovan and Brandi Collins-Dexter on racialised disinformation after the publication of the 1619 Project became a category in the project. To find COVID-19 conspiracy theories, I followed the reporting of BuzzFeed and Craig Silverman, but then switched to keyword searching, using phrases I heard over and over again in the videos I had watched. Other information came from academic white papers on misinformation campaigns, and from reporting by journalist collectives such as Bellingcat and Unicorn Riot. Some categories are tightly linked together, while others, such as ‘general misinformation’ and ‘anti, anti, anti’, are broader. The ‘anti, anti, anti’ videos all share a common rhetoric echoed by conservative politicians and pundits in the UK, the US and Germany.

My methodology was loose but structured, since my role is not that of a content moderator but of a researcher on harmful and problematic content. In deciding what is harmful, I could cast a wide net, including videos that are harmful but not necessarily in breach of YouTube’s policies, as well as those that clearly are. However, I did not include videos that cover a harmful subject in order to unpack or critique it. Instead, I was looking for videos pushing ideological hate frameworks, harassment, extremist content, misinformation, disinformation, and transphobia. To determine each video’s intent and content, I had to watch it in full before making a decision. In the end, I included only 91 videos in the project, but I estimate I watched almost 150 videos to find the final 91.

Some subjects unfolded around me, almost by happenstance, while I was researching and creating a particular category. In March 2021, at the same time that I was watching videos exploring anti-trans sentiment and ‘gender critical’ theory, a controversy was unfolding in the US media over the newsletter platform Substack. These videos approached trans identity as a subject open to debate or ridicule, or as an identity that should not exist at all. TERFs and gender critical theory are harmful in nature because they remove the humanity of trans people or deny their identity altogether. Many anti-trans writers who had been deplatformed from spaces like Twitter, such as Graham Linehan, were migrating to Substack. In real time, I could see conversations unfolding across many different platforms. This kind of immediacy and fluidity of topics across platforms illustrates not only the position that YouTube holds in the circulation of online content, but also the power relations between the platforms that make this content visible and accessible.

It's Just the Internet, It's Not Real
Documenting anything about a platform and its content removal means dealing in varying levels of opacity, unanswered questions, and hypotheses to understand how the platform functions and why. Becca Lewis, one of the premier researchers on YouTube, describes this secretive platform behaviour best in a tweet:

[Image: Lewis's tweet describing how difficult the problem of extremism is on YouTube.]

YouTube’s lack of transparency, its creators, its community and its design are things Lewis has been researching since 2017. The decision-making of major platforms around moderating content is often kept hidden and extremely opaque, as is how their algorithms and feeds suggest and share content. Only recently has there been insight into how YouTube’s recommendation algorithms work. YouTube’s former recommendation-algorithm engineer, Guillaume Chaslot, left Google and blew the whistle on the company, explaining that, to keep more people watching and clicking, the algorithm starts to recommend more and more extremist content. According to Lewis,

“YouTube could remove its recommendation algorithm entirely tomorrow and it would still be one of the largest sources of far-right propaganda and radicalisation online. The actual dynamics of propaganda on the platform are messier and more complicated than a single headline or technological feature can convey — and show how the problems are baked deeply into YouTube’s entire platform and business model.”

Effectively, it’s turtles all the way down.

I saw the opacity Lewis references firsthand. In creating this project, it’s as though I was training my own YouTube recommendation algorithm while also exposing its inner workings.

Conducting this project by analysing certain trends during a global pandemic meant I saw in real time how YouTube responded to harm, and how it could be spurred into action. However, it also meant I saw how much the platform itself could break and what kinds of content slip through the cracks. In the wake of YouTube banning, removing and suppressing 5G conspiracy content, I couldn’t find any new videos earnestly created on 5G that spread misinformation or disinformation, but I did find a large number of videos created by influencers and creators that try to unpack the conspiracy theory itself, with varying degrees of accuracy. As an artistic excavator, I decided to move on from the topic of 5G, making note of the phenomenon while reminding myself of the central thesis of the project: finding harmful videos online that fit into the grey areas of YouTube’s terms of service.

White Supremacy and Parasocial Relationships
It’s important to emphasise that YouTube is not just an entertainment platform but also a social network, and a space for creators. Unlike Netflix or Hulu, where content is uploaded and very explicitly curated by the company, YouTube is entertainment combined with social media, underpinned by the idea that any person, anywhere, can be a creator. It’s designed for making and intaking, for watching, sharing, commenting and creating. Lewis describes this as a parasocial relationship, and discusses how the issue isn’t YouTube’s algorithm but how the algorithm, the creators, and fans all interact. She also writes that there are other things to consider, on top of the recommendation algorithm, like,

“[the] other algorithms — the search algorithm and the algorithm that puts content on the home page, for example. None of these technical features exist in a vacuum — influencers explicitly work to maximise their visibility, and far-right influencers have been particularly effective at optimisation strategies to appear highly in search results for political terms, for example, using keywords and tagging features.”

On top of the intertwined systems and nodes of YouTube, with its multiple algorithms, video creators, fans, comments and watchers, there are also YouTube’s own policies. And YouTube is often not unaware of what is happening. Experts, activists, researchers and academics have been tracking disinformation campaigns, misinformation peddlers and many different white supremacists on YouTube for years. For example, Lewis has repeatedly called for racist and conservative commentator Steven Crowder to be removed, and the UK-based organisation Hope Not Hate repeatedly finds extremists online and calls for them to be removed. In an interview over Zoom, Joe Mulhall, a senior researcher with Hope Not Hate, mentioned that this comes down to how YouTube approaches regulation and moderation. As he described it, in YouTube’s moderation someone can be a neo-Nazi ‘offline’ or on other platforms, but unless harm can be proved on YouTube itself, that person may not be removed. Mulhall further explained that,

“[this is an] outdated way of looking at content moderation, but YouTube refuses to budge on it. So you often get anomalies on YouTube where individuals just sit within the terms and conditions of the platform.”

Over the last few years, Hope Not Hate has engaged in investigating, naming and deplatforming extremists. In my interview with Mulhall, he also mentioned Mark Collett, a well-known white supremacist and Holocaust denier, whose videos were up on YouTube. At the time of writing this essay, it seems Collett’s YouTube account has been removed. But there are many more extremists and white supremacists that Hope Not Hate has identified who still have active profiles on Facebook, Twitter, YouTube and other platforms.

So much of the issue of harmful content goes back to how, and whether, platforms proactively research content, and how they choose to moderate it. Melissa Ryan, a hate speech and policy researcher, explained to me how platforms make these decisions. She said,

“...the big platforms are still not willing to alter their business model to protect their users from hate, harm, and harassment online. There was a news story about Facebook just this week where they reduced content that was deemed bad for the world, but it also reduced engagement. So they ended up not doing it. And that's a choice that we have seen the platforms make again and again, because hate generates a lot of engagement and traffic. But what that does is, again, it puts the vast majority of their users at risk. It exposes them to hate, harassment, and harm. And that's what fundamentally the tech platforms have to do, or governments have to force them to do through regulation.”

Many Hands and Many People Make Hate Happen
In researching this project, I also interviewed Tomas van den Heuvel, an assistant curator at the Design Museum Den Bosch in the Netherlands. The museum had held an exhibition on the design of the Third Reich, which closed in March 2020, and I was curious about how they had handled showing such subject matter. Van den Heuvel said their intention was to show how designers were a part of the Third Reich machine, by showing blueprints of crematoriums and the designed, graphical propaganda posters. Architects sat down and made buildings: barracks, cafeterias, and gas chambers. For this particular project, I ruminated on that: on how many people it takes to build and maintain a system. There are many hands involved in the dissemination of white supremacy, and some of those hands are at YouTube and Facebook. Some of those hands are shaping the way we talk about harmful content, and some of those hands are afraid to talk about it.

As an artist and researcher, I wanted to highlight the ubiquity and casualness of online harm. There is white supremacist, disinformation, misinformation and extremist content in spaces like YouTube, Instagram or Facebook. This content is not hidden in some dark corner of the Internet or on a tiny website with little traffic. It is, instead, right here, on the big tech platforms where we watch TED talks, celebrity interviews, music videos, and makeup tutorials. It is all right there, and it’s easy to find.

‘Within the Terms and Conditions’ is part of a larger body of work I’ve been building that ruminates on online harassment and white supremacy on digital platforms, called Architectures of Violence (presented as a solo show opening 15 May, 2021 with Telematic Media Arts in San Francisco, CA).

Video Information

Category: Conspiracy Theories on COVID-19
7 videos uploaded from 2011 to 2020. One of the videos was originally posted in English, but it was removed and then reuploaded with the title changed to be in Spanish. None of these videos have been removed.

Category: Anti-Vax
5 videos uploaded from 2020 to 2021. 3 videos have now been removed.

Category: Anti, Anti, Anti: Anti Big-Technology, Anti-Islam, Anti-Immigration, Anti-Corona
11 videos uploaded from 2018 to 2021. None of these videos have been removed.

Category: Conservative Conspiracy Theories
6 videos uploaded from 2019 to 2021. None of these videos have been removed.

Category: General Misinformation
7 videos uploaded from 2020 to 2021. 2 videos are now private and 1 video has been removed.

Category: White Supremacy
11 videos uploaded from 2017 to 2021. 1 video is now private and 3 videos have been removed.

Category: Racialised Disinformation
9 videos uploaded from 2020 to 2021. None of these videos have been removed.

Category: Gender Based Violence and Harassment
20 videos uploaded from 2014 to 2021. 1 video has been removed.

Category: TERFs and anti-trans rhetoric
15 videos uploaded from 2016 to 2021. None have been removed.