A cold draught in a hot medium

February 2019

Constant Dullaart’s (Netherlands, 1979) practice reflects on the broad cultural and social effects of communication and image processing technologies, from performatively distributing artificial social capital on social media to completing a staff-pick Kickstarter campaign for a hardware start-up called Dulltech™.


The Kayapo people, indigenous to the Amazon rainforest, have an expression for taking a photograph: “akaron kaba”, which means not only “to take a photo” but also “to steal a soul”. And perhaps there is some truth in this.

The visual representation of human identities combined with ‘psychographic micro-targeting’ has become an industry so large that it dwarfs the lonely photographic portrait, manifested on a thin light-sensitive medium. Images now published on social media are valorised in terms of distribution and quantifiable interactions, particularly when triangulated with data about a user’s online purchases or social media behaviour. This process shapes visual representations of human identities into ‘data images’ outside the control of the person the data originates from. These identity images, or profiles, are bought and sold without our knowing, and provide more insight into our behaviour and motivations than we perhaps have ourselves. The industry creating falsified or manipulated profiles is flourishing: authoring identities on an industrial scale.

In the fall of 2014 I purchased 2.5 million Instagram followers for US$5,000 as part of a commissioned artwork I made for the Jeu de Paume in Paris and HMKV Dortmund. This represents one fifth of a cent per Instagram follower: a social media metric copied, cloned, commodified, sold.

In this project I wanted to highlight how the medium of photography is changing due to the quantified distribution and valuation of images. Instagram offers a marketplace for aesthetics mixed with social dynamics, a measurable popularity contest based on a sequence of images. The distribution of images, and the measurement of social interactions around them, is increasingly outsourced to savvy agents with commercial intent. These agents have an interest in generating a system of rewards (likes, follows, comments) to keep users active on a platform; users are in turn primed to generate more content to harvest more social validation and rewards. In 'The Regime of Visibility', Camiel van Winkel describes the impact of this visual/social feedback loop as follows:

Not only is my sense of reality destabilized by the exclusive domination of visual stimuli; under the present circumstances it is even becoming more difficult to determine what ‘sense of reality’ actually means.

The neurotransmitter dopamine is released in our brains to reward behaviour, and cannot be stockpiled or hoarded. However, quantified social media rewards can be stockpiled on popular social media accounts. The measuring systems that reveal these rewards are corrupted and flawed: likes, followers, comments and entire profiles can be purchased to manipulate social metrics.

The sheer amount of money generated by this type of participatory mass media will continue to provide the incentive for a growing industry creating artificial identities. A continuous battle to recognise the origin of a user account - be it fake, crafted, hijacked or real - is being waged between the detection protocols of social media companies and those trying to circumvent these protocols in order to trade in the currency of the attention economy. And although agents like Facebook or Twitter publicly battle the fake account industry, the value of these companies is directly based on the number of active users, leaving little incentive to delete these 'alternate' accounts so long as they stay unnoticed. For those selling such accounts, it therefore becomes important to ensure faux identities look like real people, with appropriate content, images, preferences and interactions. What results is a shadow world of falsified profiles, legitimated through images, fulfilling our human desire for social rewards, real or fake. The commercial trade in these metrics of social validation has hijacked and undermined our semiotic understanding of media, confusing retweets with agreement, and likes with approval. This has the effect of undermining our perception of one of the most dominant value systems: popularity.

The practicalities of creating and maintaining this ‘shadow world’ reveal much about the way we read images and how we understand the representation of identity online. The profile images and photo feeds of fake accounts are easily ‘scraped’, or downloaded, from social media platforms, or via simple Google Image searches. The compiled fake profiles are managed by multiple people, who can write or upload original content to accounts on request, in order to influence a public debate or serve a commercial interest. Now that it’s possible to purchase verified Instagram and Twitter accounts (the ones with the blue tick), fake accounts have become even harder to distinguish from ‘real’ accounts. ‘Spoofed attention’ has become a commodity: an identifiable, trackable, interchangeable unit which can be traded and sold.

The market for these profiles reveals an emerging geo-political value system of crafted identities. For example, a Russian Instagram account costs about eight US cents, while a UK account will set you back thirty cents. 'Aged' Facebook Phone Verified Accounts (using a real sim card, not a virtual number) start around $1 with a complete profile aged 1-3 years costing $5 to $10. Aged, seasoned and targeted accounts can go up to $50 or more. Age is defined by the date the account was registered, whilst seasoning defines the amount of 'normal' user behaviour an account has engaged in, making it more difficult for Facebook to detect the origin of an account as human or machine.

Targeted accounts are used to 'target' specific geographic audiences, often registered and verified through cheap local SIM cards and accessed through dedicated proxies (geo-located internet gateways). These accounts are traded on marketplaces serving people who sell services to create more online attention, or to monetise real and artificial attention: so-called Search Engine Optimisation (SEO) specialists, for example. The people trading on these marketplaces come from all over the world, but a reliance on very low-wage workers to create such profiles means they are manufactured in countries like Bangladesh, India, and the Philippines. For example, I can buy an aged, seasoned and US-targeted Facebook account for $50 made by a person whose own Indian Facebook account will never be worth more than 20 cents. But what does that mean for the value of the circulated image, or our capacity to understand these images?

There is a specific technical history to how these profiles have been assembled. In 2014 the images used to compile these profiles were typically scraped from social media or search engines. For example, one could search for 'gender + swimwear + happy' on Google Images and download the first 500 image results to populate fake profiles with the right mix of generic Instagram content. These image sets could also (at that time) be bought already compiled into categories based on gender and geographic location. This approach left tell-tale traces. First, it frequently produced visual incongruities between an account’s content and its user profile, due to the cultural differences between the creator of the profile and the culture the profile was targeting. Second, the methods used to categorise and pre-filter the images could not account for an image’s quality, its orientation, or the people it depicted. At a time when Instagram still forced the square image format, this also resulted in many crafted accounts showing awkwardly cropped images, downloaded from other sources, posing as original content on Instagram.
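That square-crop artefact is easy to picture in code. The sketch below is illustrative Python, not the tooling any profile farm actually used: it computes the naive centre crop that forces a photo of arbitrary proportions into Instagram's old 1:1 format, discarding whatever falls outside the box.

```python
def square_crop_box(width: int, height: int) -> tuple:
    """Return the (left, top, right, bottom) centre-crop box that forces a
    photo into a 1:1 square, as early Instagram did.

    Whatever falls outside the box is discarded, which is why scraped
    landscape or portrait shots reposted by crafted accounts often
    looked awkwardly truncated.
    """
    side = min(width, height)     # the square can only be as big as the short edge
    left = (width - side) // 2    # trims the left/right edges of a landscape photo
    top = (height - side) // 2    # trims the top/bottom of a portrait photo
    return (left, top, left + side, top + side)

# An 800x600 landscape loses 100 pixels on each side:
# square_crop_box(800, 600) -> (100, 0, 700, 600)
```

With an imaging library such as Pillow, a tuple like this can be passed directly to `Image.crop`; done blindly and in bulk, it is exactly the operation that cuts heads off portraits and edges off landscapes.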

As machine learning becomes more popular and accessible, new tools have become available for those manufacturing artificial profiles. The latest smartphones connected to a Google account can access free unlimited photo storage and can use image recognition to organise an image library and classify ‘people, places and things’. This new generation of artificial accounts can publish a convincing stream of perfectly sorted images, showing several images taken moments after each other of the same situation or person, but uploaded hours or days apart. Because each snapshot has a clear relation with the next, it becomes harder to distinguish a real profile from a crafted one.

Many of the profiles used in commodified social transactions are therefore becoming almost impossible to discern as 'fake'. With the help of machine learning and automated interactions, a fake profile will be so real people won't mind it might not be. But apart from the 'fake' profiles spewing out advertisements for sunglasses or other merchandise, it’s important to remember that many 'fake' interactions are in fact generated by actual 'real' accounts. These authentic, or so-called 'stealth', profiles represent the hacked accounts of actual people, hijacked to distribute likes or comments. Once an account has been unlocked by brute force (using login/password combinations leaked from other hacked online platforms), the profile can be used to sell its social transactions without the owner knowing. Or perhaps the profile’s owner has willingly given access away to a service providing free followers, likes or comments.

Although I have found these ‘identity images’ to be very interesting artistic material, the difference between real and fake identity images has become more and more opaque, and it’s increasingly unclear if an identity image is spoofed, crafted, hijacked, or authentic. After I set up my Facebook army a year before the 2016 American presidential election, I began to use these questionable profiles to distribute poems in the comments of the Instagram feeds of selected public organisations. In this way I’m interested in using materials that are diminishing the value systems we have to understand the image in order to reframe another image. This reframing of ‘value’ is crucial at a time when we feel surrounded by social ‘shadows’ with unclear intent, leading us to doubt our own agency, or to wonder to what extent our social behaviour has been primed or manipulated.

So perhaps our images have stolen our souls. I recently heard Moby's 1995 track 'I'm feeling so real' again, and remembered why I always disliked it so much. That sentiment remains as strong today as it did then (although at the time it was mostly drug-fuelled). It would have been so much better if Rozz Morehead had sung: “I'm feeling so unreal”. Which leaves me thinking that we have a new responsibility to redefine ‘reality’ as a fluid concept. Something we should do as intensely as possible, before artificial intelligence can generate hi-res photorealistic images, or laws are passed forbidding alternate explanations of reality (fake news) and pseudonyms (fake accounts).