Philipp Schmitt's 'Declassifier' uses a computer vision algorithm trained on COCO, an image dataset released by Microsoft in 2014. In the work, photographs from Schmitt’s series 'Tunnel Vision' are run through the algorithm and overlaid with the very images that were used to train it in the first place.
In doing so, Schmitt exposes the myth of the magically intelligent machine: the visual data from which machine learning algorithms learn to make predictions is hardly ever shown, let alone credited.
'Epic Hand Washing in the Time of Lost Narratives' by xtine burrough and Sabrina Starnaman is a speculative remix that confronts Epic Kitchens, a dataset of first-person cooking videos, with quotations from literature written during or about earlier pandemics, such as the bubonic plague and the influenza pandemic of 1918–19. The project reveals the arbitrariness of what gets preserved as information and highlights the constructed nature of digitised materials. Blurring the lines between art and archive, or information and dataset, the work furthers discourse about the digital dataset as an authority of knowledge curation.
'Lacework' is a new work by Everest Pipkin that uses artificial neural networks to reinscribe the videos of MIT’s Moments in Time dataset. Using algorithms that stretch time and add detail to images, Pipkin creates a series of hallucinatory slow-motion vignettes from the videos of everyday actions that make up the collection. By manipulating the dataset's source videos, 'Lacework' presents a river of these moments, as if captured in amber, flowing from one to another in a cascade of gradual, unfolding detail.