The reasons biases against women or people of colour appear in technology are complex. They’re often attributed to incomplete data sets and the fact that the technology is often made by people who aren’t from diverse backgrounds. That’s one argument at least – and in a sense, it’s correct. Increasing the diversity of people working in the tech industry is important. Many companies are also collecting more data to make it more representative of the people who use digital technology, in the vain hope of eliminating racist soap dispensers or recruitment bots that exclude women.

The problem is that these are social, not digital, problems. Attempting to solve them through more data and better algorithms only serves to hide the underlying causes of inequality. Collecting more data doesn’t actually make people better represented; instead, it increases how much they are surveilled by poorly regulated tech companies. The companies become instruments of classification, categorising people into different groups by gender, ethnicity and economic class, until their database looks balanced and complete.

Doug Specht

Senior Lecturer in Media and Communications, University of Westminster
