The software was trained mostly on photos of white men, so it struggles to distinguish the faces of non-white women. A hidden prejudice has crept in, one that in practice can even lead to the misidentification of criminal suspects. But the spectrum of digitised prejudices and their effects extends much further. Biased artificial intelligence can deny applicants a mortgage, fire employees or expose citizens to arbitrary power. A group of activists, experts and victims of algorithmic discrimination shows how prejudices from the analogue world not only fail to vanish with digitisation, but insidiously grow in scope and severity.