
UK police forces lobbied to use biased facial recognition technology


Exclusive: System more likely to suggest incorrect matches for images of women and Black people

Police forces successfully lobbied to use a facial recognition system known to be biased against women, young people, and members of ethnic minority groups, after complaining that another version produced fewer potential suspects.

UK forces use the police national database (PND) to conduct retrospective facial recognition searches, whereby a "probe image" of a suspect is compared to a database of more than 19 million custody photos for potential matches.
Read the full article at: The Guardian →