
AI & Facial Recognition wrongly identifies innocent shopper during an emergency chocolate purchase.

George Little
3 min read · Jun 5, 2024

Why the EU AI Act got its restrictions on this technology spot on.

Image Credit: Leonardo.AI

Imagine walking into a store, just needing some chocolate to unwind after a long day, only to be accused of theft within minutes. This isn't a dystopian movie plot but a real-life incident reported by the BBC. Sara faced this nightmare at a Home Bargains store when a facial recognition system called Facewatch wrongly identified her as a shoplifting suspect. After a humiliating bag search, she was banned from all stores using the technology, leaving her distraught and fearful of being labeled a thief despite never having stolen anything.

Facewatch is used by various UK retailers to combat shoplifting, but Sara's experience reveals the darker side of this technology. While Facewatch later admitted its error, the damage was done.

It’s not just stores. In Bethnal Green, East London, police use modified vans with cameras capturing thousands of facial images. These images are checked against watchlists, leading to arrests. The Metropolitan Police claim this technology helps them catch criminals quickly. However, civil liberties groups raise valid concerns about its accuracy and potential misuse.
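The accuracy concerns above come down to how these watchlist checks work: a system reduces each face to an embedding vector and flags a "match" whenever the similarity to an enrolled face crosses an operator-chosen threshold. The sketch below is purely illustrative (the vectors, names, and threshold are invented, and real systems use embeddings with hundreds of dimensions), but it shows how a lenient threshold can flag an innocent lookalike as a watchlist match:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors: 1.0 means identical
    # direction, values near 0 mean unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical, tiny embeddings for illustration only.
watchlist_entry = [0.90, 0.10, 0.30]    # enrolled "suspect"
innocent_shopper = [0.88, 0.15, 0.28]   # a lookalike passer-by

MATCH_THRESHOLD = 0.95  # operator-chosen; lower = more false positives

score = cosine_similarity(watchlist_entry, innocent_shopper)
is_flagged = score >= MATCH_THRESHOLD  # True here: a false positive
```

With these invented numbers the similarity comes out around 0.998, comfortably above the threshold, so the innocent shopper is flagged. This is why the choice of threshold (and who audits it) matters so much: there is no setting that eliminates false matches without also missing genuine ones.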

For instance, Shaun Thompson was mistakenly identified and detained by police due to a facial recognition…


Written by George Little

🚀 Exploring the tech landscape at IBM 🤖 | Passionate about Ethical AI 🌐 | HR Tech Enthusiast | Retro-Tech Lover 📼 | Foodie & Family Life. Views are my own.
