23.02.2026 21:00
A 67-year-old man in Chester, England, was forcibly removed from a store after an AI-powered facial recognition system flagged him as a 'thief'. A subsequent investigation revealed that no theft had occurred, sparking debate about the increasingly common use of AI-powered security systems in the country.
Artificial intelligence-based security applications, which are spreading rapidly through the retail sector, have now targeted an innocent customer. Ian Clayton, a 67-year-old resident of Chester, was ejected from a store during an ordinary shopping trip after the AI-based system raised a faulty alert.
AI SAID 'THIEF', AND HE WAS EJECTED FROM THE STORE
The incident occurred at a branch of the discount chain Home Bargains. The facial recognition system used in the store, which analyzes customer movements and notifies staff of potential risks, marked Clayton as a 'suspicious person' while he was shopping. Store employees then forcibly removed him. A subsequent investigation, however, revealed that no theft had taken place.
FOOTAGE DELETED FROM THE SYSTEM
Clayton, who shared his experience with the BBC, said he felt deeply embarrassed when store employees asked him to leave, and described feeling 'helpless'. When he contacted the security company, he was told the system held a record indicating he had "stolen by putting products in his bag." When he asked to see the footage, however, the company informed him that it had been "permanently deleted from the system."
SPARKED DEBATE IN THE COUNTRY
The incident alarmed privacy campaign groups. Silkie Carlo, director of Big Brother Watch, warned that private companies profiling individuals based on biometric data could have serious consequences.
In recent months, other cases have emerged of individuals being removed from stores due to incorrect matches. Experts note that although the margin of error may seem low, even a single false match in such systems can have a lasting impact on the person affected.