- cross-posted to:
- technology@lemmy.world
Retailers are increasingly using facial recognition software to patrol their stores for shoplifters and other unwanted customers.
This is the best summary I could come up with:
A man was sexually assaulted in jail after being falsely accused of armed robbery due to a faulty facial recognition match, his attorneys said, in a case that further highlights the dangers of the technology’s expanding use by law enforcement.
Harvey Murphy Jr., 61, said he was beaten and raped by three men in a Texas jail bathroom in 2022 after being booked on charges he’d held up employees at gunpoint inside a Sunglass Hut in a Houston shopping center, according to a lawsuit he filed last week.
A representative of a nearby Macy’s told Houston police during the investigation that the company’s system, which scanned surveillance-camera footage for faces in an internal shoplifter database, found evidence that Murphy had robbed both the Macy’s and the Sunglass Hut, leading to his arrest.
The company said in a previous statement that it uses “facial recognition in conjunction with other security methods in a small subset of Macy’s stores with high incidences of organized retail theft and repeat offenders.”
But the technology’s accuracy is highly dependent on technical factors — the cameras’ video quality, a store’s lighting, the size of its face database — and a mismatch can lead to dangerous results.
The Federal Trade Commission last month said the pharmacy chain Rite Aid had misused its facial recognition system in a way that led to shoppers being falsely accused of theft, including in confrontations with police.
The original article contains 805 words, the summary contains 230 words. Saved 71%. I’m a bot and I’m open source!