Most police departments don't have the resources to sift through all their body-cam footage, meaning most of it remains unreviewed and unexamined. According to Axon, a company...
An AI art website I use illustrates your point perfectly with its attempt at automatic content filtering. Tons of innocent images get flagged, while problem content often slips through and has to be removed manually. Relying on AI to catch everything, without false positives, is a recipe for disaster.
I really don’t think it’s better than nothing. Put a biased AI in charge of reviewing footage and now they have an excuse to claim they’re doing the right thing instead of doing nothing, even though what they’re doing is actually worse.
Still better than what we have now, where the footage usually isn’t reviewed at all.