When a machine moderates content, it evaluates text and images as data, using an algorithm trained on existing data sets. The process for selecting that training data has come under fire because those data sets have been shown to encode racial, gender, and other biases.
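To make the mechanism concrete, here is a minimal sketch of how a trained moderation model inherits bias from its labels. The data, word-counting approach, and tie-free queries are all invented for illustration; real systems use far larger corpora and learned models, but the failure mode is the same: if human labelers disproportionately marked one group's posts for removal, the model learns that association.

```python
from collections import Counter

# Toy training set. The "remove" label is overrepresented for posts
# mentioning one group, mimicking a biased labeling process.
# All examples are invented for illustration.
train = [
    ("motorcycle club ride", "keep"),
    ("biker rally photos", "keep"),
    ("gang violence threat", "remove"),
    ("gang meetup tonight", "remove"),
    ("gang charity drive", "remove"),  # benign post, biased label
]

def word_scores(data):
    """Count how often each word appears under each label."""
    counts = {"keep": Counter(), "remove": Counter()}
    for text, label in data:
        counts[label].update(text.split())
    return counts

def classify(text, counts):
    """Naive vote: the label whose training words overlap the text most wins."""
    scores = {
        label: sum(c[w] for w in text.split())
        for label, c in counts.items()
    }
    return max(scores, key=scores.get)

counts = word_scores(train)
# A benign post is flagged purely because "gang" was biased in the labels:
print(classify("gang charity bake sale", counts))        # -> remove
print(classify("motorcycle rally bake sale", counts))    # -> keep
```

The point of the sketch is that nothing in the classifier itself is prejudiced; it faithfully reproduces whatever skew the labeled training data contained.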
Only it leaves pictures of the Hells Angels alone. Clearly there's an issue here.