Recently, a number of Facebook users who watched a video from a British tabloid featuring Black men received an automated prompt from the platform asking if they would like to keep seeing videos about primates. Facebook apologized for the “unacceptable error”, paused all the artificial intelligence features that produce such recommendations, and opened an investigation into what had happened.
The company also said it would examine its recommendation feature to prevent similar situations from happening again in the future.
The video in question, dated June 2020 and posted by the British tabloid The Daily Mail, featured several clips of Black men in confrontations with police officers or white civilians; it had no connection to primates of any kind. Darci Groves, a former content design manager at the company, said that a friend had sent her a screenshot of the automated prompt.
She then posted the screenshot to a product feedback forum used by current and former Facebook employees. A product manager for Facebook Watch, the company’s video service, responded by calling the prompt “unacceptable” and saying the company would look into its root cause. A Facebook spokesperson also released a statement saying the company has been continuously working to improve its artificial intelligence technology, but that there is still plenty of progress to make.
Other tech companies, such as Amazon and Google, have faced criticism for years over racial biases in their own artificial intelligence systems. Numerous studies have shown that facial recognition technology is significantly less accurate at identifying people of color, and this shortcoming has led to incidents in which Black people were discriminated against or even wrongly arrested because of an AI error.
Both Facebook and its photo-sharing platform Instagram have struggled with other race-related issues in the past. For example, three Black members of the England national soccer team were subjected to racial abuse on Instagram after missing penalty kicks during the European Championship final this summer.
Facebook has faced internal racial issues too. Back in 2016, CEO Mark Zuckerberg asked employees to stop crossing out the phrase “Black Lives Matter” and writing “All Lives Matter” in its place in the communal spaces of Facebook’s facilities. The company has since hired a vice president of civil rights and published a civil rights audit.
During the pandemic last year, Facebook employees staged a virtual walkout, in solidarity with the Black Lives Matter protests taking place around the country, over the company’s handling of a post by President Donald Trump about the murder of George Floyd.