Last Updated on 22/11/2021 by Khushi
5th September 2021
Recently, a video from a British tabloid featuring Black men triggered an automated prompt from Facebook asking viewers whether they wanted to “keep seeing videos about Primates.” The prompt offended many viewers and led to complaints, prompting the company to investigate and disable the artificial intelligence-powered recommendation feature responsible.
The Daily Mail had published the video on June 27, 2020; it showed clips of Black men in altercations with white individuals and police officers. It had nothing to do with apes or monkeys.
On Friday, Facebook called it an “unacceptable error,” apologized for the mistake, and said it would examine the recommendation feature to “prevent this from happening again.”
Darci Groves, a former content design manager at Facebook, said she had recently received a screenshot of the prompt from a friend. She then shared it in a product feedback forum for current and former Facebook employees. A product manager for Facebook Watch, the company’s video service, responded by calling it “unacceptable” and saying the firm was “investigating the core cause.”
Dani Lever, a Facebook spokesperson, said in a statement: “As we have said, while we have made improvements to our AI, we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”
For years, Google, Amazon, and other technology corporations have faced criticism over biases in their AI systems, particularly in relation to race. Studies have shown that facial recognition software is biased against people of color and has more difficulty identifying them, which has led to instances where Black individuals were discriminated against or wrongly arrested because of a computer error.
Facebook’s facial- and object-recognition algorithms are trained on one of the world’s largest libraries of user-uploaded photos. The company, which personalizes content for users based on their browsing and viewing habits, sometimes asks whether they want to see more posts in related categories. It was unclear how widely prompts like the “primates” one were distributed.
Facebook and Instagram, its photo-sharing service, have been plagued by other race-related problems as well. In July, for example, three Black players on England’s national soccer team were racially abused on social media after missing penalty kicks in the European Championship final.
Racial concerns have also fueled internal conflict at Facebook. In a shared space at the company’s Menlo Park, California, offices, CEO Mark Zuckerberg instructed staff to stop crossing out the phrase “Black Lives Matter” and replacing it with “All Lives Matter.”
Facebook needs to address these errors in its recommendation systems and rectify them before they cause far greater harm.