Facebook has apologized for an incident in which its artificial intelligence systems labeled a video featuring Black men as monkey-related content, calling it an unacceptable mistake and saying it is reviewing the recommendation feature to prevent it from happening again.
According to The New York Times, users who watched the June 27 video, published by the British newspaper the Daily Mail, received an automated prompt asking whether they wanted to keep seeing videos about monkeys.
A company spokesperson said Facebook disabled the topic-recommendation feature entirely as soon as it realized what was happening, adding: "We apologize to anyone who may have seen these offensive recommendations."
The spokesperson added that this was clearly an unacceptable error and that the company is investigating the cause to prevent the behavior from recurring: "While we have made improvements to our AI, we know it is not perfect, and we have more progress to make."
Google, Amazon, and other tech companies have faced scrutiny for years over bias in their AI systems, particularly on issues of race.
Studies have shown that facial recognition technology is biased against people of color and is less accurate at recognizing them, which has led to incidents in which Black people were discriminated against or wrongly arrested because of a computer error.
In 2015, Google apologized after its Photos app labeled photos of Black people as gorillas.
Facebook apologizes for the video
Facebook has one of the world’s largest repositories of images from which to train facial and object recognition algorithms.
The company, which customizes content for users based on their past browsing and viewing habits, sometimes asks whether they would like to continue seeing posts in related categories.
Facebook and its photo-sharing app, Instagram, have faced other race-related issues. After the European Football Championship in July, for example, members of the England national football team were subjected to racist abuse on the social network following the team's penalty-shootout loss in the tournament's final.
Racial issues have also caused internal conflict at the company. In 2016, CEO Mark Zuckerberg asked employees to stop crossing out the phrase Black Lives Matter and replacing it with All Lives Matter in a shared space at the company's headquarters in Menlo Park, California.
Hundreds of employees also staged a virtual walkout last year to protest the company's handling of a post by President Donald Trump about the killing of George Floyd in Minneapolis.
The US Federal Trade Commission warned in April that AI tools exhibiting racial and gender bias may violate consumer protection laws if they are used to make decisions about access to credit, housing, or employment.