LOS ANGELES, United States – Facebook users who
recently watched a video from a British tabloid featuring Black men saw an
automated prompt from the social network asking if they would like to “keep
seeing videos about Primates,” prompting the company to investigate and disable
the artificial intelligence-powered feature that pushed the message.
Facebook on Friday apologized for what it called “an
unacceptable error” and said it was looking into the recommendation feature to
“prevent this from happening again.”
The video, dated June 27, 2020, was by The Daily Mail and
featured clips of Black men in altercations with white civilians and police
officers. It had no connection to monkeys or primates.
Darci Groves, a former content design manager at Facebook,
said a friend had recently sent her a screenshot of the prompt. She then posted
it to a product feedback forum for current and former Facebook employees. In
response, a product manager for Facebook Watch, the company’s video service,
called it “unacceptable” and said the company was “looking into the root
cause.”
Groves said the prompt was “horrifying and egregious.”
Dani Lever, a Facebook spokesperson, said in a statement:
“As we have said, while we have made improvements to our AI, we know it’s not
perfect, and we have more progress to make. We apologize to anyone who may have
seen these offensive recommendations.”
Google, Amazon and other technology companies have been
under scrutiny for years for biases within their AI systems, particularly
around issues of race. Studies have shown that facial recognition technology is
biased against people of color and has more trouble identifying them, leading
to incidents where Black people have been discriminated against or arrested
because of computer error.
In one example in 2015, Google Photos mistakenly labeled
pictures of Black people as “gorillas,” for which Google said it was “genuinely
sorry” and would work to fix the issue immediately. More than two years later,
Wired found that Google’s solution was to censor the word “gorilla” from
searches, while also blocking “chimp,” “chimpanzee” and “monkey.”
Facebook has one of the world’s largest repositories of
user-uploaded images on which to train its facial- and object-recognition
algorithms. The company, which tailors content to users based on their past
browsing and viewing habits, sometimes asks people if they would like to
continue seeing posts under related categories. It was
unclear whether messages
like the “primates” one were widespread.
Facebook and Instagram, its photo-sharing app, have
struggled with other issues related to race. After July’s European Championship
in soccer, for instance, three Black members of England’s national soccer team
were racially abused on the social network for missing penalty kicks in the
championship game.
Racial issues have also caused internal strife at
Facebook.
In 2016, CEO Mark Zuckerberg asked employees to stop crossing out the phrase
“Black Lives Matter” and replacing it with “All Lives Matter” in a communal
space in the company’s Menlo Park, California, headquarters. Hundreds of
employees also staged a virtual walkout last year to protest the company’s
handling of a post from President Donald Trump about the killing of George
Floyd in Minneapolis.
The company later hired a vice president of civil rights and
released a civil rights audit. In an annual diversity report in July, Facebook
said 4.4% of its U.S.-based employees were Black, up from 3.9% the year before.
Groves, who left Facebook over the summer after four years,
said in an interview that a series of missteps at the company suggested that
dealing with racial problems wasn’t a priority for its leaders.
“Facebook can’t keep making these mistakes and then saying,
‘I’m sorry,’” she said.