Facebook panel overturns 4 content takedowns in first ruling
January 29, 2021 10:41 am
Facebook’s quasi-independent oversight board issued its first rulings on Thursday, overturning four out of five decisions by the social network to take down questionable content.
But critics called the announcement largely irrelevant given the flood of misinformation, extremism and racism that remains on Facebook despite the company’s efforts in recent years.
“The whole thing is kind of like putting new windows on a house in which the roof has caved in,” said Gautam Hans, a Vanderbilt University expert on civil liberties and intellectual property. “The (oversight board) can’t do very much — it selects a tiny percentage of potential cases — to fix a company with so many systemic and in my opinion unfixable problems.”
Nonetheless, Hans said he respects the effort and believes “there are some clear distinctions” between what the oversight board thinks the standards should be and what the company does.
The social media giant set up the oversight panel to rule on thorny issues about content on its platforms, in response to furious criticism about its inability to respond swiftly and effectively to misinformation, hate speech and nefarious influence campaigns.
Facebook regularly takes down thousands of posts and accounts, and users have appealed about 150,000 of those cases to the oversight board since it launched in October. The board is prioritizing the review of cases that have the potential to affect many users around the world.
In its initial batch of rulings, the board ordered Facebook to restore posts by users that the company said broke standards on adult nudity, hate speech, or dangerous individuals.
One case, in which a Brazilian user’s Instagram post about breast cancer was automatically removed because it included images of female nipples, should have been allowed because the platform makes an exception for breast cancer awareness, the board said.
A Myanmar user’s Burmese-language Facebook post about Muslims that included two widely shared photos of a dead Syrian toddler was offensive but did not rise to the level of hate speech, it ruled.
The human rights group Muslim Advocates lambasted the decision, saying the board “bent over backwards to excuse hate in Myanmar — a country where Facebook has been complicit in a genocide against Muslims.”