Meta Oversight Board finds plenty of flaws with Facebook's content moderation

Surprise, surprise: Facebook doesn't always make the right decision.
The Oversight Board wants Meta to do better. Credit: Hakan Nural/Anadolu Agency via Getty Images

Facebook's content moderation systems are clearly in need of repair.

On Thursday, Meta's Oversight Board announced that it had reversed two of Facebook's decisions to remove content from its platform. The independent group's conclusions point to major flaws in two areas of Facebook's content moderation: the automated systems the platform uses to take down content and the removal of newsworthy content by human moderators.

The first case from the Oversight Board concerns a Facebook user in Colombia who, in September 2020, posted a cartoon depicting police brutality by Colombia's National Police. Facebook removed the post 16 months later, when the company's automated systems matched the cartoon to an image stored in a Media Matching Service bank.

The Oversight Board determined Facebook was wrong to remove the post because the image did not violate Facebook's rules and should never have been added to the Media Matching Service bank.

And, according to the Oversight Board, this user wasn't the only one affected. In total, 215 users appealed the removal of posts that included this image, and 98 percent of those appeals to Meta were successful. Even so, the cartoon remained in the bank and continued to trigger automated detections and subsequent post removals. Meta only removed the image from the Media Matching Service bank once the Oversight Board decided to take up this particular case.


In the second case, the Oversight Board determined Meta wrongly removed a news post about the Taliban. In January 2022, an India-based newspaper posted a link to an article on its website about the Taliban's announcement that it would reopen schools for women and girls. Meta determined the post violated its Dangerous Individuals and Organizations policy, construing it as "praise" of the Taliban.

As a result, Meta removed the post and limited the newspaper's access to certain Facebook features, such as livestreaming. The newspaper attempted to appeal the decision, but the appeal was never reviewed because the company lacked Urdu-speaking reviewers.

Once again, when the Oversight Board decided to take the case, Meta reversed its decision, restored the content, and removed the limitations on the newspaper's Facebook Page. Simply reporting on newsworthy events is not a violation of Facebook's policies, the Oversight Board determined.

While the users affected in these specific cases may be small in number or reach, the Oversight Board used the opportunity to recommend broader changes to Facebook's content moderation systems, both automated and human-reviewed.

Founded in 2018, the Oversight Board was created to serve as something of a Supreme Court for Meta's content moderation decisions. The organization released decisions on its first cases in January 2021. One of those early rulings was heavily criticized because it called for the restoration of a removed post that Muslim activist groups deemed hate speech. But the Oversight Board's most notable case to date has easily been its decision to uphold Meta's suspension of Donald Trump from Facebook. The former president was suspended from the platform following the violent riot at the Capitol building on Jan. 6.

The Oversight Board's decision did, however, force Meta to set a timeframe for Trump's suspension. Shortly after that 2021 ruling, Meta announced it would consider allowing Trump back on its platforms in January 2023. That may have sounded far off back in June 2021, but it's now just a few months away. If and when Trump returns to Facebook next year, don't be surprised to see his name on an Oversight Board case or two...or twenty.


