Enabling Users to Raise Awareness for Breast Cancer

UPDATED

FEB 19, 2025

The Oversight Board overturned Meta’s decision to remove an Instagram post that featured images of nudity related to breast cancer. After the Board selected this case on Breast Cancer Symptoms & Nudity, Meta restored the content. Meta’s automated systems had originally removed the post for violating the company’s Community Standard on Adult Nudity and Sexual Activity. The Board found that the post was allowed under a policy exception for “breast cancer awareness” and that Meta’s automated moderation in this case raised important human rights concerns.

The Board issued several recommendations, including that Meta improve automated detection of images with text overlay so that posts raising awareness of breast cancer symptoms are not wrongly flagged for review. In response, Meta committed to refining these systems by continuing to invest in improving its computer vision signals, sampling more training data for its machine learning models, and relying on manual review where its automation is less confident about accuracy.

What was the impact of Meta’s implementation of this recommendation?

In response to the Board’s recommendation in January 2021, Meta committed to improving text-overlay detection to ensure that posts raising awareness of breast cancer symptoms are not removed through over-enforcement of its Adult Nudity and Sexual Activity (ANSA) Community Standards. Meta’s implementation team enhanced Instagram’s techniques for identifying breast cancer context in content via text and deployed them in July 2021. Those enhancements have been in place since then, and in one 30-day period alone (Feb 25 - March 27, 2023), they contributed to an additional 2,500 pieces of content being sent for human review that would previously have been removed.

[Figure: analysis of a 30-day time period in 2023]