Monday 4 May 2026 — Edition #4

RainbowNews

The global platform for LGBTQ+ news, analysis and stories. Independent and inclusive.


Meta Ordered to Restore Lesbian Content After Oversight Ruling

Meta's Oversight Board ruled the company wrongly removed lesbian Instagram posts. What does this mean for content moderation rights?

RainbowNews Redactie — May 6, 2026 — International — 3 min read

Photo: RainbowNews Editorial

Meta's Oversight Board has ruled that the company wrongly removed lesbian content from Instagram. The board said the takedowns violated Meta's own rules. Meta must now restore the posts and review its moderation systems. The decision was published in late April 2026.

The Oversight Board is an independent body that reviews difficult content decisions made by Meta. Its rulings on individual posts are binding. Its policy recommendations are not, but Meta usually responds to them within 60 days.

What the case was about

The case involved several Instagram posts by lesbian users. The posts used words such as "dyke" and "lesbian" in a positive, self-affirming way. Meta's automated systems flagged the content as hate speech. Human moderators then confirmed the removals.

Users appealed the decisions. They said the words were used as self-description, not as insults. The Oversight Board agreed. According to the ruling, Meta failed to recognise reclaimed language. Reclaimed language is when a group uses a former insult in a positive way.

The board wrote that the removals limited free expression. It also said the moderation harmed visibility for lesbian users. Meta's own policy allows reclaimed slurs when the speaker belongs to the group.

Background: how content moderation works

Meta uses a mix of automated tools and human reviewers. Automated systems scan billions of posts each day. They look for words and images that may break the rules. Flagged posts go to human moderators or are removed directly.

The system has known problems. Automated tools often miss context. A word can be hateful in one post and supportive in another. LGBTQ+ groups have complained about this for years. They say their content is removed more often than other content.

In 2021, the LGBTQ+ media advocacy group GLAAD reported that LGBTQ+ posts were taken down at higher rates. Meta said it was working on the issue. Several research reports since then show the problem continues.

What the ruling means in practice

Meta must restore the specific posts in the case. The company must also explain what it will change. The Oversight Board asked Meta to:

  • Improve detection of reclaimed language
  • Train moderators on LGBTQ+ context
  • Be more transparent about removal reasons
  • Give users clearer appeal options

The board's recommendations are not legally binding. But Meta has agreed to respond publicly. In past cases, Meta has accepted most policy advice.

Legal framework in Europe

The ruling comes at an important moment. The European Union's Digital Services Act (DSA) is now fully active. The DSA requires large platforms to handle content fairly. Platforms must explain removals. Users must have a clear appeal process.

Article 17 of the DSA states that users must receive a reason when content is removed. Article 20 requires an internal complaint system. Article 21 gives users the right to outside dispute settlement.

The European Commission can fine platforms up to 6 percent of global turnover for serious breaches. Meta is one of the platforms under direct EU supervision.

Reactions from both sides

LGBTQ+ groups welcomed the ruling. ILGA-Europe said the decision shows that automated moderation often fails. The group called for clearer rules across all platforms. GLAAD said Meta must move faster on its promises.

Meta said it accepts the board's decision on the individual posts. A spokesperson said the company will study the wider recommendations. Meta noted that it removes millions of posts every week. The company said mistakes can happen at that scale.

Some free speech groups also commented. They said the ruling shows the risks of strict moderation rules. They want platforms to remove less content, not more. Other groups said Meta still does too little against real hate speech.

What users can do

Users who believe their content was wrongly removed have several options. The first step is the appeal button inside the app. Meta must respond within a set time.

If the appeal fails, EU users can go to a certified out-of-court body. These bodies are listed on the European Commission's website. The service is free or low-cost for users. Decisions are not binding but carry weight.

Users can also file a complaint with their national digital services coordinator. In the Netherlands, this is the Authority for Consumers and Markets (ACM). In Germany, it is the Bundesnetzagentur.

Wider context

The case is one of several about LGBTQ+ content online. Earlier this year, the Oversight Board ruled on transgender posts. In that case, the board also found Meta's rules unclear. TikTok and YouTube face similar complaints.

Researchers at the University of Amsterdam studied 10,000 LGBTQ+ posts in 2024. They found that 14 percent were wrongly removed or hidden. The same study found lower error rates for non-LGBTQ+ content.

Looking ahead

Meta has 60 days to respond to the policy recommendations. The Oversight Board will publish that response. The European Commission is also watching the case. It may use the findings in its own DSA reviews.

For now, the ruling sets a clear marker. Platforms must take context into account. Reclaimed language is not the same as hate speech. How well Meta applies this in practice will become clear in the coming months.


RainbowNews Redactie

Editor

Part of the RainbowNews editorial team.
