San Francisco, United States – Meta was reviewing a call by its oversight board to make its policies on adult nudity more inclusive, a spokesperson said Thursday, after the tech giant removed two Instagram posts showing transgender and non-binary people with their chests bared.
In a statement released earlier this week, the board said neither post violated Meta’s policies on adult nudity and that it had overturned the company’s decision to remove them.
A Meta spokesperson told AFP Thursday that the company welcomed the board’s move and had already restored the images, agreeing they should not have been taken down.
But the board seized the opportunity to call on Meta, which also owns Facebook, to make its broader policies on adult nudity more inclusive.
The current policy “prohibits images containing female nipples other than in specified circumstances, such as breastfeeding and gender confirmation surgery,” the oversight board wrote in its decision.
That policy, it continued, “is based on a binary view of gender and a distinction between male and female bodies,” and results in “greater barriers to expression for women, trans and gender non-binary people on its platforms.”
It called for Meta to evaluate its policies “so that all people are treated in a manner consistent with international human rights standards, without discrimination on the basis of sex or gender.”
The Meta spokesperson said the company was reviewing that request, which echoes calls activists have made for years.
“We are constantly evaluating our policies to help make our platforms safer for everyone,” the spokesperson said.
“We know more can be done to support the LGBTQ+ community, and that means working with experts and LGBTQ+ advocacy organizations on a range of issues and product improvements.”
“We have given Meta food for thought,” oversight board member Helle Thorning-Schmidt, a former prime minister of Denmark, said Thursday during a forum on Instagram.
“It’s interesting to note that the only nipples not sexualized are those of men or those who have been operated on.”
“Over-policing of LGBTQ content, and especially trans and nonbinary content, is a serious problem on social media platforms,” a spokesperson for advocacy group GLAAD told AFP.
“The fact that Instagram’s AI system and human moderators repeatedly identified these posts as pornographic and as sexual solicitation indicates serious failures with regard to both their machine learning systems and moderator training.”
Meta said it would publicly respond to each of the board’s recommendations on the matter by mid-March.