Meta launches MENA campaign against child abuse

Facebook and Instagram stand accused globally of ignoring the mental trauma teenagers sustain through their use of these social media platforms.
  • Facebook and Instagram’s parent Meta has launched the “Report it, Don’t Share it” campaign in the MENA region
  • Facing allegations that it failed to act against child abuse and exploitation, Meta said the campaign will help prevent exactly those harms

Facebook and Instagram, at the center of allegations that they turned a blind eye to teenagers being abused or sustaining mental trauma on their platforms, have launched a new campaign against such abuse in the Middle East and North Africa region.

The campaign, formally announced by the platforms’ parent company Meta, aims to raise awareness of child abuse on the internet.

It is intended to educate people that sharing child abuse material, even out of anger or condemnation, causes further harm to the child and is illegal.

Victims in MENA

According to a survey conducted by the WeProtect Global Alliance, nearly half of people aged 18 to 20 in the Middle East have been victims of online abuse and exploitation, with many of them taken advantage of after sharing compromising images online.

Over 5,000 people between the ages of 18 and 20 from 54 countries were surveyed about their online experiences as children, revealing that 44 percent of respondents in the MENA region had been a victim of online sexual abuse as a child.

The report’s primary recommendations include tighter regulation, increased transparency in technology companies’ online security tools, and increased investment in law enforcement.

Meta’s approach

Meta has reportedly acknowledged that, for the company to accomplish its objectives, Facebook and Instagram users must feel safe on the platforms.

That, it said, is why it launched the “Report it, Don’t Share it” campaign in the region.

During the launch, David Miles, Head of Safety at Meta EMEA, and Tara Hopkins, Director of Policy at Instagram EMEA, encouraged anyone who sees content they think breaks the company’s rules to report it using in-app reporting tools.

They also promised that these reports would be prioritized for review.

Meta said preventing child abuse on its platforms has been a top priority for the company for months, and it has taken special precautions to do so.

It said it has created cutting-edge technology that can detect and prevent misuse before it occurs.

It added that it has also made it easier for users to control their experience or seek assistance.

Meta said it frequently collaborates with specialists to firm up its security strategy and stay ahead of threats that could endanger the well-being of its users.

It claimed it also reports all apparent instances of child exploitation on its platforms, from anywhere in the world, to the National Center for Missing and Exploited Children (NCMEC).

Then, NCMEC coordinates with the International Center for Missing and Exploited Children and law enforcement authorities worldwide.

Meta said it also uses PhotoDNA and other technology across all of its apps to detect, remove, and prevent the sharing of images and videos that exploit children.
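The general pattern behind tools like PhotoDNA is hash matching: an uploaded file is hashed and compared against a database of hashes of known abusive material. PhotoDNA itself is a proprietary perceptual hash that tolerates resizing and minor edits; the sketch below uses plain SHA-256 purely to illustrate the matching workflow, with a hypothetical hash list, not the real algorithm or Meta's actual pipeline.

```python
import hashlib

# Hypothetical database of hashes of known abusive images.
# (SHA-256 of an empty file is included only so the example is testable.)
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def should_block(image_bytes: bytes) -> bool:
    """Return True if the upload's hash matches a known-bad entry."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# A matched upload would be blocked and reported rather than shared.
print(should_block(b""))     # matches the sample entry above
print(should_block(b"cat"))  # no match
```

A perceptual hash differs from SHA-256 in that visually similar images produce similar (not identical) hashes, so real systems compare hash distance against a threshold rather than testing exact set membership.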

It also uses technology and behavioral signals to detect and prevent grooming and potentially inappropriate interactions between minors and adults.


