Facebook on Tuesday fired back after a series of withering Wall Street Journal reports alleging the company failed to keep users safe, with the social media giant pointing to increased staff and spending devoted to battling abuses.
The company has been under relentless pressure to guard against becoming a platform where misinformation and hate can spread, while at the same time remaining a forum where people can speak freely. It has struggled to respond.
A series of recent Wall Street Journal reports said the company knew its Instagram photo-sharing app was hurting teenage girls' mental health, and that its moderation system applied a double standard allowing VIPs to skirt its rules.
One of the articles, citing Facebook’s own research, said a 2018 change to its software ended up promoting political outrage and division.
But Facebook said Tuesday it has spent more than $13 billion in the past five years on teams and technology devoted to fighting abuses.
Some 40,000 people now work on safety and security at the California-based tech giant, four times as many as in 2016, according to Facebook.
“How technology companies grapple with complex issues is being heavily scrutinized, and often, without important context,” Facebook contended in a blog post.
The social network launched a website, about.facebook.com/progress, to showcase the work it has done to counter abuses.
Facebook’s Nick Clegg also attacked the reporting in a blog post on Saturday, saying the articles were unfair.
“At the heart of this series is an allegation that is just plain false: that Facebook conducts research and then systematically and willfully ignores it if the findings are inconvenient for the company,” he wrote.
The Journal stories cited, in part, studies commissioned by the company which contained disturbing revelations like: “We make body image issues worse for one in three teen girls.”
Clegg said the stories selectively employed quotes in a way that offered a deliberately lopsided view of the company’s work.
“We will continue to ask ourselves the hard questions. And we will continue to improve our products and services as a result,” he said in the closing lines of his post.
Facebook recently launched an effort targeting users working together on the platform to promote real-world violence or conspiracy theories, beginning by taking down a German network spreading Covid misinformation.
The new tool is meant to detect organized, malicious efforts that pose a threat but fall short of violating the social media giant’s existing rules against hate groups, said Facebook’s head of security policy Nathaniel Gleicher.