Whistleblower warns MPs that ‘Facebook is closing the door’ on opportunity to regulate

Written by Eleanor Langford on 27 October 2021 in News

Frances Haugen gives evidence to inform development of Online Harms legislation

Facebook whistleblower Frances Haugen has warned that events like the attack on the US Capitol and violence between ethnic groups in Ethiopia are just “opening chapters” to worse events if social media companies are left unchecked.

Giving evidence to a parliamentary committee advising on the development of the government’s Online Harms Bill, Haugen said the Facebook algorithm “prioritises and amplifies divisive, polarising content”.

She also said she "wouldn't be surprised" if social media platforms were "under-enforcing" against harmful content in the UK, because Facebook's detection systems were designed for American English rather than British English.

The 37-year-old rose to prominence after she provided thousands of documents to the US Securities and Exchange Commission which allegedly detail the company’s failure to prevent harmful content on its platform. Testifying before the US Congress earlier this month, she accused Facebook of pursuing profit over safety and warned of the dangers the company posed to children and wider society.

Asked this week if MPs should consider the societal harm of social media as well as individual harm when drafting legislation, she said: “I think it is a grave danger to democracy and societies around the world to omit societal harm. A core part of why I came forward was I looked at the consequences of choices Facebook was making, and I looked at things like the Global South. I believe situations like Ethiopia are just part of the opening chapters of a novel that is going to be horrific to read. We have to care about societal harm, not just for the Global South but our own societies.


“When an oil spill happens it doesn’t make it harder for us to regulate oil companies. But right now, Facebook is closing the door on us being able to act.”

Facebook has been accused of stoking ethnic violence in both Ethiopia and Myanmar by allowing content which breaches its guidelines to be promoted on its platform. 

It has also faced claims that it contributed to the January 6 riots at the Capitol building in Washington DC. 

Haugen noted that the algorithmic favouring of extreme views meant that people were more likely to see hateful content in their feeds. 

She also warned that large Facebook groups built new “societal norms” and that the platform’s algorithm, which often prioritised content from groups, pushed people “towards extreme interests”.

“Now you see a normalisation of hate, the normalisation of dehumanising others, and that's what leads to violence,” she said. 

She suggested that such problems could be much worse in the UK, as Facebook’s algorithms could struggle to pick up harmful content in Britain as a result of local language nuances.

“UK-English is sufficiently different that I would be unsurprised if the safety systems that they developed, primarily for American-English, would be under-enforced in the UK”, Haugen said.

The whistleblower also raised issues around individual harm caused by social media platforms, particularly among young people. She warned that Instagram, which is also owned by Facebook, is especially harmful to young people due to its focus on “social comparison”. 

She accused the social media giant of not doing enough to tackle underage users lying about their age to access the platform, claiming it "knows young users are the future of the platform" and it is easier to get them "hooked" younger.

She pointed to internal Facebook research which found that, “for some cohorts”, 10-15% of 10-year-olds were using the platform, despite the minimum age for users being 13. She said Facebook’s AI could estimate users’ real ages from a number of factors, including their friends and images, even when those users had claimed to be over 13.

“I'm extremely worried about the developmental impacts of Instagram on children," Haugen continued. "Kids are learning that the people that care about them treat them poorly. Imagine what the domestic relationships will be like for those kids when they’re 30 if they learned that people who care about them are mean.”


About the author

Eleanor Langford is a lead curation editor for PublicTechnology sister publication PoliticsHome, where a version of this story first appeared. She tweets as @eleanormia.

