Former Facebook employee Frances Haugen gives evidence to the Joint Committee on the Draft Online Safety Bill of UK Parliament that is examining plans to regulate social media companies, in London. (Photo: UK Parliament 2021/Annabel Moeller/Handout via REUTERS)

Facebook Inc will fuel more violent unrest around the world unless it stops its algorithms pushing extreme and divisive content, whistleblower Frances Haugen told the British parliament on Monday.

The former employee, who accused the social media giant of putting profit before people at a Senate subcommittee earlier this month, said she was encouraged by British plans to force big tech companies to tackle harmful content on their platforms.


Facebook, Haugen said, saw online safety as a cost and the company lionised a startup culture where cutting corners was good. "Unquestionably it is making hate worse," she said.

With a focus on the United States, the company was wilfully blind to its impact in many markets where a lack of local-language staff meant it often failed to understand the toxic or dangerous nature of messages on its platform, she said.

The world's biggest social network has rejected the charges, with CEO Mark Zuckerberg (https://www.reuters.com/article/usa-congress-facebook-idCNL1N2R10Y1) saying earlier this month that it was deeply illogical to argue that Facebook deliberately pushed content that made people angry.

"Contrary to what was discussed at the hearing, we've always had the commercial incentive to remove harmful content from our sites. People don't want to see it when they use our apps and advertisers don't want their ads next to it," Facebook said in a statement on Monday.

It said it had spent $13 billion on keeping users safe and agreed regulation was needed across the industry, adding that it was pleased Britain was moving ahead with online safety laws.

Facebook, which also owns Instagram and WhatsApp, has been accused by U.S. lawmakers of chasing higher profits while being cavalier about user safety.

Britain is bringing forward laws that could fine social media companies up to 10% of their turnover if they fail to remove or limit the spread of illegal content.

"The events we're seeing around the world, things like Myanmar and Ethiopia, those are the opening chapters because engagement-based ranking does two things: one, it prioritises and amplifies divisive and polarising extreme content and two it concentrates it," Haugen said.

Haugen in October told a Senate Commerce subcommittee hearing that Facebook had devised ways to keep users scrolling even if it was detrimental to their wellbeing.

She also said she provided the documents used in a Wall Street Journal investigation and a Senate hearing on Instagram's harm to teenage girls. She compared the platform to addictive substances such as tobacco and opioids.

Facebook operates in more than 190 countries and boasts more than 2.8 billion monthly users.

BRITISH INTERIOR MINISTER SEEKS TOUGHER LAWS

Before Monday's hearing, Haugen met the country's interior minister, Priti Patel, who advocates tougher legislation for tech platforms that fail to keep users safe.

She is scheduled to speak next week at the Web Summit, a major tech conference, and to European policymakers in Brussels.

"Facebook has been unwilling to accept even little slivers of profit being sacrificed for safety, and that's not acceptable," she said on Monday, singling out Instagram's impact on the mental health of some young users.

Reuters, along with other news organisations, viewed documents released to the U.S. Securities and Exchange Commission and Congress by Haugen.

They showed Facebook (https://www.reuters.com/technology/facebook-knew-about-failed-police-abusive-content-globally-documents-2021-10-25) had known it had not hired enough workers who possessed both the language skills and knowledge of local events needed to identify objectionable posts from users in a number of developing countries.