Frances Haugen said research shows that Facebook magnifies hate and misinformation, increases polarisation, and that Instagram can harm teenage girls' mental health.

Image: Frances Haugen during her interview with '60 Minutes'
Facebook | Tuesday, October 05, 2021 - 13:24

Things are not looking good for Facebook, which also owns platforms such as Instagram and WhatsApp, after a whistleblower revealed that the company was aware of the negative effects of its products and decisions. The whistleblower, the source for The Wall Street Journal’s series of stories, identified herself in an interview with ‘60 Minutes’ on Sunday, October 3.

Facebook, over and over again, has shown it chooses profit over safety, she said. Haugen, who will testify before the US Congress this week, said in the interview that she hopes her coming forward will prompt the government to put regulations in place to govern the company's activities. She was formerly a lead product manager at Facebook, and said that when she was being hired, she asked to work on issues of misinformation.

She had anonymously filed complaints with US federal law enforcement stating that Facebook’s own research shows how the platform magnifies hate and misinformation and increases polarisation, and that Instagram, specifically, can harm the mental health of teenage girls.

She also said she believes Facebook’s reports on its regulation of COVID-19 misinformation and hate speech are not fully transparent. Here are the five key takeaways from the whistleblower’s revelations:

The algorithm: In the interview, she explained how Facebook’s algorithm picks and chooses content that is likely to generate the most engagement and provoke anger. Such content is not necessarily factual, which means the company is repeatedly optimising for its own interests, Haugen said. This began after the platform tweaked its algorithm in 2018 to prioritise engagement, which it called “meaningful social interactions.”

“Its own research is showing that content that is hateful, that is divisive, that is polarising — it is easier to inspire people to anger than it is to other emotions,” Haugen said, adding, “Misinformation, angry content, is enticing to people and keeps them on the platform.”

Good for public vs good for Facebook: She said there were conflicts of interest between what was good for the public and what was good for Facebook. “And Facebook, over and over again, chose to optimize for its own interests, like making more money,” she said. The company has realised, she said, that if the algorithm is changed to make the platform safer, people will spend less time on the site, fewer people will click on advertisements, and the site will make less money.

She stressed that the version of Facebook that exists is tearing societies apart. 

“When we live in an information environment that is full of angry hateful polarising content, it erodes our civic trust, our faith in each other, erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence.” 

Prioritising profit over safety: She added that the company does not offer the same safety systems in all the languages or countries the platform is available in, because doing so would not make money.

Facebook makes different amounts of money in every country in the world, she said, and every time Facebook expands to a new linguistic area, “it costs just as much, if not more, to make the safety systems for that language as it did to make English or French.”

“Because each new language costs more money but there are fewer and fewer customers. And so, the economics just doesn't make sense for Facebook to be safe in a lot of these parts of the world,” she said. 

Instagram: Haugen also spoke about what Facebook’s own research showed about Instagram’s impact on teenage girls: how the platform made them feel worse about their bodies, worsened thoughts of suicide, and aggravated eating disorders.

“Facebook’s own research says that as these women begin to consume this eating disorder content, it makes them more and more depressed and it makes them use the app more. So they end up in this feedback cycle where they hate their bodies more and more. Facebook’s own research shows that Instagram is not only dangerous for teenagers, it harms teenagers; it is distinctly worse than other forms of social media,” she said.

This was, in fact, revealed earlier by The Wall Street Journal, and the research it published even contained a small nugget of information pertaining to India.

While appearance-based comparison is worse among women in most countries, in India it is higher among men. The research suggested this is because women comprise a smaller proportion of the paid workforce, so men may experience such comparison in areas such as professional and social life, whereas many women do not have these experiences.

“It's surprising that these results hold for appearance-based comparison specifically (not just comparison of achievements or status), especially since societal expectations for appearance are stricter for women than men in India,” the research showed.

Safeguards: Haugen was part of a team at Facebook called Civic Integrity, which was dissolved after the 2020 US Presidential Election. She alleged that certain safeguards that had been put in place to reduce misinformation were later turned off. “I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous,” she said.
