Facebook whistleblower Frances Haugen testified before the US Senate Committee on Commerce, Science, and Transportation on Tuesday, October 5. The hearing was part of the same case in which Facebook was being grilled for hiding how its products and services were harming kids. On Sunday, October 3, Haugen revealed that she was the one who had released the controversial documents to the Wall Street Journal. Those documents contained proof that the tech giant had buried crucial internal research concerning Instagram’s impact on teenage users and their mental health. Notably, the research was conducted in 2019 and Facebook never intended to share it publicly, but Haugen had other plans.
The whistleblower is a former Facebook employee who has also worked at companies such as Google and Pinterest as a product manager. At Facebook, she reportedly worked on issues related to misinformation, democracy, and counter-espionage. While acknowledging the complexity of these issues, she stressed that it was equally important to demand change from the social media company. US lawmakers also accused the absent CEO, Mark Zuckerberg, of neglecting user safety in pursuit of higher profits and demanded that regulators investigate.
Facebook Whistleblower Explains the Problem in Detail
Throughout the hearing, Haugen painted a dreadful picture of Facebook’s systems and policies. US lawmakers have held numerous sessions on the operations of big tech, but they are rarely this insightful and substantive. Haugen said the decisions being made inside Facebook’s offices were harmful to children, safety, and democracy. She had seen young kids engage in relentlessly hateful behavior online, which made her express concern for future generations.
This time was different for Facebook because the witness was also an expert who offered workable suggestions for improving a tech company. Moreover, her testimony echoed familiar concerns from researchers and activists who have frequently raised their voices against how Facebook enables harmful content and then encourages engagement with such posts.
She discussed many actionable points, but Facebook’s controversial algorithm remained the highlight of the whole session.
An Algorithm with Side Effects
The Facebook whistleblower repeatedly criticized the tech giant’s algorithm as dangerous, even though on paper it was supposed to generate Meaningful Social Interactions (MSIs). In 2018, the company made sweeping changes to the News Feed algorithm, explaining that the new version would prioritize engagement between close connections and posts that sparked conversations. It did so by dialing down publisher content, i.e., news, videos, and other guilty-pleasure material that had driven extraordinary amounts of time spent on the app. Zuckerberg said publisher content was being consumed passively rather than discussed actively, so the company had to make the change.
However, the leaked confidential files showed that Facebook’s own data scientists had raised concerns that the new algorithm compromised important categories of publisher content, such as politics and news. Moreover, Facebook has been using an engagement-based ranking system, in which AI (artificial intelligence) surfaces whatever content is gaining traction. This means that no matter how much misinformation, toxicity, or violence a post contains, if it generates a strong reaction from the public, it is prioritized in the News Feed. The whistleblower suggested that chronological ranking could be a proper way to mitigate such issues.
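The contrast Haugen drew between engagement-based and chronological ranking can be illustrated with a minimal sketch. The weights, post data, and scoring function below are hypothetical, chosen only to show the mechanism, not Facebook’s actual system:

```python
# Illustrative sketch (not Facebook's real code): engagement-based
# ranking vs. chronological ranking of a tiny sample feed.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int   # seconds since some epoch; larger = newer
    likes: int
    comments: int
    reshares: int

def engagement_score(post: Post) -> int:
    # Hypothetical weights: reactions that "spark conversation"
    # (comments, reshares) count far more than passive likes.
    return post.likes + 5 * post.comments + 10 * post.reshares

def rank_by_engagement(posts):
    # Whatever provokes the strongest reaction rises to the top,
    # regardless of accuracy or toxicity.
    return sorted(posts, key=engagement_score, reverse=True)

def rank_chronologically(posts):
    # Newest first, with no amplification of viral content.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

feed = [
    Post("news_outlet", timestamp=300, likes=40, comments=2, reshares=1),
    Post("viral_rant", timestamp=100, likes=20, comments=50, reshares=30),
    Post("friend", timestamp=200, likes=5, comments=3, reshares=0),
]

print([p.author for p in rank_by_engagement(feed)])
print([p.author for p in rank_chronologically(feed)])
```

Under the engagement weights, the inflammatory post jumps to the top of the feed even though it is the oldest item; under chronological ranking it sinks to the bottom. This is the dynamic Haugen argued made engagement-based ranking dangerous.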
Haugen described Facebook’s approach as a “false choice” because the algorithm worked perfectly for the company’s growth while sacrificing user safety. As an example, she recalled the 2020 US elections, when Facebook added safeguards to prevent misinformation, only to revert to its default settings after the attack on the US Capitol. Haugen found this deeply problematic, as Facebook was being led by metrics rather than people. That is why she argued for oversight from researchers, academics, and government agencies to help improve Facebook’s experience. She further implied that with such regulations and restrictions in place, Facebook could become even more profitable, since it would be less toxic and fewer people would quit it.
The Level of Government Oversight
As a thought experiment, Congress asked Haugen what she would do if she were in Zuckerberg’s place. She replied that she would establish policies on sharing information with oversight bodies, including Congress; work closely with academics to ensure they had sufficient information to conduct research; and keep up the interventions that were introduced at the time of the elections. Moreover, she suggested that users be required to click on a link themselves before being able to share it, since other social media companies have reportedly reduced misinformation with this tactic.
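Her click-before-share suggestion amounts to a simple gate in the sharing flow: the platform records whether a user has actually opened a link and only then permits resharing. A minimal sketch, with hypothetical names rather than any platform’s real API:

```python
# Illustrative sketch (hypothetical design, not a real platform API):
# a gate that only allows sharing a link after the user has opened it.
class ShareGate:
    def __init__(self):
        self._clicked = set()   # (user, url) pairs the user has opened

    def record_click(self, user: str, url: str) -> None:
        # Called when the user actually opens the link.
        self._clicked.add((user, url))

    def can_share(self, user: str, url: str) -> bool:
        # Sharing is permitted only after a click-through.
        return (user, url) in self._clicked

gate = ShareGate()
url = "https://example.com/article"
print(gate.can_share("alice", url))   # blocked: alice has not opened it
gate.record_click("alice", url)
print(gate.can_share("alice", url))   # allowed after clicking through
```

The friction is deliberately small, but it forces at least one exposure to the content before it can propagate further.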
She further highlighted that Facebook’s current structure cannot prevent the spread of vaccine misinformation because of its dependence on AI systems, which reportedly catch only 10–20% of the questionable content. She followed up by encouraging reform of Section 230 of the US Communications Decency Act of 1996, which absolves big tech companies from being held accountable for whatever their users post on their platforms.
This shows that the government needs to fix its own policies before fixing anyone else’s. Haugen suggested that this particular section should no longer cover algorithmic decisions, so that tech companies can face legal action whenever their algorithms hurt people. She explained that a tech company may not be able to control user-generated content, but it has full control over its algorithms. Accordingly, Haugen recommended that Facebook be held accountable for choosing to amplify problematic content that prioritizes profit, reactions, and virality over public safety.
The Senate asked the Facebook whistleblower what impact changing the algorithm to promote safety would have on Facebook’s core business. She answered that when users find engaging or enraging posts on social media, they tend to spend more time on the site than usual, resulting in more revenue from ad impressions. However, Haugen maintained that the platform could still make plenty of money by following the steps she outlined in her testimony.
Social Interactions Leading to International Crimes
The documents Haugen leaked also included evidence that Facebook’s response to its platform being used for violent crime abroad was inadequate. There are numerous stories in The Wall Street Journal’s Facebook Files about employees flagging such instances. One story described an armed group in Ethiopia that used Facebook to coordinate violent attacks against ethnic minorities.
This again showed how heavily Facebook’s moderation depends on AI to do a human’s job. For an AI to catch this sort of content, it would need to work in every language and dialect used by the platform’s nearly 3 billion monthly active users. According to reports, Facebook’s AI does not cover all the languages used on the site. As the Facebook whistleblower noted, although only 9% of Facebook’s users are English speakers, 87% of the company’s spending on fighting misinformation is directed at English-language content.
She also warned that Facebook’s persistent understaffing of its information operations, counter-terrorism, and counter-espionage teams was a security threat. She is continuing to discuss this topic with Congress in a separate capacity.
Appropriate Congressional Action
The members of the Senate committee appeared moved by Haugen’s testimony and indicated their motivation to take action against Facebook, which is already entangled in an antitrust lawsuit.
The hearing also touched on the idea of breaking up Facebook, which could take WhatsApp and Instagram out of Mark Zuckerberg’s control. However, the Facebook whistleblower expressed concern over the matter and argued against a breakup. She explained that if the machine of Facebook, WhatsApp, and Instagram were broken apart, most advertising revenue would flow to Instagram, while the other apps would be left to continue endangering lives because there would not be enough money to fund them.
Critics, on the other hand, argue that as long as these apps remain linked, incidents like the six-hour Facebook outage of October 4 will continue to take the other apps down as well. That incident clearly showed the disadvantages of one company holding so much power over multiple apps, especially WhatsApp, which has become a vital mode of communication for the masses.
Meanwhile, lawmakers are drafting legislation to promote children’s safety on social media platforms. Democratic Senators Ed Markey and Richard Blumenthal announced that they would reintroduce last year’s Kids Internet Design and Safety (KIDS) Act, which includes new protections for online users under 16. Republican Senator John Thune also brought up the Filter Bubble Transparency Act, a bipartisan bill from 2019 that would reportedly increase transparency on social media by letting users choose to view content not curated by any algorithm.