Facebook whistleblower Frances Haugen to testify to Congress today

Facebook whistleblower Frances Haugen will tell Congress today that the tech giant’s products ‘harm children, stoke division and weaken our democracy’ because profits are put ahead of people

  • Haugen will testify that social media can lead to violence and needs regulating
  • She will tell Congress Facebook executives regularly chose profits over safety
  • Facebook maintains that the whistleblower’s allegations are misleading 

Facebook whistleblower Frances Haugen will urge Congress today to regulate social media, saying the sites are a threat to children and democracy and even lead to violence.

The former employee, who worked in the tech giant’s misinformation department, will tell a Senate Commerce subcommittee today that Facebook’s bosses ‘put their immense profits before people’.

After recent Wall Street Journal reports based on documents she leaked to the newspaper raised a public outcry, Haugen revealed her identity in a CBS ‘60 Minutes’ interview that aired Sunday night.

She claims Facebook had a role in the January 6 Capitol riots and is damaging for teenagers, particularly young girls.

She will say: ‘When we realized tobacco companies were hiding the harms it caused, the government took action. When we figured out cars were safer with seatbelts, the government took action. I implore you to do the same here.’ 

The ex-employee challenging the social network giant with 2.8 billion users worldwide and nearly $1 trillion in market value is a 37-year-old data expert from Iowa with a degree in computer engineering and a master’s degree in business from Harvard. 


Before being recruited by Facebook in 2019, she worked for 15 years at companies including Google and Pinterest.

‘The company’s leadership knows ways to make Facebook and Instagram safer and won’t make the necessary changes because they have put their immense profits before people. Congressional action is needed,’ she will say. 

‘As long as Facebook is operating in the dark, it is accountable to no one. And it will continue to make choices that go against the common good.’

Senator Amy Klobuchar, who is on the subcommittee, said that she would ask Haugen about the Jan. 6 attack on the U.S. Capitol by supporters of then-President Donald Trump.

‘I am also particularly interested in hearing from her about whether she thinks Facebook did enough to warn law enforcement and the public about January 6th and whether Facebook removed election misinformation safeguards because it was costing the company financially,’ Klobuchar said in an emailed comment.

The senator also said that she wanted to discuss Facebook’s algorithms, and whether they ‘promote harmful and divisive content.’

Haugen, who worked as a product manager on Facebook’s civic misinformation team, was the whistleblower who provided documents used in a Wall Street Journal investigation and a Senate hearing on Instagram’s harm to teen girls.


The panel is examining Facebook’s use of information from its own researchers on Instagram that could indicate potential harm for some of its young users, especially girls, while it publicly downplayed the negative impacts. 

For some of the teens devoted to Facebook’s popular photo-sharing platform, the peer pressure generated by the visually focused Instagram led to mental health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research leaked by Haugen showed.

One internal study cited 13.5% of teen girls saying Instagram makes thoughts of suicide worse and 17% of teen girls saying it makes eating disorders worse.

Facebook owns Instagram as well as WhatsApp.

The company did not respond to a request for comment.

Haugen added that ‘Facebook’s closed design means it has no oversight even from its own Oversight Board, which is as blind as the public.’

That makes it impossible for regulators to serve as a check, she added.

‘This inability to see into the actual systems of Facebook and confirm that Facebook’s systems work like they say is like the Department of Transportation regulating cars by watching them drive down the highway,’ her testimony says. ‘Imagine if no regulator could ride in a car, pump up its wheels, crash test a car, or even know that seat belts could exist.’

The Journal’s stories, based on Facebook internal presentations and emails, showed the company contributed to increased polarization online when it made changes to its content algorithm; failed to take steps to reduce vaccine hesitancy; and was aware that Instagram harmed the mental health of teenage girls.

Haugen said Facebook had done too little to prevent its platform from being used by people planning violence.


‘The result has been a system that amplifies division, extremism, and polarization – and undermines societies around the world,’ she said. ‘In some cases, this dangerous online talk has led to actual violence that harms and even kills people.’

Facebook was used by people planning mass killings in Myanmar and in the Jan. 6 assault by Trump supporters who were determined to overturn the 2020 election results.

After the November election, Facebook dissolved the civic integrity unit where Haugen had been working. That, she said, was the moment she realized ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.’

Haugen says she told Facebook executives when they recruited her that she had asked to work in an area of the company that fights misinformation, because she had lost a friend to online conspiracy theories.

Antigone Davis, Facebook’s head of global safety, faced a barrage of criticism from senators on the Commerce panel at a hearing last Thursday. They accused Facebook of concealing the negative findings about Instagram and demanded a commitment from the company to make changes.

Davis defended Instagram’s efforts to protect young people using its platform. She disputed The Wall Street Journal’s characterization of what the research shows.

Facebook maintains that Haugen’s allegations are misleading and insists there is no evidence to support the premise that it is the primary cause of social polarization.

‘Even with the most sophisticated technology, which I believe we deploy, even with the tens of thousands of people that we employ to try and maintain safety and integrity on our platform, we’re never going to be absolutely on top of this 100% of the time,’ Nick Clegg, Facebook’s vice president of policy and public affairs, said Sunday on CNN’s ‘Reliable Sources.’

That’s because of the ‘instantaneous and spontaneous form of communication’ on Facebook, Clegg said, adding, ‘I think we do more than any reasonable person can expect to.’

Haugen says she hopes that by coming forward she will help spur the government to put regulations in place for Facebook’s activities. Like fellow tech giants Google, Amazon and Apple, Facebook has for years enjoyed minimal regulation in Washington.

Separately Monday, a massive global outage plunged Facebook, Instagram and the company’s WhatsApp messaging platform into chaos, only gradually dissipating by late Monday Eastern time. For some users, WhatsApp was working for a time, then not. For others, Instagram was working but not Facebook, and so on.

Facebook didn’t say what might have caused the outage, which began around 11:40 a.m. EDT and was still not fixed more than six hours later.

Haugen’s statement to Congress in full

Chairman Blumenthal, Ranking Member Blackburn, and Members of the Subcommittee. Thank you for the opportunity to appear before you and for your interest in confronting one of the most urgent threats to the American people, to our children and our country’s well-being, as well as to people and nations across the globe.

My name is Frances Haugen. I used to work at Facebook and joined because I think Facebook has the potential to bring out the best in us. But I am here today because I believe that Facebook’s products harm children, stoke division, weaken our democracy and much more. 

The company’s leadership knows ways to make Facebook and Instagram safer and won’t make the necessary changes because they have put their immense profits before people. Congressional action is needed. They cannot solve this crisis without your help.

I believe that social media has the potential to enrich our lives and our society. We can have social media we enjoy – one that brings out the best in humanity. The internet has enabled people around the world to receive and share information and ideas in ways never conceived of before. And while the internet has the power to connect an increasingly globalized society, without careful and responsible development, the internet can harm as much as it helps.

I have worked as a product manager at large tech companies since 2006, including Google, Pinterest, Yelp and Facebook. My job has largely focused on algorithmic products like Google+ Search and recommendation systems like the one that powers the Facebook News Feed. Working at four major tech companies that operate different types of social networks, I have been able to compare and contrast how each company approaches and deals with different challenges. The choices being made by Facebook’s leadership are a huge problem – for children, for public safety, for democracy – that is why I came forward. And let’s be clear: it doesn’t have to be this way. We are here today because of deliberate choices Facebook has made. 

I joined Facebook in 2019 because someone close to me was radicalized online. I felt compelled to take an active role in creating a better, less toxic Facebook. During my time at Facebook, first working as the lead product manager for Civic Misinformation and later on Counter-Espionage, I saw that Facebook repeatedly encountered conflicts between its own profits and our safety. Facebook consistently resolved those conflicts in favor of its own profits.

The result has been a system that amplifies division, extremism, and polarization – and undermines societies around the world. In some cases, this dangerous online talk has led to actual violence that harms and even kills people. In other cases, their profit-optimizing machine is generating self-harm and self-hate – especially for vulnerable groups, like teenage girls. These problems have been confirmed repeatedly by Facebook’s own internal research.

This is not simply a matter of some social media users being angry or unstable. Facebook became a $1 trillion company by paying for its profits with our safety, including the safety of our children. And that is unacceptable.

I believe what I did was right and necessary for the common good – but I know Facebook has infinite resources, which it could use to destroy me. I came forward because I recognized a frightening truth: almost no one outside of Facebook knows what happens inside Facebook. The company’s leadership keeps vital information from the public, the US government, its shareholders, and governments around the world.

The documents I have provided prove that Facebook has repeatedly misled us about what its own research reveals about the safety of children, its role in spreading hateful and polarizing messages, and so much more. I appreciate the seriousness with which Members of Congress and the Securities and Exchange Commission are approaching these issues. 

The severity of this crisis demands that we break out of previous regulatory frames. Tweaks to outdated privacy protections or changes to Section 230 will not be sufficient. The core of this issue is that no one can understand Facebook’s destructive choices better than Facebook, because only Facebook gets to look under the hood. 

A critical starting point for effective regulation is transparency: full access to data for research not directed by Facebook. On this foundation, we can build sensible rules and standards to address consumer harms, illegal content, data protection, anticompetitive practices, algorithmic systems and more. 

As long as Facebook is operating in the dark, it is accountable to no one. And it will continue to make choices that go against the common good. Our common good. 

When we realized tobacco companies were hiding the harms it caused, the government took action. When we figured out cars were safer with seatbelts, the government took action. And today, the government is taking action against companies that hid evidence on opioids. I implore you to do the same here.

Right now, Facebook chooses what information billions of people see, shaping their perception of reality. Even those who don’t use Facebook are impacted by the radicalization of people who do. A company with control over our deepest thoughts, feelings and behaviors needs real oversight.

But Facebook’s closed design means it has no oversight – even from its own Oversight Board, which is as blind as the public. Only Facebook knows how it personalizes your feed for you. It hides behind walls that keep the eyes of researchers and regulators from understanding the true dynamics of the system.

When the tobacco companies claimed that filtered cigarettes were safer for consumers, it was possible for scientists to independently invalidate that marketing message and confirm that in fact they posed a greater threat to human health. But today we can’t make this kind of independent assessment of Facebook. We have to just trust what Facebook says is true – and they have repeatedly proved they do not deserve our blind faith.

The inability to see into the actual systems of Facebook and confirm that Facebook’s systems work like they say is like the Department of Transportation regulating cars by watching them drive down the highway. Imagine if no regulator could ride in a car, pump up its wheels, crash test a car, or even know that seat belts could exist.

Facebook’s regulators can see some of the problems – but they are kept blind to what is causing them and thus can’t craft specific solutions. They cannot even access the company’s own data on product safety, much less conduct an independent audit. How is the public supposed to assess if Facebook is resolving conflicts of interest in a way that is aligned with the public good if it has no visibility and no context into how Facebook really operates?

This must change.

Facebook wants you to believe that the problems we’re talking about are unsolvable. They want you to believe in false choices. They want you to believe you must choose between connecting with those you love online and your personal privacy. That in order to share fun photos of your kids with old friends, you must also be inundated with misinformation. They want you to believe that this is just part of the deal.

I am here to tell you today that’s not true. These problems are solvable. A safer, more enjoyable social media is possible. But if there is one thing that I hope everyone takes away from these disclosures it is that Facebook chooses profit over safety every day – and without action, this will continue.

Congress can change the rules Facebook plays by and stop the harm it is causing. 

I came forward, at great personal risk, because I believe we still have time to act. But we must act now. 

Thank you. 
