Facebook after the whistleblower: can Zuckerberg reboot the social network?
The company is facing criticism from its own staff that its ‘growth at all costs’ culture is damaging individuals and society

In Facebook’s early days, Mark Zuckerberg ended weekly meetings by raising his fist and shouting “domination”.

On a call with investors on Monday, the boyish social media titan struck a similarly defiant tone, promising that Facebook would throw its weight behind efforts to lure younger users back to the platform after their numbers had dwindled. He pledged the company would build the “successor to the mobile internet”, an avatar-filled virtual world known as the metaverse.

But the chief executive was also swift to address mounting allegations against his company that it has relentlessly placed “profits over safety” and downplayed its alleged role in the poisoning of democratic society – misleading both investors and the public.

“When we make decisions, we need to balance competing social equities,” Zuckerberg said, citing as his first example balancing free speech – of which he has been a fierce proponent – with reducing harmful content.

“It makes a good sound bite to say that we don’t solve these impossible trade-offs because we’re just focused on making money, but the reality is these questions are not primarily about our business, but about balancing different difficult social values,” he added.

Thanks to one of the biggest leaks in history, Facebook is fighting claims that it has done little to shed its “growth at all costs” culture that turbocharged its rise to capture 3.58 billion users and quarterly sales of more than $29 billion. It is an image that Zuckerberg has sought to overturn with billion-dollar investments in moderation, safety and what his company calls “integrity” work.

But Frances Haugen, an employee on the Facebook integrity team until May 2021, argues that the company’s commitment to the cause is insincere. To prove it, she leaked tens of thousands of internal documents – including many from employee discussion sites, company presentations and research papers – that have unveiled the inner workings of Facebook. She has also filed eight complaints against the company with US securities regulators. Facebook, in turn, has sought to portray Haugen as a junior employee cherry-picking to fit her own narrative, with little knowledge of some of the issues on which she has taken a view.


The reality is that the documents show Facebook is painfully aware of the harm the platform and its algorithms can cause: exacerbating the poor mental health of teenagers, accelerating polarisation in countries where the political landscape is fragile, fuelling misinformation and conspiracy theories.

In many cases, Facebook researchers are actively wrangling with ways to solve these thorny issues. When these good-faith efforts fall short, it is often because they are stymied, sometimes by top-down pressure but also by the technical and bureaucratic challenges that come with managing a sprawling $915 billion company. Although its share price remains resilient, the leak has also shown where Facebook’s future problems could lie, as global regulators circle the company and its Big Tech counterparts.

Facebook has “huge numbers of people working on analysing and fixing” its content problems, says Benedict Evans, an independent technology analyst. “But trade-offs, [organisational] structure, conflicting priorities, language, tech limitations, politics and massive growth means lots of that work is broken.”

Losing the youth

The latest firestorm of Facebook criticism has built up in several stages, part of a slick campaign by Haugen and a team of legal and press relations professionals supporting her. Copies of the internal documents were disclosed to US regulators and provided to Congress in redacted form by Haugen’s legal counsel. The Wall Street Journal was the first media group to receive and report on the documents.

Haugen (37) then went on 60 Minutes, the US TV news programme, in early October to reveal herself as the source. Two days later, she testified before Congress. Now, a consortium of news organisations, including the Financial Times, has obtained the redacted versions of the documents received by Congress, prompting fresh coverage.

They reveal Facebook to be acutely aware that it is considered by younger generations to be desperately uncool. It’s not a new phenomenon but has become more pronounced in recent years. According to a March 2021 document, daily user numbers in the US on Facebook for teenagers and young adults – aged 18 to 29 – are in decline and projected to fall by 4 and 45 per cent respectively in the next two years.


“Young adults perceive [Facebook] content as boring, misleading and negative,” one November 2020 research presentation reads, citing data from multiple qualitative and quantitative studies. As a network, it is considered “outdated”, and time spent on it “unproductive”.

Even Instagram, the photo app it bought in 2012 for $1 billion that has until now been a magnet for young people, shows “worrying” trends in the consumption and production of content by users, other documents from 2021 reveal. This is blamed partly on the dizzying rise of Chinese-owned rival TikTok during the pandemic and, experts say, does not bode well for the company.

“I can’t think of a social platform which has begun a sustained decline in terms of users that has then been able to recover from that,” says Andrew Lipsman, principal ecommerce analyst at Insider Intelligence. “[Though] the trends can take time to bear out.”

On Monday, Zuckerberg scrambled to dispel allegations that Facebook was hiding such challenges from investors by announcing that the company would be “retooling” its teams “to make serving young adults the north star, rather than optimising for the larger number of older people”.

It is against this tense backdrop that Facebook has been making many of the decisions outlined in the documents. Among them, the company has embraced controversial efforts to build a version of Instagram for under-13s, Instagram for Kids. This, the documents reveal, is despite internal research showing a complex impact on the mental health of young people, with some affected detrimentally but others benefiting from social media use.

Facebook says the push to attract those aged under 13 is an attempt to offer parents extra controls when their children will probably be on the internet anyway. But two former staffers, speaking on condition of anonymity, dismiss this suggestion.


The efforts are instead designed to get young people hooked on the platform early, one former employee says. “[This remains] an internal culture where product managers are motivated – and compensated – to show impact by driving engagement and user growth,” the person adds.

‘Embarrassing to work here’

Brian Boland, the former vice-president of partnerships marketing at the company, agrees that Facebook cares about safety “to a point” but “errs on the side of growth”. The company is also, he says, the clearest example of the issues thrown up by the new frontier of surveillance capitalism – the commodification of a person’s online data for profitmaking purposes.

These issues do not all appear to be of Facebook’s own deliberate making, but rather an unpredictable byproduct of dizzying growth, some documents show. But the fundamentals of how the site operates – engaging users with “likes” and “shares” – have not escaped internal scrutiny, prompting several soul-searching comments among employees.

One Facebook worker wrote in response to an August 2019 research paper that the company had “compelling evidence” that its “core product mechanics” such as recommending groups or people to users and optimising for engagement were a significant part of why hate speech and other unwanted material was able to “flourish” on the platform.

There is evidence of some efforts to measure and mitigate the issue. A document presented to Zuckerberg in February 2020 outlined Project Daisy, an initiative to hide “likes” and other metrics from users to see if removing the measure of popularity would make users feel better about using the platform.

The effects of Project Daisy on wellbeing were negligible, an internal study subsequently found. It did, however, have a much more marked impact on advertising – driving down performance. An option, introduced after the Project Daisy discussions, to hide likes can now be found in Instagram. Another project, codenamed Drebbel, sought to monitor the spiralling effects of so-called “rabbit holes”, where users are directed towards harmful material by the site’s recommendation algorithms.

Among Haugen’s most powerful accusations against Facebook are that the company has not just ignored but knowingly fanned the flames of violence and misinformation across the world, and especially outside the English-speaking world. Internal documents show a crippling lack of in-country moderation and of local linguistic support for widely spoken languages such as Arabic, a failure that has helped compound horrific real-world harms, from ethnic cleansing to sex trafficking and religious rioting, in places such as Dubai, Ethiopia, India and Myanmar.

In her testimony to members of the UK parliament on Monday, Haugen said the consequences of Facebook’s choices in the so-called Global South were a “core part” of why she came forward, describing the violent ethnic conflict amplified by Facebook in Ethiopia as the “opening chapter of a novel that is going to be horrific to read”.

Frances Haugen giving evidence to the UK joint committee for the Draft Online Safety Bill, as part of government plans for social media regulation. Photograph: House of Commons/PA Wire

Other documents show that Facebook has become a Petri dish for co-ordinated extremist groups all over the world. Sophie Zhang is a former data scientist on Facebook’s “fake engagement” team, which was created to identify and shut down inauthentic activity. She blew the whistle on the company’s inertia in tackling political manipulation, and says she began uncovering evidence of manipulative activity on the platform in September 2018 across multiple non-US markets.

“Essentially I thought when I handed [the evidence] over, they would prosecute it. I didn’t expect I would have to do it myself.”