Whistleblower’s account of Facebook rings true in Irish context

To say Facebook has had a bad fortnight would be the understatement of a lifetime.

Were a normal company to go through what the social media behemoth has endured in the US since mid-September, it would probably spell curtains for it as a going concern.

That isn’t the case for a company valued this week at an eye-watering €818bn — its sheer size, geographic spread and 2.9bn-strong user base will see to that.

Nevertheless, Facebook is at present under pressure like never before: a tidal wave of scandal in recent years culminated this week in testimony before the US Senate by former employee turned whistleblower Frances Haugen.

The problems which have caused that pressure — charges that the company doesn’t do enough to curb its use as a platform for hate speech, extremism, and misinformation while ignoring its own internal research in the pursuit of profit — are global in scale. 

Ireland itself is deeply affected.

In Ireland, Facebook has contributed to the elevation of Covid conspiracy theorists and far-right propaganda.

Ireland: Facebook’s host and regulator

And Ireland’s dual status as both host to Facebook’s European control centre and also the much-maligned social network’s regulator in chief for the EU means there is extra interest in what’s happening at the very top. 

In the bricks and mortar Irish context, Facebook means jobs, taxes, and a well educated young workforce from all over the world contributing to our economy.

But online, its impact is far darker, and deeper, as evidenced by the events of the past week.  

The company’s inner workings were dragged into the spotlight by the testimony of 37-year-old Frances Haugen, until recently a senior Facebook insider, on the alleged depths of the company’s failings.

There have been Facebook whistleblowers before, most notably around the Cambridge Analytica scandal, which suggested mass voter manipulation in the 2016 US presidential election that saw Donald Trump elected to the White House.

Ms Haugen’s testimony though seemed, if anything, even more seismic. With her concerns placed on the record of the American legislature, some sort of response from Facebook and its all-controlling founder Mark Zuckerberg would seem inevitable.

During her appearance she underlined a number of points first brought to public attention via a series of systematic leaks she made to The Wall Street Journal (WSJ), published on a staggered basis in recent weeks.

Former Facebook employee Frances Haugen testified to a Senate subcommittee about the harm the company's products were doing to children. Picture: Alex Brandon/AP

She suggested that Facebook systematically ignored its own research suggesting the harms its platforms were inflicting in order to prioritise growth and profit, and that its celebrity and politician users are treated with more deference than the average account-holder.

Her leaks showed that Facebook is facing an enormous lawsuit taken by its own shareholders. They allege, among other things, that the €4.3bn the company paid to the US Federal Trade Commission on foot of the Cambridge Analytica scandal (in which a standard Facebook quiz was used to gain access to the data of 87m users to facilitate targeted political advertising) was set that high in order to protect Mr Zuckerberg from personal liability.

She said that the lack of transparency around Facebook’s algorithms, the unknowable decisions taken by artificial intelligence which dictate the content that users are shown, makes the company impossible to regulate.

Probably the most concerning allegations of all, certainly politically, are twofold: that Facebook’s own internal research into its photo-sharing subsidiary Instagram showed the app negatively impacts the mental health of teenage users, information Facebook chose not to make public; and that the company specifically targets under-13s, despite it not even being legal for that age group to hold an account.

“There were conflicts of interest between what was good for the public and what was good for Facebook,” Ms Haugen told CBS’s 60 Minutes programme last week.

“Facebook, over and over again, chose to optimise for its own interests, like making more money.”

Her allegations have terrifying implications, and Facebook would now appear to be at something of a crossroads, though if it recognises that fact, it isn’t letting on.

Its vice president of global affairs Nick Clegg, the former UK Liberal Democrats leader, said at the time of the initial WSJ revelations that they contained “deliberate mischaracterisations”, adding that a central allegation — that Facebook “conducts research and then systematically and wilfully ignores it if the findings are inconvenient” — is “just plain false”.

All of this madness comes before you even consider the six-hour outage suffered last Monday by Facebook’s platforms, including the world’s most-used messaging service WhatsApp, something the company rather impenetrably blamed on “configuration changes on the backbone routers that coordinate network traffic between our data centres”.

Facebook and the far-right in Ireland

Leaving that aside, the issues surrounding Facebook’s algorithms and its long-term struggle to effectively deal with misinformation and hate speech have grown exponentially in recent years.

Ireland has a particular role to play in what happens in Facebook, given its European headquarters is based in Dublin. 

The company’s prickly relationship with Ireland’s privacy regulator, the Data Protection Commission, is no secret. The commission recently fined WhatsApp a record €225m for GDPR breaches, and is central to the Schrems II legal case, which threatens Facebook’s entire business model of intercontinental data transfers.

And Facebook’s reputational problems in terms of the iffy content it amplifies are just as much an issue here as anywhere else.

Take 34 Irish Facebook groups that have spread Covid-19 misinformation and conspiracy theories throughout the pandemic: their combined membership has grown by 140% to 82,000 over the past 19 months.

Posts in those groups have received over 2.2m interactions in that time, while videos posted in the same groups have been viewed more than 10m times.

Dublin city centre was brought to a halt for over an hour last month by a few hundred anti-mask protesters. Picture: Eamonn Farrell/RollingNews.ie

Some of the most interacted-with posts in these groups since the beginning of 2021 include a photo of a placard reading ‘when you try to kill everyone with a pandemic but accidentally cause a mass awakening’, and a picture of a man waving a sign inscribed with ‘Death Jab’ — in reference to vaccine scepticism — outside the GPO in Dublin.

“These groups are the organisation hubs for anti-lockdown and anti-mask rallies that have taken place across the country since the start of the pandemic and have certainly been part of the reason that regular people have been radicalised into believing false and ludicrous theories about the virus,” says Aoife Gallagher, an analyst with the UK-based Institute for Strategic Dialogue and an expert on misinformation.

There is likewise a marriage of content from Covid misinformation sources and the far right on Irish Facebook, the same alliance to be seen on social media across the globe.

The Far Right Observatory (FRO), an organisation with a stated goal of “better understanding and undermining far-right and hate organising in Ireland”, says there are currently 55 such Facebook pages and groups promoting far-right ideas, with a total membership of more than 350,000.

Most of these are public groupings. The largest following is that of Dolores Cahill, the controversial former UCD professor, whose page boasts 64,000 likes.

Reporting offensive content

Part of the problem is that Facebook, along with its content-sifting algorithms, rather conveniently also relies to a large extent upon reporting by its own users of offensive content. 

This is because the company sees itself as a host, not a publisher, which is the tricky line all social media companies have found themselves walking in order to justify — and immunise themselves legally from — the reprehensible content that can appear on their platforms.

This reporting system is not fit for purpose.

One Dublin-based individual, one of the chief organisers of a vaccination protest outside Tánaiste Leo Varadkar’s home two weeks ago, until recently carried a livestream video on his Facebook profile featuring him calling for people of Jewish faith to be “wiped out”.

Facebook claims that its technology “proactively detects the vast majority of violating content before anyone reports it”, and that between April and June of this year over 97% of the hate speech it removed was taken down in this manner, though “there will always be examples of things our technologies miss”.

However, the video mentioned above was only removed after Facebook was approached by a community group regarding its content.

Facebook said: “The content was removed for violating our policies.

“The page in question on which this video was posted has also been removed for repeatedly violating our policies.”

Except the video wasn’t posted on a page, it was posted on a profile, and at least two profiles for the same person remain on the site, as do all the videos they have posted, including the aforementioned protest at Leo Varadkar’s home. They are easily found.

“This doesn’t inspire confidence around Facebook’s competency,” says a spokesperson for the FRO.

“In our experience, and the experience of all the community and civil society groups, Facebook’s reporting system simply doesn’t work,” they said.

“Facebook can say they removed 97% of hate speech automatically, but this figure tells us nothing about the true extent of hate speech on the platform. 97% of what?

“Frances Haugen told the US Senate this week that Facebook’s own research shows it catches, at best, 10% of hate-filled content on the site. That means 90% remains.”

Promotion of hate speech

The FRO recently asked Facebook to remove the page of one of Ireland’s best-known far-right activists, one of this country’s earliest importers of the American QAnon conspiracy theory. He had used the platform for five years, in that time amassing over three million views of videos espousing “homophobic, antisemitic, and white supremacist content”. Facebook acquiesced to the request.

“This specific page had been reported numerous times,” the FRO spokesperson said. “Yet it remained for five years allowing the individual to grow a profile and sell merchandise.” 

If Facebook’s technologies are missing five years of hate-filled activity from one of the most prominent far-right actors in Ireland over that time, it is reasonable to ask if those technologies are in any way up to scratch. 

“In our experience, Facebook is one of the primary organising spaces for pushing far-right narrative and hate speech,” they said, adding that the social network’s algorithms are driving engagement for hate-filled ideas and ideologies in a way that “simply would not have been possible 10 or 15 years ago”.

The pattern, both in Ireland and globally, would appear to be that the best way to get Facebook to act on extreme content is to approach its staff at a senior level, and even then a great deal of pressure must be applied before anything is removed.

Aisling O'Loughlin had her profile removed after public outcry over a series of anti-vaccine posts on Instagram.

A good example in an Irish context is that of Aisling O’Loughlin, the former presenter of the Irish TV show Xposé, who for a number of months early in 2021 gained notoriety for a series of anti-vaccine posts on Instagram. Her profile was eventually removed after a public outcry and a vast amount of media attention.

“Facebook is very sure not to give out what the hard and fast rules are,” the FRO says. 

“Action is taken whenever a bit of pressure is put on, which means civil society organisations end up acting as de facto moderators.” 

Is the situation hopeless then?

“The tide seems to be turning a little, but you just never know,” says Aoife Gallagher.

“The whistleblower in America, there are a lot of people backing up what she’s saying. The time to act is now though, given Facebook is under so much pressure.” 

But deplatforming people “shouldn’t be the long-term solution”, she says. 

“At the end of the day the issues lie in Facebook’s algorithms, and how they push this divisive content so that these people become massive influencers.” 

Getting one of the largest companies in the world to come clean on its inner workings may prove the biggest challenge of all.
