
Tech & Digitalisation

Social Media Futures: Making the Facebook Papers a Turning Point for Accountability


Commentary, 27th October 2021

This article is part of the tech and public policy team’s series on understanding the futures of social media, including trends in regulation, innovation, safety and harm. Here we explore the Facebook Papers and the regulatory response required.

Introduction

This week, several media organisations published the so-called ‘Facebook Papers’, detailing internal Facebook research about various negative impacts its products have had on society. The individual stories make for important reading, but the series also illustrates the need for a more sustainable, proactive and robust system of accountability that isn’t simply reliant on leaks, whistle-blowers and internal transparency reporting.

With the spotlight firmly on the internal workings of social media companies, now is an opportunity for governments, and for those serious about change within platforms, to double down on robust regulatory structures. Greater platform accountability, empowered regulators with systematic audit powers, and a new geopolitical settlement with the global tech industry will help move past the typical whistle-blower lifecycle that so often ends in inaction.


Chapter 1

Recap: What's in the Facebook Papers?

The ‘Facebook Papers’ are based on thousands of internal documents compiled by whistle-blower Frances Haugen, a data scientist and product manager formerly with Facebook’s civic integrity department. These files revealed various internal investigations into Facebook’s wider social impact, the degree to which Facebook employees were aware of these effects, and their efforts (or lack thereof) to address these harms.

These leaks add to a growing tapestry of allegations that started in 2016. Trust in social media companies is at an all-time low, with many seeing tech companies as pursuing financial gain, user retention and engagement at the expense of user safety and society at large. This crisis of trust also spills over into wider attitudes towards technology. It is therefore in the interest of the whole tech and innovation ecosystem to agree a rigorous regulatory compact.

The first step is to give governments, users and others confidence in the actions taken by social media companies to address harms. Unfortunately, despite these stories, the true nature, patterns and causal links of how people, content and social media platforms affect society are still largely unknown. To rectify this information asymmetry, and to start progress towards meaningful solutions, one idea is a new model of transparency and accountability based on systematic audit.


Chapter 2

The Current Cycle of Scrutiny Is Not Moving Us Forward

At present, we have three inadequate levers to hold Big Tech companies to account. These responses are reactive and ad hoc, and thus far have failed to deliver an effective, robust and sustainable system of accountability. We must stop cycling back to inaction, merely waiting until the next controversy comes along. Instead, governments and tech companies themselves must use the public and political will generated during these moments to ensure progress.

Inadequate Levers

  1. Whistle-blower leaks of internal documents. There is a long history of leaks that shed light on public interest information, with whistle-blowers including Frances Haugen, Sophie Zhang and Christopher Wylie generating both important scrutiny and a frenzy of media activity. Whilst these actions should often be applauded, whistle-blowers and the media also have their own incentives: it is noticeable that the actual 'Facebook Papers' documents have not been fully released for wider academic or civil-society scrutiny, and that the media consortium behind these stories did not include any publications from the Global South. Put together, this illustrates how leaks can only ever be a momentary catalyst for wider change, rather than a holistic mechanism for accountability. True accountability should rest not on individual actions but on consistent and reliable mechanisms.

  2. Release of internal transparency reports. All major social media firms produce regular transparency reports, but the omission of the findings now highlighted by the Facebook Papers indicates that these reports are ultimately only a selective edit of what firms understand about their impact. Unsurprisingly, internal teams will always face conflicting incentives, which makes transparency reports an incomplete and insufficient method of accountability.

  3. Post-hoc investigations by regulators and/or legislators. Investigations into Big Tech by domestic policymakers and regulators vary in scope and scale, but they have been important in identifying and highlighting platform problems. While sometimes derided for poor questioning, investigations such as the International Grand Committee on Disinformation and “Fake News”, which brought together a coalition of 21 nations to tackle various platform issues, have been more productive. However, these regulatory investigations are necessarily initiated only after issues have come to light. As it becomes clear that significant regulatory intervention is needed, such inquiries must be complemented with accountability measures that proactively identify risks and assess them at a holistic, platform-wide level.

Whilst these levers may shed some light on internal decision-making, they fall short of what’s required to truly unpack what is going on. Corporations facing widespread, critical media coverage quickly mobilise to manage reputational risks, diverting resources away from the underlying safety or social issues being highlighted. Internal transparency will only ever be a selective edit of the truth. Reactive, ad-hoc regulatory investigations can easily fall into the same ‘trial by media’ traps. And we lack the necessary legal structures and processes to ensure follow-through, creating a pattern of inaction until the next scandal breaks.


Chapter 3

Systematic Audit & A New Geopolitical Settlement With Tech: How To Break The Cycle Of Inaction And Improve Accountability

  1. Independent systematic audits

Governments should require large tech firms with significant geopolitical influence to submit, at their own expense, to annual independent audits of their operations. TBI analysis of other sectors, such as financial services, indicates how this could provide much-needed transparency and accountability. Independent, systematic audits, designed to be proportionate and flexible, are key to engendering greater transparency and accountability for digital platforms and online-harms efforts.

This analogous system would have many benefits, such as independent scrutiny, avoiding overreliance on regulators as the sole centre of expertise, ensuring more accountable internal governance structures and providing consistent touchpoints for proactive regulation and risk assessment. Regulators should also retain investigatory and enforcement powers, as well as oversight and standard-setting powers to monitor and maintain audit quality.

While the audits must consider the processes platforms have in place to ensure their algorithms do not create or facilitate risks of harm (particularly in content moderation, recommender systems and advertising systems), the Facebook Papers also highlight particular failings of corporate governance, management and training. By tackling information asymmetries and amplifying best practice, systematic audits that incorporate both quantitative and qualitative measures could provide a crucial challenge function to improve these aspects. An example of how this might work in practice can be found in Article 28, on independent audit, of the EU’s proposed Digital Services Act.

  2. A new ‘Strategic Geopolitical Status’ designation, with corresponding rights and responsibilities that enable international mediation and accountability

One of the most important revelations from the Facebook Papers is the company’s fragmented and globally-uneven approach to content moderation, with particular gaps in the Global South. Under the status quo, only the 30 countries in tiers 0, 1, and 2 benefit from elevated support, while the rest must deal with both reduced internal focus at Facebook and, given many of these nations are emerging economies, limited domestic capacity to resource wider audit or accountability systems.

This divide is striking. For example, in the US and other high-priority countries, 24/7 staffing is resourced and bespoke AI classifiers are built to detect hate speech and misinformation; others, such as Ethiopia, lacked similar targeted interventions, even during a civil conflict.

One approach to incentivising new structures capable of dealing with these risks is a new geopolitical settlement with the global technology industry. In a recent report we outlined several recommendations to ensure that firms aren’t held accountable only in ‘tier 1 nations’ or by richer countries that can afford to staff ‘tech diplomacy’ initiatives. This would involve a new ‘Strategic Geopolitical Status’ designation for major technology firms, with three associated accountability mechanisms:

  1. Requirement to establish and/or join a geo-technology board, a new type of independent, industry-wide, self-regulatory body for global technology companies with significant geopolitical importance.

  2. These new bodies (and there could be multiple) should have non-member observer status at the UN to provide an authoritative touchpoint between global policymakers and technology companies.

  3. Requirement for firms to set out a new international policy, recognising their role as global proponents of a secure, open and liberal internet model. This policy would include operational KPIs as well as community standards.

These efforts, taken in concert, seek to acknowledge that the geopolitical power of many tech companies is now a fact. New institutional structures are required to meet the challenges of this new reality.


Chapter 4

Conclusion

Over the past few days, more than 70 stories have been written as reporters parse Haugen’s documents. The details these stories contain give governments a prime opportunity to institute reforms that genuinely move the dial on online safety. This must go beyond reactive regulatory investigations and focus on systematic audit and a new geopolitical settlement that can reset international accountability for good.
