
Tech & Digitalisation

Online Harms: Bring in the Auditors


Paper | 30th July 2020


Chapter 1

Summary

Across the world, governments are considering new regulatory frameworks to address online harms. It is important that these frameworks are proportionate and flexible, taking account of the complex, innovative and diverse range of online content, communities and platforms.

Online harms regulation needs to create the right incentives to design safe platforms and manage healthy communities. There is currently significant information asymmetry between platforms and the governments and regulators who seek to regulate them. In order to understand platforms and therefore regulate in a meaningful way, regulators need to be able to effectively investigate, assess and measure platforms’ mechanisms and procedures.

In other regulated sectors there are often tiers of independent experts who are not regulators but whose job is to affirm that the companies or bodies they audit meet a certain standard. Social media companies and online platforms have published detailed transparency reports for some years now, but these reports lack effective scrutiny and verification. The practice of auditing, specifically qualitative audit, could provide a model that delivers periodic, robust assessment of online harms and platform health.

This report builds on our recommendations, published in February, that the UK government focus more on investigative powers in its online harms legislation, and on our 2018 report on the next-generation regulator needed for big tech.

From our analysis, we recommend that new models of online harms regulation include requirements for independent audit of social media companies’ procedures and processes and their transparency reports. We also recommend that statutory regulation should focus on setting out those audit standards and be responsible for the registration of the auditors. (We have also published a companion note that provides analysis on applying the principles of audit to online harms regulation.)

 

Social media and online platform transparency reports are a good first step, but they do not fully address the regulatory information asymmetry problem

Investigative, information-gathering powers are a critical part of a future regulatory framework. Effective platform accountability will require access to private business information to scrutinise the implementation of standards and the design of services, without compromising the commercial or security interests of the platform. A future regulator should have the right information, tools and technical capabilities to understand the nuances of different platforms and keep up with the speed of continuous innovation. 

For the biggest social media companies, transparency already exists through the regular release of transparency reports. These are valuable because they show the efforts taken to address harm and force platforms to demonstrate their commitments publicly. However, they are ultimately risk-management exercises and largely driven by the platforms, which determine the information they reveal. Only a few voluntary initiatives attempt to address this, such as the Implementation Reports for the European Commission's Code of Practice on Disinformation, which ask platforms to self-assess their compliance with the code using both qualitative and quantitative evidence. There are also initiatives, such as The Internet Commission, that seek to provide an independent tier of expertise, but through work on behalf of social media and technology companies.

Facebook also recently published a detailed qualitative Civil Rights Audit carried out by the civil rights and civil liberties leader Laura W. Murphy, along with a civil rights law firm. The audit sets out that the auditors had a high level of access and were assigned a three-person full-time program management team and “a partially dedicated team of 15+ employees across product, policy, and other functions”. The final report sets out detailed recommendations and concludes that Facebook “has not yet devoted enough resources or moved with sufficient speed” to tackle the civil rights challenges it faces.

While transparency reports and self-commissioned audits are generally positive and can be insightful, governments should be wary of using public transparency as a substitute for genuine investigation. Regulators need to be able to independently verify the outputs of these reports and the efficacy of the underlying processes by accessing confidential information, such as how content-classification algorithms are trained or how data is used. Importantly, regulators need powers to take effective action if audits are failed.
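To illustrate what such verification might look like in practice, the sketch below shows one way an auditor could independently recompute a platform's published removal-rate figure from a sample of raw moderation records. Everything here is hypothetical: the record fields, the metric definition and the tolerance threshold are illustrative assumptions, not any platform's actual schema or any proposed audit standard.

```python
# Hypothetical sketch only: all field names, figures and thresholds are
# illustrative assumptions, not any platform's real schema or metrics.
from dataclasses import dataclass

@dataclass
class ModerationRecord:
    content_id: str
    flagged: bool   # item was user-reported or machine-flagged
    removed: bool   # item was ultimately taken down

def removal_rate(records):
    """Recompute the share of flagged items that were removed."""
    flagged = [r for r in records if r.flagged]
    return sum(r.removed for r in flagged) / len(flagged) if flagged else 0.0

def verify_reported_rate(records, reported_rate, tolerance=0.01):
    """Compare the platform's published figure against an independent
    recomputation over a sampled extract of confidential logs."""
    recomputed = removal_rate(records)
    return abs(recomputed - reported_rate) <= tolerance, recomputed

# Example: the platform reports a 94% removal rate; the auditor checks it
# against a (tiny, illustrative) sample of records.
sample = [
    ModerationRecord("a1", flagged=True, removed=True),
    ModerationRecord("a2", flagged=True, removed=False),
    ModerationRecord("a3", flagged=False, removed=False),
]
within_tolerance, recomputed = verify_reported_rate(sample, reported_rate=0.94)
print(f"recomputed={recomputed:.2f}, within tolerance: {within_tolerance}")
```

The point of a sketch like this is not the arithmetic but the access: the recomputation is only meaningful if the auditor can sample the underlying confidential records, rather than relying on figures the platform chooses to publish.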

 

Effective regulatory scrutiny requires highly skilled regulators, but it will be a challenge to keep up with some of the most well-resourced companies in the world

It is optimistic to expect that any new online harms regulator could recruit enough staff with the technical skills required to properly scrutinise social media and other technology companies. Part of the challenge of social media regulation will be asking the right questions and understanding complex businesses, while keeping up to date with highly innovative, ever-changing multinational organisations.

It is in the interest of social media companies to provide regulators with the right information, allowing them to be scrutinised fairly and subject to proper “better regulation” principles. A poorly resourced regulator with significant enforcement powers is likely either to take decisions based on low-quality evidence that is inconsistent with, or harmful to, the social media companies it regulates, or to be so risk-averse that it cannot make effective regulatory decisions at all. Nor is it in the public interest for regulatory decisions and enforcement around social media to become mired in protracted legal proceedings.

Any new regulator still requires the requisite skill to make balanced regulatory judgements, but the gathering of consistent and effective information will be a considerable burden on both the regulator and the regulated companies if not done properly. 
 

A new independent tier of regulatory audit can help solve the information asymmetry problem

Corporate transparency is already widespread in other industries. The accounting audit industry, which provides assurance of financial stability in public-interest entities (PIEs), is built primarily on this concept. Accounting audits are commonplace, and their outcome is based entirely on sensitive information that businesses would not want revealed publicly, especially to their competitors. This normalisation of transparency in financial audit could be a valuable analogy for addressing online harms.

As well as those already setting up audit functions, such as the Internet Commission, there may be other candidates who can quickly build up the necessary domain and technical expertise. The “big four” auditing firms may have the expertise and resources to fulfil a new auditing function, particularly given the consulting functions they also have around the world. More work would be needed to understand whether they could be appropriate online harms auditors, whether there would be a conflict of interest with their financial reporting, and whether more would need to be done to ensure there was a true diversity of auditors. 
 

The regulator would be responsible for registering the auditors, setting the standards for the audit and retaining enforcement powers to address a failed audit

In any model of online harms regulation, the audit function could be fulfilled by the online harms regulator itself, but this would mean regular, continuous scrutiny of social media companies, which would require significant resource and may be too interventionist for a balanced model of regulation. 

In the model we have set out, the regulator would retain investigatory and enforcement powers. If an audit were sub-standard, the regulator could issue sanctions, conduct further regulatory investigation and ultimately require the platform to take specific action to address the findings. In financial audit, a failed audit has significant ramifications for the company and its shareholders, for example in its ability to access finance. There is no direct equivalent here, but the implications of a poor audit must amount to more than reputational damage.

The regulator would also retain oversight of the auditors to ensure they meet a certain standard. Specifically, it would:

  • Set and apply high corporate governance, reporting and audit standards 

  • Regulate and take responsibility for the registration of the audit profession

  • Maintain wide and deep relationships with civil society, e.g. child protection charities, civil rights organisations and other users of transparency information

  • Monitor and report on developments in the audit market, including trends in audit pricing, the extent of any cross-subsidy from non-audit work and the implications for the quality of audit 

  • Investigate online harms cases itself where there are public-interest concerns about any matter that falls within the regulator’s statutory competence

  • Conduct regular research on user needs in order to iterate and refine the scope and standards of the audit, ensuring they remain consistent with users’ actual experience of social media and their experience of harm.

It is worth noting that the financial audit industry itself has been criticised for some structural problems and issues relating to competition. Meaningful independence from government is a necessity, but having a business incentive behind securing audit projects means it can be difficult for auditors to disentangle the various conflicts of interest. Auditing firms have to reconcile the priorities of their clients’ management, their own interests and their duties as auditors. 

Irrespective of its flaws, auditing plays an important role in maintaining confidence in financial reporting. The principles and practice of audit can provide useful insights into how independent centres of expertise can exist to regulate complex organisations.  

 


Chapter 2

Lessons and Recommendations

Our analysis suggests that social media transparency reports are important but, in their current form, are not effective enough as regulatory tools to enforce better standards of protection.
 

The lessons legislators should learn from the audit industry are:
  1. Transparency reporting needs to be enhanced with independent scrutiny. 

  2. External scrutiny can make internal governance structures more accountable.

  3. Regulators need systems to understand compliance with processes as well as outcomes.

  4. The regulator need not be the sole centre of expertise for holding online platforms to account.

  5. There should be global coordination on qualitative auditing standards.

More detail on the analysis behind these lessons is included in a separate note, looking in-depth at how existing models of audit function. 

This report does not consider the specific enforcement powers of the regulator. It is vital that the regulator has the powers to address a failed audit and impose sanctions that create meaningful incentives for change within the companies under scrutiny. 

 


Chapter 3

Putting the Proposal Into Practice: Recommendations for the UK Online Harms Proposals

The UK government is looking to implement a new regulatory model requiring relevant social media and online platforms to have a duty of care with regard to the safety of their users.

This duty of care should be implemented in a way that is proportionate, flexible and measurable. Investigatory powers including information gathering, assessment and measurement powers need to be part of the regulatory framework, but it may not be appropriate to replicate the model that Ofcom currently uses for broadcast, telecoms, spectrum and postal services regulation. 

Our recommendations for the UK government are:

  1. Statutory implementation: Require companies regulated under the new online harms framework to submit annual independent audits of their transparency reports, including the procedures and processes relevant to the safety of and potential harm to users (an illustrative sketch of such an audit scope follows these recommendations).

  2. Regulating the auditors: Give Ofcom powers equivalent to those recommended for the new Audit, Reporting and Governance Authority in the Kingman Review of the Financial Reporting Council.
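As a purely illustrative aid to recommendation 1, the short sketch below expresses a possible audit scope as structured data, pairing each area of a transparency report with the claims to be audited and the evidence an auditor might request. The categories, claims and evidence types are our hypothetical examples, not a proposed statutory schema.

```python
# Hypothetical audit scope for an annual transparency-report audit.
# Categories, claims and evidence types are illustrative only.
AUDIT_SCOPE = {
    "content_moderation": {
        "claims_audited": ["removal rates", "time-to-action figures"],
        "evidence": ["sampled moderation logs", "reviewer guidelines"],
    },
    "algorithmic_systems": {
        "claims_audited": ["classifier accuracy statements"],
        "evidence": ["training and evaluation documentation"],
    },
    "user_redress": {
        "claims_audited": ["appeal volumes and outcomes"],
        "evidence": ["sampled appeal case files"],
    },
}

def evidence_requests(scope):
    """Flatten the scope into a single checklist of evidence to request."""
    return [item for area in scope.values() for item in area["evidence"]]

print(evidence_requests(AUDIT_SCOPE))
```

A machine-readable scope of this kind could let the regulator version and iterate the audit standard over time, consistent with the regular user-needs research described earlier.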
     

Statutory implementation of new information-gathering powers

When giving Ofcom new powers to regulate online harms, the government will have to consider whether to amend existing legislation or to create a new regulatory framework. If it opts for the former, it will, among other things, extend Ofcom’s broad information-gathering powers to social media and online platforms.

Ofcom has broad powers, arising from sections 135 to 146 and other provisions of the Communications Act 2003, to request information it considers necessary to carry out its functions. These powers are far-reaching, but they allow Ofcom to make well-evidenced decisions that can give confidence to all of its stakeholders, including the regulated businesses.

Given the high levels of innovation, rapid deployment of new products, large volumes of content and risks to freedom of expression in online platform businesses, the government and the new regulator may want to consider other options for carrying out the new duties.

Figure 1: Options for investigatory powers in online harms regulation

The online harms proposals pose two questions for any investigatory model:

  • Duty of care: how do you assess whether this is being met in a proportionate and flexible way?

  • Procedural accountability: how do you practically look at and assess procedures?

Option 1: Inspectorate (Ofsted-style regulator)

  • Pros: high level of regulatory scrutiny and public reporting
  • Cons: very resource-intensive on all sides; highly interventionist, compromising independence and freedom of expression

Option 2: Telecoms-style information-gathering powers

  • Pros: broad powers linked to specific investigations; well-established powers that create clear incentives and accountability; clear enforcement powers
  • Cons: speculative requests can be resource-intensive for companies; not ongoing scrutiny; social media regulation is a different task to broadcast or telecoms regulation

Option 3: Audit (backed by regulation and regulatory standards)

  • Pros: continuous assurance for the regulator and the public; enhances the current model of transparency reporting; effective in other industries
  • Cons: potential conflicts of interest between auditor and company can develop

Option 4: Transparency reports and self-assessment

  • Pros: low regulatory burden; protects independence of companies; codes of practice can set guidelines
  • Cons: no regulatory oversight or independent assurance; limited onward enforcement if terms are not met

 

The information asymmetry between the regulated company and Ofcom may not be best addressed in this new form of regulation by relying on traditional information-gathering powers. Instead, a system of audit may be a more reliable and appropriate way of assessing the duty of care.

 

Oversight of the Auditors

The financial audit industry has had its own issues with governance. In 2018 John Kingman published his independent review of the Financial Reporting Council recommending a new independent statutory regulator accountable to Parliament. These recommendations were made in the context of criticisms of the Financial Reporting Council and its closeness to the industry it is scrutinising. 

To mirror the recommendations of that review, we propose that Ofcom be given powers to regulate a new independent audit function for social media transparency reporting. Its functions should include:

  • Setting and applying high corporate governance, reporting and audit standards 

  • Regulating and taking responsibility for the registration of the audit profession

  • Maintaining wide and deep relationships with civil society, e.g. child protection charities such as the Internet Watch Foundation and other users of transparency information

  • Monitoring and reporting on developments in the audit market, including trends in audit pricing, the extent of any cross-subsidy from non-audit work and the implications for the quality of audit 

  • Investigating online harms cases itself where there are public-interest concerns about any matter that falls within the regulator’s statutory competence


Chapter 4

Conclusion

Any new online harms regulator should aim to deliver genuine assurances of safety for the general public, but it will struggle to take on the task of investigating and measuring on its own. The duty of care is the right approach to qualitatively assessing the processes platforms use to tackle online harms, but a considerable information asymmetry will remain. The accounting audit model, along with other models of qualitative audit, provides useful lessons for online harms regulation and investigation-based regulation.
