Digital Technology and the Resurrection of Trust


Democracy faces a daunting new challenge. The era in which electoral activity was conducted through traditional print media, canvassing and door knocking is rapidly vanishing. Campaigning is now dominated by digital and social media, which have become the source from which voters get most of their information and political messaging.

The digital and social media landscape is dominated by two behemoths: Facebook and Google. They largely pass under the radar, operating outside the rules that govern electoral politics. This has become acutely obvious during the COVID-19 pandemic, in which online misinformation poses a real and present danger not only to our democracy but also to our lives. Governments have been dilatory in adjusting regulatory regimes to capture these new realities. The result is a crisis of trust.

Yet our profound belief is that this can change. Technology is not a force of nature. Online platforms are not inherently ungovernable. They can and should be bound by the same restraints that we apply to the rest of society. If this is done well, in the ways we spell out in this Report, technology can become a servant of democracy rather than its enemy. There is a need for Government leadership and regulatory capacity to match the scale and pace of challenges and opportunities that the online world presents.

The Government’s Online Harms programme presents a significant first step towards this goal. It needs to happen; it needs to happen fast; and the necessary draft legislation must be laid before Parliament for scrutiny without delay. The Government must not flinch in the face of the inevitable and powerful lobbying of Big Tech and others that benefit from the current situation.

Well drafted Online Harms legislation can do much to protect our democracy. Issues such as misinformation and disinformation must be included in the Bill. The Government must make sure that online platforms bear ultimate responsibility for the content that their algorithms promote. Where harmful content spreads virally on their services, or where it is posted by users with a large audience, platforms should face sanctions over their output, just as broadcasters do.

Individual users need greater protection. They must have redress against large platforms through an ombudsman tasked with safeguarding the rights of citizens.

Transparency of online platforms is essential if democracy is to flourish. Platforms like Facebook and Google seek to hide behind 'black box' algorithms which choose what content users are shown. They take the position that they are not responsible for harms that may result from online activity. This is plain wrong. The decisions platforms make in designing and training these algorithmic systems shape the conversations that happen online. For this reason, we recommend that platforms be mandated to conduct audits showing how, in creating these algorithms, they have ensured, for example, that they are not discriminating against certain groups. Regulators must have the powers to oversee these decisions, with the right to acquire from platforms the information they need to exercise those powers.

Platforms' decisions about what content they remove, or stop promoting through their algorithms, set the de facto limits of free expression online. As it stands, the rules behind these decisions are poorly defined, and their practical operation should reflect what the public needs. In order to protect free and open debate online, platforms should be obliged to publish their content decisions, making clear what the actual rules of online debate are.

Alongside establishing rules in the online world, we must also empower citizens, young and old, to take part as critical users of information. We need to create a programme of lifelong education that will equip people with the skills they need to be active citizens. People need to be taught from a very young age about the ways in which platforms shape their online experience.

The public need access to high quality public interest journalism to inform them about current events. This requires fair funding to support such journalism.

Platforms must also be forced to ensure that their services empower users to exercise their rights online. The public need to understand how their data is being used. We propose that this obligation of fairness by design should be a core element in ensuring platforms meet their duty of care to their users.

Parliament and government at all levels need to invest in technology to engage better with the public.

Electoral law must be completely updated for an online age. There have been no major changes to electoral law since the invention of social media and the rise of online political advertising. As the Law Commission recently pointed out, a wholesale revision of the relevant law is now needed. This should include rules setting standards for online imprints on political advertisements, so that people can see who they come from, and advert libraries that enable researchers and the public to see what campaigns are saying. The Electoral Commission needs the powers to obtain the information necessary to identify when individuals are breaking the rules, and to set fines that act as a real deterrent against flagrant breaches. We also need greater clarity around the use of personal data in political campaigns; the Information Commissioner's guidance should be put on a statutory footing.

We take the Nolan Principles of Public Life as our guide in this Report, and as the standard to which individuals in public life should be held. In turn, platforms and political parties should aspire to the same high standards.

We believe this Report sets out a path whereby digital technology is no longer in danger of undermining democracy, but where instead the wonders of technology can support democracy and restore public trust.

© Parliamentary copyright 2018