302.It is hard to differentiate on social media between content that is true, content that is misleading and content that is false, especially when messages are targeted at an individual level. Children and adults need to be equipped with the necessary information and critical-analysis skills to understand content on social media, to work out what is accurate and trustworthy, and what is not. Furthermore, people need to be aware of the rights that they have over their own personal data, and of what they should do when they want their data removed.
303.The majority of our witnesses stressed the need for greater digital literacy among users of social media. Ofcom has a statutory duty to promote media literacy, which it defines as “the ability to use, understand and create media and communications in a variety of contexts”. Sharon White told us that Ofcom’s focus on digital literacy is grounded in research, “about how children use and understand the internet and similarly with adults”.340 We cannot stress strongly enough the importance of greater public understanding of digital information: its use, scale, importance and influence.
304.Greater public understanding of what people read on social media has been helped by organisations working towards greater transparency on content. For example, journalists at the company NewsGuard apply nine criteria relating to credibility and transparency to news and information websites, and publish ‘Nutrition Labels’ explaining each website’s history, ownership, financing and transparency. In January 2019, Microsoft integrated NewsGuard’s ratings into its Edge mobile browser.341
305.We received evidence from the Disinformation Index, an organisation that assigns a rating to each outlet based on the probability of that outlet carrying disinformation: “In much the same way as credit rating agencies rate countries and financial products with AAA for low risk all the way to Junk status for the most risky investments, so the index will do for media outlets”.342
306.Facebook gives the impression of wanting to tackle disinformation on its site. In January 2019, Facebook employed Full Fact to review and rate the accuracy of news stories on Facebook—including the production of evaluation reports every three months—as part of its third-party factchecking programme, the first time that such an initiative has been operated in the UK.343 However, as we described in Chapter 5, Facebook has also recently blocked the work of organisations such as Who Targets Me? from helping the public to understand how and why they are being targeted with online adverts. On the one hand, Facebook gives the impression of working towards transparency, with regard to the auditing of its news content; but on the other, there is considerable obfuscation concerning the auditing of its adverts, which provide Facebook with its ever-increasing revenue. To make informed judgments about the adverts presented to them on Facebook, users need to see the source and purpose behind the content.
307.Elizabeth Denham described the ICO’s “Your Data Matters” campaign, which has been running since April 2018: “It is an active campaign and I think that it has driven more people to file more complaints against companies as well as to us”.344 She also stressed the need for the public to understand their rights, and for work to “make citizens more digitally literate so that they know how to navigate the internet and be able to exercise their rights”. The Information Commissioner said that the ICO had a role to play in that, but did not necessarily have the resources.345
308.In our Interim Report, we recommended that the Government put forward proposals in its White Paper for an educational levy to be raised on social media companies, to finance a comprehensive online educational framework, ensuring that digital literacy is treated as the fourth pillar of education, alongside reading, writing and maths.346 In its response, the Government stated that it was continuing to build an evidence base to inform its approach to any social media levy, and that it would not want to affect existing work done by charities and other organisations on tackling online harms. It did not agree that digital literacy should be the fourth pillar of education, since it “is already taught across the national school curriculum.”347
309.The term ‘friction’ describes anything that slows down a process or function. In 2011 Mark Zuckerberg announced that apps would no longer generate pop-up messages asking users whether they wanted to publish their latest activity on their Facebook feed; instead, apps would post directly onto users’ feeds, without the need for permission each time. Mr Zuckerberg said, “from here on out, it’s a frictionless experience”.348
310.Some believe that friction should be reintroduced into the online experience, both by tech companies and by individual users themselves, to encourage people to pause and think before generating or consuming content. There is a tendency to think of digital literacy as the responsibility of those teaching and those learning it; however, algorithms can also play their part. ‘Friction’ can be built into the system, to give people time to think about what they are writing and sharing, and to give them the ability to limit the time they spend online; obstacles should be put in place to make the process of posting or sharing slower and more thoughtful. For example, this additional friction could include: the ability to share a post or a comment only if the sharer first writes something about it; the option to share a post only once it has been read in its entirety; and a way of reviewing what is about to be sent, before it is sent.349
311.The Center for Humane Technology suggests simple methods that individuals can adopt to build friction into their mobile devices, including: turning off all notifications except those from people; changing the screen to ‘grayscale’, thereby reducing the intensity and lure of bright colours; keeping the home screen to tools only; launching apps by typing; charging devices outside the bedroom; removing social media from mobile devices; and telephoning instead of texting.350
312.As we wrote in our Interim Report, digital literacy should be a fourth pillar of education, alongside reading, writing and maths. In its response, the Government did not comment on our recommendation of a levy on social media companies, to be used, in part, to finance a comprehensive online educational framework developed by charities, NGOs and the regulators themselves. Such a framework would inform people of the implications of sharing their data willingly, of their rights over their data, and of ways in which they can constructively engage and interact with social media sites. People need to be resilient in their relationship with such sites, particularly around what they read and what they write. We reiterate this recommendation to the Government, and look forward to its response.
313.The public need to know more about their ability to report digital campaigning that they think is misleading and/or unlawful. Ofcom, the ASA, the ICO and the Electoral Commission need to raise their profiles so that people know about their services and roles. The Government should take a leading role in co-ordinating this crucial service for the public. The Government must provide clarity for members of the public about their rights with regard to social media companies.
314.Social media users need online tools to help them distinguish between quality journalism and stories coming from organisations that have been linked to disinformation or are regarded as unreliable sources. The social media companies should be required either to develop such tools themselves, or to work with existing providers, such as NewsGuard, to make these services available to their users. The requirement for social media companies to introduce these measures could form part of a new system of content regulation, based on a statutory code and overseen by an independent regulator, as we have discussed earlier in this report.
315.Social media companies need to be more transparent about their own sites and how they work. Rather than hiding behind complex agreements, they should inform users how their sites operate, including curation functions and the way in which algorithms are used to prioritise certain stories, news and videos, depending on each user’s profile. The more people know about how the sites work and how they use individuals’ data, the better informed we shall all be, which in turn will make choices about the use and privacy of such sites easier.
316.Ofcom, the ICO, the Electoral Commission and the Advertising Standards Authority have all written separately about their role in promoting digital literacy. We recommend that the Government ensures that the four main regulators produce a more united strategy in relation to digital literacy. Included in this united approach should be a public discussion on how we, as individuals, are happy for our data to be used and shared. People need to know how their data is being used (building on recommendations we set out in Chapter Two of this Final Report). Users need to know how to set the boundaries they want over their personal data, and how those boundaries should be set. Included in this debate should be arguments around whether users want an agreed basic expectation of privacy, in a similar vein to a basic level of hygiene. Users could have the ability to opt out of such minimum thresholds, if they chose.
317.We recommend that participating in social media should allow more pause for thought. More obstacles or ‘friction’ should be incorporated both into social media platforms and into users’ own activities, to give people time to consider what they are writing and sharing. Techniques for slowing down interaction online should be taught, so that people question both what they write and what they read, and pause to think further before they make a judgement online.
341 NewsGuard criteria for and explanation of ratings, NewsGuard website.
343 Full Fact to start checking Facebook content as third-party factchecking initiative reaches the UK, Full Fact, 11 January 2019.
346 Disinformation and ‘fake news’: Interim Report, DCMS Committee, Fifth Report of Session 2017–19, HC 363, 29 July 2018, p. 63.
347 Disinformation and ‘fake news’: Interim Report: Government Response to the Committee’s Fifth Report of Session 2017–19, DCMS Committee, p. 20.
348 Is Tech Too Easy to Use?, Kevin Roose, The New York Times, 12 December 2018.
349 The Center for Humane Technology website.
350 Ibid.