232.Throughout the inquiry, we heard about the powerful influence of social media, and the fact that it is hard to differentiate between what is true, what is misleading, and what is false, especially when messages are targeted at an individual level. Children, young adults, and adults—all users of digital media—need to be equipped with sufficient digital literacy to understand content on the Internet, and to work out what is accurate or trustworthy, and what is not. Time and again, we heard people saying that “when the service is free, you are the product” and, as the product, individual users are continually being manipulated, without even realising it.
233.This chapter will explore how people, especially children and young adults, engage with social media, and what can be done to ensure that they understand the digital space, and can make informed choices about how they spend their time, how they identify sites that they can trust or are safe, how they appraise the content of what they read, and what information they share with others.
234.The point of social media is to interact with other people, and to share ideas. Dr Caroline Tagg, from the Open University, carried out research showing that people use Facebook to maintain social relationships; to many people, Facebook was not seen as a news media site, but “a place where they carry out quite complex maintenance and management of their social relationships”.
235.Within those social relationships, people tend to connect with, and want to spend time with, others who share the same views and interests, and it is in such groups that misinformation can spread so quickly. Professor Lewandowsky, from the University of Bristol, told us about an Australian study on climate change:
Only 8% of people were found to completely negate the idea that the climate is changing but those 8% thought that their opinion was shared by half the population and that was because they were all in this echo chamber and talked to each other and felt their opinions confirmed. I think that is a novel problem that is inherent to the technology. That people think, whatever they think, everybody else thinks the same way.
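The mechanism Professor Lewandowsky describes can be illustrated with a short, purely hypothetical simulation. This is our own sketch, not the Australian study's method: only the 8% figure comes from the evidence quoted above, and the homophily and contact-sample parameters are invented for illustration.

```python
# Toy simulation of the 'echo chamber' effect: agents who hold a minority
# view (8% of the population, per the quoted study) mostly talk to
# like-minded peers, so their estimate of how widely the view is held
# becomes inflated. All parameters other than the 8% are invented.
import random

random.seed(1)
POPULATION = 10_000
MINORITY_SHARE = 0.08   # 8% hold the minority view (from the quoted study)
HOMOPHILY = 0.45        # share of contacts who are like-minded; chosen so
                        # the toy result lands near the ~50% misperception
                        # reported in the study

people = [random.random() < MINORITY_SHARE for _ in range(POPULATION)]

def sample_contact(holds_view: bool) -> bool:
    """Pick one contact: like-minded with probability HOMOPHILY,
    otherwise drawn at random from the whole population."""
    if random.random() < HOMOPHILY:
        return holds_view
    return random.random() < MINORITY_SHARE

# Each minority-view holder estimates the view's prevalence
# from a sample of 20 of their own contacts.
estimates = []
for holds_view in people:
    if holds_view:
        contacts = [sample_contact(True) for _ in range(20)]
        estimates.append(sum(contacts) / len(contacts))

print(f"True share of the population holding the view: {MINORITY_SHARE:.0%}")
print(f"Average share perceived by those who hold it:  "
      f"{sum(estimates) / len(estimates):.0%}")
```

Even with modest homophily, the minority's perceived support is several times the true figure, which is the dynamic the witness described: “whatever they think, everybody else thinks the same way”.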
236.This dependency and reliance on social media comes with worrying consequences, as Tristan Harris told us:
There are many different issues emerging out of the attention economy. The externalities range from public health, addiction, culture, children’s well-being, mental well-being, loneliness, sovereignty of identity and things like that to election democracy, truth, discernment of truth and a shared reality, anti-trust and power. There are multiple issues. There are even more, because when you control the minds of people, you control society. How people make sense of the world and how they make choices are what identity is, and that can affect every aspect of society.
237.Most users do not understand how the content they read has got there, but accept it without question. A significant part of digital literacy is understanding how social media works, and how the content that each user reads has appeared, as a result of specific algorithms:
If we are talking about news and media literacy curricula, that has to include teaching about how to evaluate an algorithm and how to understand how what you see on Amazon, Netflix or Facebook has been decided by an algorithm, how an algorithm gets developed, how it is created by a certain person and how their biases might shape that. That has to be part of the teaching that we give to people.
238.What appears on individuals’ newsfeeds is there either because an algorithm has selected it, based on their behaviour and profile, or because it has been targeted at their demographic by paid promotion. Indeed, it is common for publishers to pay for their content to be posted so that they can reach a wider audience, because Facebook, for example, does not recognise or seek to categorise good journalism or news over other material.
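As a purely hypothetical sketch of the kind of ranking described above: no platform's real algorithm is public in this form, and the field names, weights and behaviour below are all invented for illustration. An engagement-ranked feed might look like this:

```python
# Illustrative sketch only: a toy engagement-ranking feed, NOT the actual
# algorithm of Facebook or any other platform. All names and weights
# here are invented.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    likes: int
    shares: int
    is_paid_promotion: bool = False

def feed_score(post: Post, follows_author: bool) -> float:
    """Score a post by predicted engagement, as a ranking algorithm
    might: interactions, plus affinity with the user, plus paid boost."""
    score = post.likes + 3 * post.shares   # shares weighted above likes
    if follows_author:
        score *= 1.5                       # affinity with the user
    if post.is_paid_promotion:
        score += 100                       # paid promotion buys reach
    return score

def rank_feed(posts, followed):
    """Order the feed by score, highest first."""
    return sorted(posts,
                  key=lambda p: feed_score(p, p.author in followed),
                  reverse=True)

posts = [
    Post("news_outlet", "Investigative report", likes=50, shares=10),
    Post("friend", "Holiday photos", likes=5, shares=0),
    Post("advertiser", "Sponsored claim", likes=2, shares=0,
         is_paid_promotion=True),
]
for p in rank_feed(posts, followed={"friend"}):
    print(p.author, feed_score(p, p.author in {"friend"}))
```

In this toy model the paid post outranks both the journalism and the friend's post, which is precisely the dynamic paragraph 238 describes: the ranking reflects engagement and payment, not the quality or trustworthiness of the content.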
239.Once content is on social media, it is hard for people to disregard what they have just read. Professor Stephan Lewandowsky told us that hundreds of studies have shown that “if we try to correct people’s beliefs based on what they have heard they may adjust their belief slightly but there is a lot of evidence to suggest that they continue to rely on that information nonetheless. […] The cognitive consequences of fake news are pervasive”.
240.When we share information about ourselves on social media, there is a tacit understanding that that information will become public. When Alexander Nix first gave evidence, in February 2018, he told us that people understand the reciprocity of businesses giving an offer in exchange for people’s data through, for example, loyalty cards, and that “their data is being taken in return to help that brand to drive its marketing. […] People are not naïve”. However, in the context of the data extraction by Aleksandr Kogan, Sandy Parakilas, a former Facebook manager, rightly said that Facebook users “may have understood in theory that there were privacy concerns but they did not know how much of their data was being sent to developers whom they had no relationship with”.
241.People also need to be aware of the rights they have over how their personal data is used, and of what to do when they want their data to be removed. Social media companies do not make it easy for their users to control their own data. Privacy controls are hard to find, and there is no simple explanation of how users can look after their data and their privacy. The terms of Facebook’s data and cookies policy, for example, presented a large button for accepting Facebook’s ‘updated Terms to continue using Facebook’. If users did not want to accept the Terms, they had to follow a small link, ‘see your options’, which allowed them to delete their account.
242.The Information Commissioner’s Office is planning to work with the Electoral Commission, the Cabinet Office, and political parties to launch a “Your Data Matters” campaign, before the next General Election, with the aim “to increase transparency and build trust and confidence amongst the electorate on how their personal data is being used during political campaigns”. We hope that this campaign will be proactive in telling people about their own data, and how they should share it, and their rights over their data.
243.The Education Policy Institute reports that 95% of 15-year-olds in the UK use social media before or after school, and that half of 9 to 16-year-olds use smartphones daily. From an early age, young children are growing up with digital devices. Our education system should be equipping children with the necessary tools to live in our digital world, so that their mental health, emotional well-being and faculty for critical thinking are protected. They need to be aware of the issues surrounding social media, and of the consequences of their actions when interacting in digital arenas. Finding ways to involve parents and carers is equally important.
244.Our schools play a crucial role in helping students to differentiate between fact and fiction, and there are various initiatives to tackle the growing issue of the use of social media by children and young adults. The PSHE Association recommended that the secondary school Personal, Social, Health and Economic (PSHE) curriculum should cover the issues that young people are concerned about online, including compulsive use, data gathering and body image. The Times and The Sunday Times have recently launched a media literacy scheme in schools, to help pupils learn how to spot ‘fake news’. The scheme will be available to pupils in secondary schools, colleges and sixth forms, and is run in partnership with News UK’s News Academy.
245.In a letter sent to social media companies in April 2018, the then Secretary of State for Health, Rt Hon Jeremy Hunt MP, warned those companies that they needed to ensure the protection of children’s mental health from the dangers of social media, and discussed the possibility of introducing legislation for social media platforms, to curb the dangers of cyber-bullying of young adults. California’s Consumer Privacy Act of 2018 will establish special protections for children under the age of sixteen, including independent reviews, age ratings and other guidance to help children and their parents to navigate the world of social media.
246.We recommend that the Government put forward proposals in its White Paper for an educational levy to be raised from social media companies, to finance a comprehensive, online-based educational framework, developed by charities and non-governmental organisations. Digital literacy should be the fourth pillar of education, alongside reading, writing and maths. DCMS should co-ordinate with the Department for Education in highlighting proposals to include digital literacy as part of the Personal, Social, Health and Economic (PSHE) curriculum. The social media educational levy should be used, in part, by the Government to finance this additional part of the curriculum.
247.There should be a unified public awareness initiative, supported by DCMS, the Department of Health, and the Department for Education, with additional information and guidance from the Information Commissioner’s Office and the Electoral Commission, and funded in part by the tech company levy. Such an initiative would set the context of social media content, explain to people what their rights over their data are within the context of current legislation, and set out ways in which people can interact with political campaigning on social media. This initiative should be a rolling programme, and not one that occurs only before general elections or referenda.
248.The public should be made more aware of their ability to report digital campaigning that they think is misleading, or unlawful. We look forward to the work that the Electoral Commission is planning, to bring this to the fore.
Published: 29 July 2018