354.A country’s education system needs to prepare its people for their role as citizens. In the digital world, this means they need to be empowered to be critical, digitally literate consumers of information. In this Chapter we recommend that, in order to secure democracy, people of all ages need to be taught critical digital media literacy skills suitable for a digital age. Some responsibility, however, must also lie with social media organisations, which should ensure their products are accessible and understandable to the public. Oversight is needed to hold platforms to that standard, while digital literacy is necessary to empower individuals to thrive in a digital world.
355.Part of preparing children for their role in democratic society is understanding how the country is governed through a serious commitment to civic education. Democracy Club’s evidence suggested that many adults do not understand the UK’s electoral system, with people asking why Jeremy Corbyn or Theresa May were not listed on their ballot paper in the 2017 General Election.
356.Civic competencies, like participating in democratic systems, are affected by developments in the digital world. Particular knowledge and expertise are required when people engage with democratic processes online. This adds a new, digital imperative to existing debates around civic education.
357.Over the course of our inquiry, we heard various definitions of digital and media literacy. Dr Elinor Carmi, a Research Associate in Digital Culture and Society at the University of Liverpool, told us that there is no unified definition among academics. The Government’s definition of digital skills focusses on five elements: communicating, handling information, transacting, problem-solving and safety. The Government is also developing an online media literacy strategy, which should include a definition of media literacy. Our evidence consistently mentioned the need for critical questioning and critical thinking skills as key elements of any definition of, and push for, digital literacy. Angie Pitt, Director of NewsWise, a cross-curricular news literacy project for nine to 11-year-olds across the UK, told us that people may well be able to use a tablet, “but they do not have the critical skills to question how that information reaches them and how they are using it.”
We use the term ‘digital media literacy’ because our purposes go beyond, but do include, the functional skills required to use technology.
We define digital media literacy as the ability to distinguish fact from fiction (including misinformation), to understand how digital platforms work, and to exercise one’s voice and influence decision makers in a digital context.
358.The House of Lords Digital Skills Committee in 2015 advocated greater digital literacy in the UK and set as one of its objectives for a Government Digital Agenda that the population have the right skill levels to use relevant digital technologies, including a culture of learning for life, with responsibility shared between Government, industry and the individual. The Government’s response focussed mainly on the report’s recommendations around infrastructure and failed to commit to upskilling the UK population with the necessary digital skills to thrive in the modern world.
359.Too often digital media literacy is confused with computer science. Having the prerequisite skills to understand how digital technologies function is essential to a solid digital media literacy education, but it is not enough to be merely a technically proficient consumer of digital technologies.
360.When asked about how the Government were equipping teachers to effectively teach digital media literacy, Nick Gibb MP, Minister of State for School Standards, told us that they had established the National Centre for Computing Education. Michelle Dyson from the Department for Education highlighted the Government’s computer science programme, which aims to train one teacher in every secondary school, and its inclusion in the Government’s relationships and sex education curriculum, rather than specifically addressing digital media literacy. The Government’s evidence stated that it was “taking action to help people attain the digital skills needed to fully participate and thrive in an increasingly digital world.” This included introducing digital literacy as a core part of the national curriculum and publishing a media literacy strategy in 2020. As yet, no media literacy strategy has been forthcoming. The Government also focussed on the reformulation of the computing curriculum, which covers the principles of e-safety. Below we have laid out where digital media literacy is presently taught in the curriculum in England.
361.As can be seen in Table 1, digital media literacy is primarily taught in computing, citizenship and relationships education. Liz Moorse from the Association of Citizenship Teaching told us that “the Government have made a shambolic mess of the relationships and sex education framework and tried to shoehorn something in that probably does not belong.” Ms Moorse also told us that subjects such as citizenship are neglected in terms of funding, adequate Continuous Professional Development (CPD) for teachers and expertise at Ofsted. Jonathan Baggaley, CEO of the PSHE Association, also told us that whilst the statutory status for PSHE was welcome, more needed to be done by the Government to encourage schools to prioritise it.
Understand computer networks, including the internet; how they can provide multiple services, such as the World Wide Web, and the opportunities they offer for communication and collaboration.
Use search technologies effectively, appreciate how results are selected and ranked, and be discerning in evaluating digital content.
Explore how the media present information.
Pupils should know about online relationships, including how to be safe online and how information and data is shared and used online.
Relationships and Sex Education
Pupils should be taught the rules and principles for keeping safe online. This will include how to recognise risks, harmful content and contact, and how and to whom to report issues. Pupils should have a strong understanding of how data is generated, collected, shared and used online, for example, how personal data is captured on social media or understanding the way that businesses may exploit the data available to them.
362.The Government’s focus on computing education is insufficient; basic digital skills are not enough to create savvy citizens for the digital era. The Department for Education appears to be struggling to anticipate the implications of the technological challenges of the 21st century. The 5Rights Foundation recommends that children must understand the purposes of the technology they use, have a critical understanding of the content it delivers, have the skills and competencies to participate creatively, and have a reasonable, age-appropriate understanding of potential outcomes, including harms. We believe that the requirement for these skills should extend to adults.
363.The focus on computer science rather than critical digital media literacy skills matters because we received numerous pieces of evidence suggesting that insufficient progress had been made on improving digital media literacy in the UK. The Digital Life Skills Company stated that only two per cent of children have the skills needed to critically evaluate news and that young people do not realise that YouTube does not fact-check its content or always remove inaccurate content. Angie Pitt said that while we call primary school children ‘digital natives’, she would warn against calling them “critical digital natives” because they do not all have the critical literacy skills needed to be discerning users of digital technologies. 5Rights Foundation stated that children are disproportionately affected by issues such as fraud, data protection and online targeting, given their developmental vulnerabilities and their status as ‘early adopters’ of emerging technologies.
364.The Commission on Fake News and the Teaching of Critical Literacy Skills in Schools, a joint venture between the All Party Parliamentary Group on Literacy and National Literacy Trust, found that in 2018 in the UK, 54 per cent of 12 to 15-year-olds use social media to access online news and 46 per cent of those who source news in this way say they find it difficult to tell whether or not a social media news story is true.
365.A lack of adequate digital media literacy is not, however, seen only in children. Many adults also lack the ability to evaluate critically what they see online. Ofcom’s Adult Media Lives 2019 report found that there had been little change in critical awareness in the past few years, with many still lacking the skills needed to identify when they are being advertised to online. Ofcom also found that the percentage of those who do not use the internet increases as people get older, with 33 per cent of 65 to 74-year-olds and 48 per cent of those aged 75 and over not using the internet. Care England and the National Pensioners Convention both raised the issue of older people finding it harder to take part in online debate. Index on Censorship, and Dr Ana Langer and Dr Luke Temple (from the Universities of Glasgow and Sheffield respectively), stated that those over the age of 65 are most likely to share fake news and should be a focus of concern for digital literacy efforts. Helen Milner, Group Chief Executive of Good Things Foundation, told us that adults who lack digital literacy skills have low learning confidence and that part of the challenge is how to build up their confidence and resilience.
366.Other evidence we received linked social exclusion to digital exclusion. Dr Elinor Carmi told us that her research showed that: “The lower your income, the more limited your use and understanding are.” MySociety suggested that the most economically disadvantaged do not have the necessary digital skills to engage online and that rural areas are held back by a lack of a good internet connection. Good Things Foundation stated that individuals without digital skills are most likely to be from socially excluded groups with a lack of digital skills being far more common in lower socioeconomic groups. National Literacy Trust suggested that adults with lower levels of education stand to benefit most from increased support and confidence in news literacy. Professor Sonia Livingstone, Chair of the LSE T3 Commission, sees reaching adults not in education or training as one of the key educational challenges in media literacy.
367.The reason why a lack of digital media literacy is so concerning is, as Will Moy, Chief Executive of Full Fact, put it, “Bad information can ruin lives. It damages people’s health. It promotes hate and it hurts democracy.” A pertinent example of misinformation and why digital media literacy is so crucial has been brought home by the COVID-19 pandemic. Misinformation has come in a myriad of forms online, including: supposed at-home cures like ‘avoiding ice cream’ and ‘drinking silver’; that COVID-19 was created and weaponised by the West; and linking 5G technology to the virus outbreak. Lisa-Maria Neudert offered us this example of how disinformation can cause real-world harm:
“As a recent example, the World Health Organization is currently co-operating with Google to bring out public health information on conspiracy theories and disinformation about coronavirus. The most popular theories that have been spreading are that you can vaccinate yourself against coronavirus by inhaling sulphurous fumes from fireworks and that you can use garlic to protect yourself from coronavirus, which is obviously very wrong. To stick with epidemiology, it is enough for one person to believe that. If one person thinks, “I can actually vaccinate myself by inhaling a firework”, there will be a terrible health effect no matter what. If that person then contracts coronavirus, because he thinks he cannot get it, and spreads it, we have a real-world impact from a piece of disinformation. In this analogy, we have a disease that is arguably spreading just as quickly as disinformation.”
368.The UK ranked twelfth out of 35 countries across wider Europe at promoting societal resilience to disinformation activities, according to the Open Society Institute Media Literacy Index. Throughout our inquiry, we heard positive examples from abroad, particularly from the Baltic countries, as to how digital media literacy can be promoted to citizens. Liz Moorse directed us towards the example of Finland. She explained that the Finnish government: “have worked for some decades to make sure that democracy education, citizenship and media literacy are part of every child’s education. They have put resources into it and trained teachers.”
369.We heard from Siim Kumpas, Adviser to the Government Office of Estonia, about his country’s successful attempts to improve the quality of their digital media literacy. He told us that an awareness among the public of the threat level was a “prerequisite” to any work in building societal resilience through digital literacy. Mr Kumpas stated that digital skills education starts in kindergarten with basic education, and media literacy is ingrained in other subjects. Estonia also has a 35-hour compulsory course called ‘media and manipulation’ which gives high school students a basic understanding of the role media and journalism play, and how they work.
Estonia has its own national definition of ‘digital competence’ alongside the EU-wide definition (DIGCOMP) and sets out standards of digital competence for teachers. Estonia’s desired ‘digital competence’ is defined as: the ability to use developing digital technology in a quickly changing society; using digital means for finding and preserving information and evaluating the relevance and trustworthiness of that information; participating in creating digital content; using suitable digital tools and methods for solving problems, communicating and cooperating in different digital environments; awareness of online dangers and knowing how to protect one’s privacy, personal information and digital identity; and following the same moral principles as offline.
Estonia is one of only three EU states where digital competence frameworks must be taken into consideration while developing Initial Teacher Education programmes.
Estonia’s education strategy, the Lifelong Learning Strategy 2020, sets “A digital focus in lifelong learning” as one of five key policy aims.
The government of Estonia co-founded a programme which, as of February 2019, had supplied 44 per cent of kindergartens with IT and programming equipment and training. Estonia is one of only five EU states to apply assessment criteria in digital competences at both primary and secondary education level. Estonia monitors and evaluates its strategies for digital education at school level at regular intervals, one of only eight EU states (excluding Finland) to do so.
Finland has adopted the EU-wide definition of digital competence (DIGCOMP), which includes: browsing, searching and filtering information; engaging in online citizenship and collaborating through digital channels; netiquette and managing digital identity; developing content and programming; protecting personal data, health and the environment; identifying needs and technological responses; and innovating and creatively using technology.
Finland has no digital competence framework for teachers but promotes the use of self-assessment tools for teachers to evaluate their level of digital competence and thereby define their development needs.
In 2013, Finland’s Ministry of Education and Culture prepared policy guidelines to promote media literacy among children and adolescents. The National Core Curricula for Pre-Primary Education and Basic Education include ICT competence.
Finland does not apply assessment criteria in digital competences at primary or secondary level.
Finland’s curriculum seeks to tackle hostile online activity by pursuing a broader media literacy, rather than a specifically ‘digital’ one, and frames media literacy as a civic competence.
The Ministry of Culture deems media literacy to include: the traditional ability to read and write text; critical literacy, digital literacy and data literacy; emotional skills, social skills and empathy skills; competence in issues related to ethics and morality; media criticism and source assessment; and safety-related issues such as data security and privacy, cyber security and grooming.
Finland monitors and evaluates its strategies for digital education at school level on an ad hoc basis.
370.Elisabeth Braw, Director of the Royal United Services Institute’s (RUSI) Modern Deterrence Project, pointed out that in Latvia all high schools teach a national security curriculum in which children are taught about the threats facing the country, including online threats. She suggested that this was important because: “If we are not taught about that as citizens, our instinct will always be to think that the Government can somehow put up a more powerful or larger umbrella over us, so that we do not have to worry about this or that threat.” Similarly, Ben Scott, Director of Policy and Advocacy at Luminate, pointed out that the government of Finland ran a public service campaign stating that it was the patriotic duty of every Finnish citizen to tell the difference between truth and falsehood online, because of the threat from Russia. Sir Julian King, former EU Security Commissioner, told us that Finland, Denmark, Sweden and Estonia were the countries that best achieved societal resilience to disinformation through education. We note that the curriculum requirements outlined in Table 1 are not as robust, nor as far-reaching, as those from Estonia and Finland outlined in Table 2. This is symptomatic of a chronic lack of ambition by the UK in this regard.
371.We understand that in these countries, because of the geopolitical context, the issue of having a digitally literate citizenry that can identify disinformation is of existential importance to their democracy. We, of course, recognise that Estonia and Finland have a combined population smaller than that of London, and that the challenges of scaling such attempts to a country such as the UK will be significant. However, we believe that the UK Government should nevertheless look to and learn from examples of excellent digital media literacy strategies, wherever they are to be found.
372.We heard about the fragmented nature of the provision of digital media literacy teaching and resources. Full Fact highlighted the need to map, and make coherent, the currently fragmented digital literacy schemes initiated by Government, for-profit and non-profit organisations, in order to identify and share best practice. This was seen as a cost-effective way of improving digital literacy.
373.A number of existing initiatives aiming to raise digital literacy are being shaped and promoted by technology platforms. Google highlighted the work they do to support digital literacy. For example, they run a programme with the Institute for Strategic Dialogue which encourages young people aged 13-15 to have a positive voice online and provides training in social media and critical thinking. Google also fund the Stanford History Education Group’s online digital literacy content which seeks to teach young people to read laterally by searching for more information about the sources they find online. Facebook listed numerous initiatives and measures they have in place to improve media literacy.
374.Prior to the COVID-19 pandemic, less than half of England’s teachers reported frequently using ICT for classwork, and just over a quarter had not covered the use of ICT for teaching in their formal training. The crisis has demonstrated the need for better pedagogic training in using technology for teaching in general, and in digital media literacy in particular. The Sutton Trust found wide gaps and variability in the provision of online learning, with poorer students less likely to have access to some types of provision. It stated that supporting more teachers to deliver online content was an immediate challenge, and that students in schools with greater deprivation were less likely to have access to more intensive approaches such as recorded or live online classes. Secondly, as discussed earlier in this Report, online misinformation about COVID-19 has resulted in real-world harm and has made clear that it is vital that citizens of all ages are able to identify and evaluate trustworthy and credible content.
375.We consider it admirable that companies such as Google and Facebook are aiding media literacy initiatives. The digital media literacy organisations that we heard from explained how tight budgets were, and certainly global tech companies have the resources to help such smaller organisations. However, we have significant concerns that technology companies are setting the terms of media literacy, omitting the content that explains their business models and critiques their practices. What is most concerning is the absence of governmental leadership in shaping and designing digital media literacy programmes and taking them to scale; the Government would appear to have abandoned its role in this partnership.
376.We heard that Government was best placed to take the lead on unifying these approaches. Full Fact suggested that there was a role for Government, or for a regulator like Ofcom, which has a statutory responsibility to promote media literacy, in co-ordinating initiatives to share best practice between providers. It should be noted that the ICO has a responsibility for data literacy. The LSE T3 Commission called for Government to mobilise an urgent, integrated new programme in media literacy for children in schools and adults in further and vocational education, as well as for parents, teachers and the children’s workforce. As Professor Rasmus Kleis Nielsen, Director of the Reuters Institute, told us:
“the Government are the one actor that could make a difference. I do not see that it could come from civil society organically or through the competition between different for-profit businesses alone.”
377.Within Government, it is unclear where responsibility for digital literacy falls. The need for cross-departmental collaboration and communication is evident. Helen Milner of Good Things Foundation argued that responsibility and interest from government was diffuse among departments. She stated that part of the problem was that her organisation worked with six different government departments, each with a slightly different digital literacy focus.
378.The Department for Digital, Culture, Media and Sport (DCMS) oversees digital policy and is the lead department on the Online Harms White Paper, one of the major strands of which is to improve digital and media literacy. However, the Department for Education administers education policy in England, with devolved administrations taking responsibility in the other nations of the UK, and this includes the way in which digital literacy is incorporated into the school curriculum. The difference in approaches can be seen in the variation between the list of online harms in the Department for Education’s guidance for teachers on teaching online safety and the list of online harms in the White Paper owned by DCMS. Whilst there is a great deal of overlap, the two lists reflect different priorities, with DCMS focusing on more concrete harms and the teaching guidance prioritising the general mental wellbeing of young people. It is not clear to what extent the Department for Education is contributing to the regulation of platforms proposed in the White Paper with a view to making them more understandable, or to what extent DCMS is feeding into digital literacy programmes based on its proposed regulation of online platforms. Neither takes a holistic approach.
379.It is similarly unclear which part of Government leads on adult digital literacy. The Online Harms White Paper calls for more messaging and resources for adults, but the only actor identified in the White Paper is Ofcom, which is assessing existing research to help policy makers identify gaps and opportunities for future action. How these assessments have influenced government thinking is also unclear.
380.Ofcom has a statutory duty to promote media literacy under Section 11 of the Communications Act 2003. Ofcom interprets this as providing an evidence base of UK adults’ and children’s understanding and use of electronic media, and sharing this evidence base with stakeholders. Ofcom does not appear to run any digital media literacy programmes, although Tony Close, then Ofcom Director of Content Standards, Licensing and Enforcement, highlighted the Making Sense of Media programme, which aims to bring together organisations working in the digital literacy space. In response to the COVID-19 pandemic, the Making Sense of Media panel collected a set of resources to provide people with useful tools to navigate news and information about the virus, including debunking misinformation and seeking out reliable content.
381.The White Paper also identifies digital literacy as an area that the new Online Harms Regulator will cover. A previous public information campaign designed to reduce the spread of misinformation was disseminated by the Cabinet Office and appears to have been funded from its budget. Caroline Dinenage MP, Minister for Culture and Digital, told us that DCMS leads on adult media literacy and digital policy, but that it worked closely with the Department for Education. She stated that the Government had conducted an analysis of adult media literacy initiatives, which showed that a large proportion of initiatives focused on children and their parents, with fewer targeted at adults.
382.Many bodies have called for the various media literacy initiatives to be made more cohesive. The Cairncross Review into the sustainability of journalism in the UK recommended that the Government should develop a media literacy strategy, working with Ofcom, the online platforms, news publishers and broadcasters, voluntary organisations and academics, to identify gaps in provision and opportunities for more collaborative working.
383.Channel 4 also called for these initiatives to be brought together: “What is most crucial is policymakers encourage the coordination of all existing media literacy and digital skills activities into a cohesive strategy to have the greatest impact. Whilst there are myriad of industry-funded civil society programmes, there is a need for greater emphasis and support on the organisations and educational materials that have a proven impact e.g. the work of Media Smart and Internet Matters.”
384.The House of Lords Digital Skills Committee in 2015 stated that the Government has a responsibility to accelerate the attainment of digital literacy across the population and that the Government was responsible for ensuring the UK’s population keeps pace with the best in the world.
385.We would certainly add our voice to these calls for a large-scale evaluation of the landscape. However, we recommend that the Department for Education should take responsibility and lead on this with Ofcom. We also believe that much can be learned from what works internationally. Additionally, we think it is vital to place a clear time limit on our recommendation, as we believe enough recommendations have already been made to the Government for it to take this issue seriously, and act with urgency. This exercise is only made more urgent by the COVID-19 pandemic, which saw misinformation posing a threat to people of all ages.
386.Ofsted, in partnership with the Department for Education, Ofcom, the ICO and subject associations, should commission a large-scale programme of evaluation of digital media literacy initiatives. This should:
387.We heard that part of the issue was that teachers did not feel empowered to teach digital media literacy skills, nor did they have the resources and curricular time to do so. The Digital Life Skills Company found that teachers lack the skills necessary to help children better understand the online world, tending towards an ‘avoidance’ strategy when it comes to digital information.
388.We heard consistently that a cross-curricular approach was best, and that digital media literacy could and should be embedded into most subjects. National Literacy Trust stated that critical literacy was taught most effectively through a whole-school, cross-curricular approach. They stated that many skills were already included within several programmes of study and therefore did not advocate for a specific curriculum change. Stanford History Education Group highlighted that the effects of digital literacy interventions will be minimal as long as they are seen as an add-on to the regular school curriculum. They stressed that digital literacy initiatives must be embedded into other subjects and across the curriculum.
389.Ben Scott from Luminate told us that integrating digital literacy into existing curricula was more successful and popular with teachers and students, because they were not asked to digest new material from scratch, unrelated to anything they had taught before.
390.Countries that have been highlighted as examples of good media literacy education incorporate digital media literacy across the curriculum. In Finland, for example, the curriculum for Upper Secondary Education includes cross-curricular themes in multiliteracy and media as well as technology and society. Competencies related to media and information literacy are included across different subjects. Siim Kumpas told us that in Estonia, in addition to a stand-alone course, media literacy is embedded in other subjects. He also told us that the Estonian universities that train teachers are obliged, by contract with the Government, to include elements of digital competence in all their training programmes.
391.Liz Moorse from the Association of Citizenship Teaching highlighted the Finnish example, stressing that “every teacher in a Finnish school understands democracy education and citizenship education.” We spoke to six teachers convened by the Politics Project, who told us that whilst they already had significant pressures placed on them, they would value more time to teach political and media literacy education; some said that timetabled curriculum lessons would be helpful. Among the teachers we spoke to, there was a keenness to do more, but a significant limit on their capacity.
392.National Literacy Trust, in their report on fake news and critical literacy, argue that a whole-school approach is essential to embedding critical literacy across the curriculum. However, they also argue that teachers and schools must be provided with the necessary CPD and resources to enable them to teach critical literacy actively and explicitly within the teaching of any and every subject. We concur. We understand that incorporating digital literacy across the curriculum may place extra strain on teachers, and we appreciate that they also deal with social problems brought about by technology. They must be given support to embed this into their subjects effectively.
393.The House of Commons DCMS Committee in 2019 called for digital literacy to be the “fourth pillar of education alongside reading, writing and maths,” but the Government rejected the recommendation, responding that “digital literacy is already taught across the national school curriculum.” We support the DCMS Committee’s recommendation and find the Government’s response to be particularly tepid. We have seen that the Government often refers to computer science when discussing digital media literacy. The evidence collected by the Joint Council on Qualifications (JCQ), below, shows that very few pupils take computing at GCSE and even fewer at A-Level, which confirms that relying on computing education as a vehicle for digital media literacy is insufficient, particularly where the gender disparity is so wide.
In 2019, 80,027 pupils took computing at GCSE. The subject remains dominated by boys, who make up 78.6 per cent of entries.
The number of pupils who took computing at A-level was 11,124 in 2019, with only 1,475 girls choosing the subject.
The Department for Education estimates that between 684,000 and 693,000 pupils took GCSEs in 2019, meaning that only around 12 per cent of pupils took Computing GCSE.
394.It is not this Committee’s place, or aim, to re-organise the education system. However, better digital media literacy should be placed in the context of the need for a wider change in education in response to the influence and use of digital technology. When we asked civil servants how the Government planned to respond to these changes, Michelle Dyson from the Department for Education told us that the Government’s “big computer science programme … aims to train one teacher in every secondary school… both in subject content and pedagogy”. We regard this as an underwhelming response, demonstrating a lack of understanding within the Department of what kind of investment and additional commitment is needed to bring about change. We remain sceptical as to whether the Government fully understands the critical ways in which digital media literacy and technical computing skills differ.
395.The Department for Education should review the school curriculum to ensure that pupils are equipped with all the skills needed in a modern digital world. Critical digital media literacy should be embedded across the wider curriculum based on the lessons learned from the review of initiatives recommended above. All teachers will need support through CPD to achieve this.
396.We also recognise that responsibility for aiding public understanding of online technologies lies with the online organisations themselves. Ed Humpherson from the UK Statistics Authority put it in these terms: “you can only be literate with something readable.” Professor Sonia Livingstone from the LSE has called media literacy the ‘policy of last resort’, stating:
“we cannot teach what is unlearnable, and people cannot learn to be literate in what is illegible … we cannot teach people data literacy without transparency, or what to trust without authoritative markers of authenticity and expertise. So people’s media literacy depends on how their digital environment has been designed and regulated.”
397.Lisa-Maria Neudert from the Oxford Technology and Elections Commission pointed out that digital literacy initiatives place a large onus on the citizen. Looking at the current digital landscape, she argued that much of the malicious material was very sophisticated and could take expert fact-checkers several hours or even days to identify whether something was fake or artificially generated. This implies that it is unfair and unrealistic to expect the average user to do the same.
398.Dr Elinor Carmi told us that it was important for people to understand the online ecosystem and how different platforms were funded. For example, research by Ofcom has consistently shown that most people do not understand that Google and Facebook are advertising companies, and that this affects the way in which they show you information. She told us that platforms use dark patterns in their interfaces, designed to deceive people into making choices that are not necessarily in their interests. Companies often make compliance and reporting procedures obscure and inaccessible. For example, placing information about legal compliance on data protection and their own terms of service as far away from the user as possible enhances information asymmetries between the user and the platform.
399.Part of increasing public trust and understanding is ensuring transparency about how personal data is used on websites. Platforms should make this more understandable as part of their duty of care to users. The CMA, in its Online platforms and digital advertising interim report, suggested a fairness by design duty. This would oblige platforms to ensure fairness in the design of data collection processes and would allow early intervention by a regulator to ensure that the duty is adhered to at the design stage. It is revealing that the CMA stated: “we were surprised to find out how little testing is done by platforms in relation to consumer control over data and use of privacy settings, which stands in stark contrast to the very extensive trialling done on a daily basis in other parts of the business.” The CMA suggested that the regulator would set the high-level basis of compliance with this principle. An optional ‘engagement and understanding’ element would seek to ensure that customers understand and are comfortable with the options available to them on an ongoing basis, and could include a requirement to help educate consumers about the use of their data in a manner agreed by the appropriate regulator. We discussed these issues in greater detail in Chapter 4 on transparency.
400.Platforms also have obligations under the General Data Protection Regulation and Data Protection Act 2018 to ensure public understanding of their processes, yet the Centre for Data Ethics and Innovation (CDEI) Review of Online Targeting found that only 36 per cent of people surveyed believed they have meaningful control over online targeting systems. They also found that there was a broad consensus that people should be considered responsible for their online behaviour, but that for this to work they needed to be genuinely empowered to understand and control their experience. Crucially, the CDEI also found that the people who participated in their ‘public dialogue’ activities agreed that significant changes were required in the design of online services and the information and controls afforded to users. These participants also thought that it was critical for the Government to direct online platforms to change and to scrutinise and enforce this work, as they did not trust online platforms to act in the interests of individual users or society more widely.
401.We recommend that Ofcom should require large platforms to user test all major design changes to ensure that they increase rather than decrease informed user choices. Ofcom should help devise the criteria for this testing and review the results. There should be genuine and easily understandable options for people to choose how their data is used.
402.In its research, Doteveryone found from a survey of 2,157 individuals that 50 per cent believed that part and parcel of being online was that people would try to cheat or harm them in some way. They described a sense of powerlessness and resignation in relation to online services, with significant minorities saying that it does not matter whether they trust organisations with their data because they have to use them, and that they have to sign up to services online even if they have concerns about the terms and conditions. Strikingly, Doteveryone found that two fifths of the public disagreed with the notion that companies design their products and services with their best interests in mind.
403.It is clear that platforms need to be more transparent and easier to understand but have not been given clear guidelines of what that should look like in practice. We believe that the CDEI, as an advisory body that looks to maximise the benefits of data-driven technologies, is best placed to conduct this research. Their review should focus on how best to explain the ways in which individuals have been targeted and should keep accessibility to the public front and centre.
404.Dr Elinor Carmi told us that the Government should not silo its education policy away from its broader approach to digital. We agree, and we believe it is important that pupils are taught why and how they have been targeted and what rights they have over their data online. This work on transparency should therefore feed into the review of the curriculum recommended above.
405.The CDEI should conduct a review of the implications of platform design for users, focusing on determining best practice in explaining how individual pieces of content have been targeted at a user. Ofcom should use this to form the basis of a code of practice on design transparency. This should feed into the Department for Education’s review of the curriculum so that citizens are taught what to expect from user transparency on platforms.
406.Content can be very difficult to evaluate critically for authenticity and trustworthiness when it is posted by an anonymous account. Anonymity has traditionally been seen as a core part of the internet; however, there is an argument that anonymity means people cannot be held to account for disinformation or malicious content that they post. Given that much democratic discussion occurs online, is it right that anonymous users are able to participate? We have received mixed evidence, which we review here.
407.Different technology platforms have different policies towards anonymity. Registering on YouTube (owned by Google) only requires an email address, not a real name. Facebook controversially has a ‘real name’ policy that requires users to provide the name they use in real life. According to Facebook, this policy means “you always know who you’re connecting with.” If Facebook believes users are not using a real name, it may ask them for ID to prove that they are who they claim to be. On Twitter, an account can represent whatever the user decides. Twitter requires users to provide either an email or a phone number upon signing up but does not require both. YouTube has a ‘three strikes and you’re out’ policy for impersonation: content intended to impersonate a person or channel for the first time receives a warning with no penalty to the channel. If it is the second time, a ‘strike’ is issued against the channel. The channel is terminated if three of these strikes are received (effectively meaning impersonation can occur on multiple occasions). Google can disable accounts whose users appear not to be old enough (the minimum age is 13 in the UK but varies between countries) and requires users whose accounts have been disabled due to age restrictions to confirm their age through a copy of government-issued ID or a credit card. Twitter’s policy on authenticity requires that users do not use Twitter to amplify artificially or suppress information or manipulate or disrupt others. Twitter also forbids interfering with elections or impersonating others.
408.We heard from Baroness O’Neill of Bengarve that there should not be a right to take part in democratic discussions anonymously. She told us:
“The question of when anonymity is needed is highly contextual. It is sometimes needed, but in my view one of the places where it is not needed is in exercising civic rights. As a citizen, I do not stand behind a hedge and throw stones; I stand in the public square and speak.”
409.However, Katy Minshall from Twitter told us that allowing pseudonymity on Twitter enabled people to speak out. She used the example of the account @thegayfootballer that sparked a wider conversation about homophobia in football and argued that allowing people to be anonymous encouraged people to share stories and experiences that they may not feel comfortable sharing under their real name.
410.Professor Derek McAuley and his colleagues from the University of Nottingham Horizon Digital Economy Research Institute suggested that despite the potential downsides of maintaining full anonymity online, policymakers should take care in this area in order to avoid a chilling effect on the use of online services. They argued that a ‘real-name’ policy may present significant threats to the confidentiality of communications; that bad actors are likely to circumvent such a policy with false identities; and that it could expose vulnerable groups to harmful retaliation. Index on Censorship pointed out that anonymity can be important for human rights defenders working under repressive regimes, and for members of minority groups and journalists.
411.Full Fact similarly urged us to consider that anonymity and the ability to communicate via encrypted messages are relied upon by many, such as whistleblowers and those in less democratic regimes. They also highlighted that what the UK does in this space will be watched closely elsewhere in the world.
412.We do not advocate that only people’s real names can be used on platforms, but we do believe it is important that people should retain the ability to identify disinformation promoted by fake accounts. In this, as in other areas discussed in this Chapter, platforms could boost free expression and reduce harms by doing more to empower their users; it is a false binary to suggest that the only choice is between anonymity and real names. Twitter, for example, already empowers its users not to receive notifications from those who have not confirmed their phone numbers or who use the default anonymous profile photo.
413.The Government’s Verify service, which securely allows people to verify their identity and is used to access government services such as filing taxes or checking driving licence information, could be a resource to help build on this approach, allowing users to verify their identities. Another approach would be to visually identify users who had securely verified their accounts. Twitter, Facebook, Instagram and YouTube all have some form of verified user programme denoted with a checkmark next to a user’s name. Platforms could develop a similar feature which identified to users if another user had authenticated their account through a service like Verify.
414.When we asked the Government whether they had considered using the Verify service to allow users to confirm their identity, Caroline Dinenage MP, Minister for Culture and Digital, told us that they are running a pilot that would allow private sector organisations to use a document-checking service that is a component of the Gov.UK Verify service. She also told us that over the next 18 months, the Government was replacing Verify with a private sector-led digital identity market, which will make it possible for people to confirm their identity without having to show paper documents.
415.A substantial amount of content on digital platforms is posted by anonymous users who may not be genuine users at all. Indeed, it is sometimes posted by bad actors using sophisticated techniques to spread misinformation and abuse, and to undermine democratic debate. In general, we believe there should be a presumption against anonymity on digital media; however, we recognise that for many this is not possible.
416.Anonymity can be important to protect freedoms, for example where people from ethnic minority groups want to have a voice in debates but are afraid of retaliation or abuse, where LGBT+ people may not be ready to come out or live in jurisdictions where homosexuality is criminalised, or where journalists and citizens are living in autocratic regimes. However, there is a significant proportion of those who use anonymity who use it to abuse, to troll, to silence alternative views, or to spread hate.
417.We recognise that in a perfect world there would be no need for anonymity, but as it stands there remain many legitimate reasons for hiding an online identity. A great deal of abuse could be dealt with by robust application of platforms’ rules; by swift, consistent and transparent moderation, as set out in Chapter 4; and by platforms being held responsible for the content they recommend, as set out in Chapter 3.
418.Users should be empowered to verify themselves; those who wish to be anonymous should remain so, but other users should be able to filter anonymous users out.
419.Ofcom should work with platforms and the Government’s Verify service, or its replacement, to enable platforms to allow users to verify their identities in a way that protects their privacy. Ofcom should encourage platforms to empower users with tools to remove unverified users from their conversations and more easily identify genuine users.
501 Written evidence from Democracy Club ()
502 (Dr Elinor Carmi)
503 Department for Education, Essential digital skills framework (April 2019) [accessed 8 April 2020]
504 Written evidence from HM Government ()
505 (Angie Pitt)
506 (Nick Gibb MP)
507 Written evidence from HM Government ()
508 (Liz Moorse)
509 (Liz Moorse, Jonathan Baggaley)
510 Key Stage 1 refers to pupils aged 5-7, Key Stage 2 refers to children age 7-11, Key Stage 3 refers to children age 11-14.
511 Relationships Education applies to all schools providing primary education; Relationships and Sex Education applies to all schools providing secondary education.
512 Written evidence from 5Rights Foundation ()
513 Written evidence from The Digital Life Skills Company ()
514 (Angie Pitt)
515 Written evidence from 5Rights Foundation ()
516 National Literacy Trust, Fake news and critical literacy: The final report of the Commission on Fake News and the Teaching of Critical Literacy in Schools (June 2018) p 5-6: [accessed 8 April 2020]
517 Ofcom, ‘Adults: Media use and attitudes report 2019’ (May 2019) p 1-3: [accessed 8 April 2020]
518 Written evidence from Care England () and National Pensioners Convention ()
519 Written evidence from Index on Censorship () and University of Sheffield ()
520 (Helen Milner)
521 (Dr Elinor Carmi)
522 Written evidence from MySociety ()
523 Written evidence from Good Things Foundation ()
524 Written evidence from National Literacy Trust ()
525 Sonia Livingstone, ‘Media literacy: what are the challenges and how can we move towards a solution?’, LSE blog (13 March 2019): [accessed 9 April 2020]
526 (Will Moy)
527 BBC News, ‘Coronavirus: The fake health advice you should ignore’ (8 March 2020): [accessed 13 April 2020]
528 ‘EU warns of pro-Kremlin disinformation campaign on coronavirus’, Financial Times, (17 March 2020): [accessed 13 April 2020]
529 Bloomberg, ‘5G Virus Conspiracy Theory Fueled by Coordinated Effort’ (9 April 2020): [accessed 13 April 2020]
530 (Lisa-Maria Neudert)
531 (Sir Julian King)
532 (Liz Moorse)
533 (Siim Kumpas)
534 European Commission, ‘Digital Education at School in Europe’: accessed 15 April 2020
535 European Commission, ‘EACEA National Policies Platform’: [accessed 15 April 2020]
536 Republic of Estonia Ministry of Education and Research, ‘The Estonian Lifelong Learning Strategy 2020’: [accessed 15 April 2020]
537 NESTA, ‘Digital Frontrunners Spotlight: Estonia’: [accessed 15 April 2020]
538 European Commission,
540 European Commission, ‘DIGCOMP: A Framework for Developing and Understanding Digital Competence in Europe’: accessed 15 April 2020
541 European Commission,
542 Finland Ministry of Education and Culture, Media Literacy in Finland (2019): [accessed 15 April 2020]
543 European Commission,
544 Finland Ministry of Education and Culture,
546 (Elisabeth Braw)
547 (Ben Scott)
548 (Sir Julian King)
549 Written evidence from Full Fact ()
550 Written evidence from Google ()
551 Written evidence from Facebook ()
552 TALIS – The OECD Teaching and Learning International Survey, TALIS 2018 Results, Volume II (March 2020): [accessed 1 June 2020]
553 Written evidence from Full Fact ()
554 Written evidence the LSE T3 Commission ()
555 (Professor Rasmus Kleis Nielsen)
556 (Helen Milner)
557 DCMS and Home Office, Closed consultation: Online Harms White Paper (February 2020): , [accessed 9 April 2020]
558 Ofcom, ‘About media literacy’: [accessed 9 June 2020]
559 (Tony Close)
560 Ofcom, ‘Cutting through the Covid-19 confusion’, (9 April 2020): [accessed 14 April 2020]
561 Cabinet Office, FOI329095, Freedom of Information Request (2 January 2020): [accessed 13 May 2020]
562 (Caroline Dinenage MP)
563 The Cairncross Review, A Sustainable Future for Journalism (February 2019) p 90: [accessed 9 April 2020]
564 Written evidence from Channel 4 ()
565 Written evidence from The Digital Life Skills Company ()
566 Written evidence from National Literacy Trust ()
567 Written evidence from Stanford History Education Group ()
568 (Ben Scott)
569 (Siim Kumpas)
570 For more information about the digital surgeries held with the Politics Project, see Appendix 4.
571 DCMS Committee, (Eighth report, Session 2017–19, HC 1791)
572 DCMS Committee,
573 JCQ, ‘GCSE Outcomes for key grades for UK, England, Northern Ireland & Wales, including UK age breakdown’ (August 2019): and JCQ, ‘GCE, A level and GCE AS Level Results Summer 2019’ (August 2019): [accessed 2 June 2020]
574 Statistics provided by the Department of Education.
575 (Michelle Dyson)
576 (Ed Humpherson)
577 Sonia Livingstone, ‘Media literacy: what are the challenges and how can we move towards a solution?’, LSE blog (13 March 2019): [accessed 9 April 2020]
578 (Lisa-Maria Neudert)
579 (Dr Elinor Carmi)
580 Ben Wagner, Krisztina Rozgonyi, Marie-Therese Sekwenz, Jennifer Cobbe and Jatinder Singh, ‘Regulating Transparency? Facebook, Twitter and the German Network Enforcement Act’, (2019) p 1-11:
581 CMA, ‘Online platforms and digital advertising interim report’ (December 2019), p 257-8: [accessed 9 April 2020]
582 CDEI, ‘Review of Online Targeting: Final report and recommendations’ (February 2020) p 51, p 55: [accessed 9 April 2020]
583 Doteveryone, People, Power and Technology: The 2020 Digital Attitudes Report, (May 2020): [accessed 13 May 2020]
584 (Dr Elinor Carmi)
585 Facebook, ‘What names are allowed on Facebook?’: [accessed 13 April 2020]
586 YouTube Help, ‘Policy on impersonation’: [accessed 14 April 2020]
587 Google Account Help, ‘Frequently Asked Questions about Google Accounts and Age Requirements’: ? [accessed 14 April 2020]
588 (Baroness O’Neill of Bengarve)
589 (Katy Minshall)
590 Written evidence from the University of Nottingham ()
591 Written evidence from Index on Censorship ()
592 Written evidence from Full Fact ()