Impact of social media and screen-use on young people’s health

Conclusions and recommendations

Research on social media and screen-use

1. In order to develop a more valid and reliable understanding of the relationship between young people’s use of social media and its effects on their health, the information asymmetry between tech companies, the Government, other public bodies and bona fide researchers must be addressed swiftly. (Paragraph 29)

2. Regardless of whether Ofcom’s remit is eventually expanded to cover social media platforms, its existing obligation to collect data on the ‘media literacy’ of both adults and children (as set out in the Communications Act 2003) should be strengthened through establishing statutory information-gathering powers. Such powers should require social media companies with registered UK users to provide the regulator with the high-level data it needs to fulfil its duties with respect to media literacy, with legislation introduced in the next Session. (Paragraph 30)

3. While respecting data protection principles, social media companies should make anonymised high-level data available, for research purposes, to bona fide researchers so that a better understanding of social media’s effects on users can be established. The Government should consider what legislation needs to be in place to improve access by researchers to this type of data. (Paragraph 31)

4. We commend the Government for its efforts to think more closely about online harms and how best to address them, particularly when those harms have serious, detrimental effects on the lives of young people. While the Government has undertaken a wide-ranging consultation process through the publication of its Internet Safety Strategy Green Paper, it is disappointing that it has not sought to address the current limitations of the evidence base by actively commissioning new research. As the Government Response to its Green Paper acknowledges, the evidence on the impact of social media on mental health “is not yet conclusive”. That the field requires more robust research should not come as a surprise when the Chief Medical Officer described the evidence base, in 2013, as “sparse and contradictory”. (Paragraph 38)

5. To ensure that policy is evidence-based, and that the research needs of Government departments are met, departmental ‘Areas of Research Interest’ documents must be accompanied by periodic funding calls. Such calls need to take place ahead of an area becoming the subject of a major policy initiative. (Paragraph 39)

6. The existing Areas of Research Interest documents produced by the Department for Digital, Culture, Media and Sport and by the Department of Health and Social Care should be expanded to include how to measure and monitor the harms related to social media use. As a matter of urgency, DCMS should also commission research focused on identifying who is at risk of experiencing harm online, and why, and what the long-term consequences of that exposure are for young people. (Paragraph 40)

Risks, harms and benefits of social media and screens

7. The report of the Independent Advisory Group on Non-ionising Radiation on the ‘Health effects from Radiofrequency Electromagnetic Fields’ is now nearly seven years old. In its Response to our Report, we ask the Government to outline what assessment it has made of the quantity and quality of the research on this topic, published since 2012, and to explain whether another evidence review is now warranted. (Paragraph 64)

8. We welcome Dame Sally Davies’ work in this important area and look forward to reading the results of her review, and subsequent guidance, in due course. We note that many parents find it extremely challenging to moderate social media usage, especially where older children are involved. It would be helpful if this were recognised by those giving guidance to parents. (Paragraph 71)

9. Great strides have recently been made to address and remove content that incites terrorist activities. The same effort and determination must now be applied to curb the proliferation online of the physical, emotional and sexual abuse and exploitation of children, as a matter of urgency. The Home Secretary stated that he expects a more effective partnership between technology companies, law enforcement agencies, the charity sector and the Government to protect children from sexual abuse and exploitation online. Simply ‘expecting’ more, however, is an insufficient approach to tackle the grievous nature of the problem. It is worrying that we still do not have a good understanding of the scale of online child sexual exploitation. (Paragraph 108)

10. The Government must proactively lead the way in ensuring that an effective partnership is in place across civil society, technology companies, law enforcement, and non-governmental organisations aimed at ending child sexual exploitation (CSE) and abuse online. The Home Office should use its research budget to commission a large-scale study that establishes the scale and prevalence of CSE, which should then be updated annually. Once this has been published, we recommend that the Government set itself an ambitious target to halve reported online CSE in two years and all but eliminate it in four years. That ambition should be matched with the necessary resources, raised by the digital services tax, to make it a reality and should occur in addition to—and not instead of—establishing a legal ‘duty of care’ owed by social media companies towards their users who are under 18. Where companies are not voluntarily working with the Government and law enforcement agencies to prevent CSE, the Government should consider whether legal action is necessary. (Paragraph 109)

11. Our inquiry has illuminated the broad spectrum of benefits, risks and harms that children and young people may encounter via social media and screen-use. While social media and screen-use are not necessarily creating these risks, they have, in numerous cases, amplified them. Initiatives are in place to address some of these harms—notably around cyberbullying—yet others are falling through the cracks. A comprehensive, joined-up approach to address the plethora of negative effects is needed. (Paragraph 131)

12. Underpinning the Government’s forthcoming White Paper, and subsequent legislation, should be the principle that children must, as far as practicably possible, be protected from harm when accessing and using social media sites. All the physical and mental health harms we have outlined in this chapter—including cyberbullying, grooming, child abuse and child sexual exploitation (CSE), ‘self-generated’ images and ‘sexting’, the live streaming of CSE, violence, hate speech and pornography—should be covered. (Paragraph 132)

Resources for schools and parents

13. As children spend an increasing proportion of their life online, there is a pressing need for the education system to catch up and ensure that young people are equipped with the skills that they need to navigate, and critically assess, what they are seeing on social media and beyond. The Children and Social Work Act 2017 presents the Government with a vital opportunity to establish digital literacy and resilience as integral parts of the curriculum for primary and secondary school students, through making ‘Personal, Social, Health and Economic’ (PSHE) education mandatory. This chance must not be wasted. (Paragraph 148)

14. We recommend that ‘Personal, Social, Health and Economic’ (PSHE) education be made mandatory for primary and secondary school children in the next parliamentary session and that the PSHE curriculum delivers an age-appropriate understanding of, and resilience towards, the harms and benefits of the digital world. (Paragraph 149)

15. The Department for Education should commission research early in 2019 to evaluate existing resources on online safety and digital resilience. This should be undertaken with a view to creating guidance on, and signposting teachers towards, high-quality information and teaching resources that can be used with primary and secondary school-age children. (Paragraph 150)

16. Parental engagement can play a vital role in helping children develop ‘digital resilience’, so that they can confidently identify and judge online risks themselves. Parents, however, need high-quality support to ensure these conversations are as effective as possible. (Paragraph 155)

17. In addition to identifying the gaps in the ‘online safety information available to parents’, the Government should commission the UK Council for Child Internet Safety to produce a toolkit in 2019 for parents and caregivers. The toolkit should enable them to have an effective, open and ongoing dialogue with their children about how to recognise, manage and mitigate online risks in relation to social media. This work should complement the proposed review of existing teaching resources recommended in paragraph 150. (Paragraph 156)

18. We have heard how children bringing smartphones into schools can be both a help and a hindrance to learning. While it is right that each school should have the freedom to decide its own policy on the use of mobile phones on its premises, it is essential that schools are supported to make that choice with evidence-based guidance. (Paragraph 164)

19. We recommend that the Government’s ‘What Works Centre for Education’ evaluates the different approaches to handling smartphone use in schools so as to provide a basis for making evidence-based guidance available to both primary and secondary schools. This evaluation should be produced by the end of 2019. (Paragraph 165)

Regulation and guidance

20. In February 2018, the Prime Minister described social media as one of the “defining technologies of our age”. Like many age-defining technologies, it has brought a raft of benefits to its users, together with a host of unintended consequences, a number of which have been particularly detrimental—and in some instances, dangerous—to the wellbeing of children. Currently, there is a patchwork of regulation and legislation in place, resulting in a “standards lottery” that does little to ensure that children are as safe online as they are offline. A plethora of public and private initiatives, from digital literacy training to technology ‘solutions’, have attempted to plug the gaps. While the majority of these are to be welcomed, they can only go so far. A comprehensive regulatory framework is urgently needed: one that clearly sets out the responsibilities of social media companies towards their users, alongside a regime for upholding those responsibilities. The Government’s forthcoming Online Harms White Paper, and subsequent legislation, present a crucial opportunity to put a world-leading regulatory framework in place. Given the international nature of social media platforms, the Government should ideally work with those in other jurisdictions to develop an international approach. We are concerned, however, based on the Government Response to its Internet Safety Strategy Green Paper, that the framework may not be as coherent and joined-up as it needs to be. We recommend a package of measures in this Report to form the basis of a comprehensive regulatory framework. (Paragraph 226)

21. To ensure that the boundaries of the law are clear, and that illegal content can be identified and removed, the Government must act on the Law Commission’s findings on Abusive and Offensive Online Communication. The Government should now ask the Law Commission to produce clear recommendations on how to reform existing laws dealing with communication offences so that there is precision and clarity regarding what constitutes illegal online content and behaviour. The scope for enforcing existing laws against those who are posting illegal content must be strengthened to enable appropriate punishment, while also protecting freedom of speech. (Paragraph 227)

22. A principles-based regulatory regime for social media companies should be introduced in the forthcoming parliamentary session. The regime should apply to any site with registered UK users. One of the key principles of the regulatory regime must be to protect children from harm when accessing and using social media sites, while safeguarding freedom of speech (within the bounds of existing law). This principle should be enshrined in legislation as social media companies having a ‘duty of care’ towards their users who are under 18 to act with reasonable care to avoid identified harms. This duty should extend beyond the age of 18 for those groups who are particularly vulnerable, as determined by the Government. (Paragraph 228)

23. While the Government should have the power to set the principles underpinning the new regulatory regime, and identify the harms to be minimised, flexibility should be built into the legislation so that it can straightforwardly adapt and evolve as trends change and new technologies emerge. (Paragraph 229)

24. A statutory code of practice for social media companies, to provide consistency on content reporting practices and moderation mechanisms, must be introduced through new primary legislation, based on the template in the Government Response to its Internet Safety Strategy. The template must, however, be extended to include reports of, and responses to, child sexual abuse and exploitation. (Paragraph 230)

25. A regulator should be appointed by the end of October 2019 to uphold the new regime. It must be incumbent upon the regulator to provide explanatory guidance on the meaning and nature of the harms to be minimised; to monitor compliance with the code of practice; to publish compliance data regularly; and to take enforcement action, when warranted. Enforcement actions must be backed up by a strong and effective sanctions regime, including consideration being given to the case for the personal liability of directors. The regulator must be given the necessary statutory information-gathering powers to enable it to monitor compliance effectively. (Paragraph 231)

26. Those subject to the regulatory regime should be required to publish detailed Transparency Reports every six months. As a minimum, the reports must contain information on the number of registered UK users, the number of human moderators reviewing reports flagged in the UK, the volume of reports received from UK users broken down by age, what harms the reports relate to, the processes by which reports are handled—including information on how they are prioritised, the split between human and machine moderation and any reliance on third parties, such as Trusted Flaggers—the speed at which reports are resolved, data on how they were resolved, and information on how the resolution or response was fed back to the user. (Paragraph 232)

27. The Government should consider implementing new legislation, similar to that introduced in Germany, such that when content that is potentially illegal under UK law is reported to a social media company, it should have to review the content, take a decision on whether to remove, block or flag that item (if appropriate) or take other action, and relay that decision to the individual or organisation reporting it within 24 hours. Where the illegality of the content is unclear, the social media company should raise the case with the regulator, which should have the authority to grant the social media company additional time to investigate further. The Government should consider whether the approach adopted in Germany of allowing an extra seven days, in the first instance, to review and investigate further should be introduced in the UK. (Paragraph 233)

28. Given the emergence of new technologies such as “deep fake” videos, which cannot easily be identified by human moderators, social media companies should put in place artificial intelligence techniques to identify content that may be fake, and introduce ways to “flag” such content to users or remove it, as appropriate. (Paragraph 234)

29. Social media companies must put robust systems in place—that go beyond a simple ‘tick box’ or entering a date of birth—to verify the age of the user. Guidance should be provided, and monitoring undertaken, by the regulator. The Online Pornography (Commercial Basis) Regulations must be immediately revised so that making pornography available on, or via, social media platforms falls within the scope of the regulations. (Paragraph 235)

30. Safety-by-design principles should be integrated into the accounts of those who are under 18 years of age. This includes ensuring strong security and privacy settings are switched on by default, while geo-location settings are turned off. Strategies to prolong user engagement should be prohibited and the Government should consider improvements to ways in which children are given recourse to data erasure where appropriate. (Paragraph 236)

31. We believe that Ofcom, working closely alongside the Information Commissioner’s Office (ICO), is well placed to perform the regulatory duties, and we recommend that the Government resource Ofcom and, where relevant, the ICO accordingly to perform the additional functions outlined above. (Paragraph 237)

Published: 31 January 2019