188.In the previous Chapter, we explored aspects of children’s use of the internet. In this Chapter we examine in further detail who is responsible for maximising opportunities for children to make the best use of the internet by making it both more child-friendly and safer.
189.It is in the interest of the whole of society that children grow up to be empowered, digitally confident citizens. It is therefore a shared responsibility for everyone: to improve opportunities for children to use the internet productively; to improve digital literacy; to change the norms of data collection and privacy when the user is a child; to design technology in ways that support children by default; for adults to have better understanding of internet technologies so that they can support children’s online experience; to invest in the digital resilience of children in order to minimise harm; to deliver all the rights online that children enjoy offline, and to remember that they are not only end users, but children.
190.These responsibilities will not be static but continually changing as technology changes. It is imperative that general principles are established to provide a framework for future action.
191.In mapping out where specific responsibilities lie it is worth remembering that the ubiquity of the internet means that children can access it in the privacy of their room, at a friend’s house, on public Wi-Fi, using mobile data, or at school. No one can supervise them at all times, nor would it be appropriate to do so.
192.Children of all age groups inhabit a world that seamlessly flows between on and offline. In order to thrive in both they need the protections and privileges that they enjoy offline. Digital technologies are the present and the future of these 21st century children. They will define their opportunities as workers and as citizens. These opportunities need to be upheld and shaped by many different stakeholders.
193.Parents and carers have a primary responsibility to ensure that their own children use the internet safely. Parent Zone, a charity, told us that “good enough parenting” was one of only two factors that could be positively correlated to building online resilience. Its evidence quoted the developmental psychologist Diana Baumrind as saying that the fundamental role of the parent is to raise a child “that is socialised to the society they are growing up in”207 and concluded that “parents are right to recognise the need to raise children who can flourish in a digital world”.
194.The Family Online Safety Institute (FOSI), an international non-profit organisation funded by industry, highlighted the importance of parents taking an active part in managing, or ‘mediating’, their children’s internet use noting that “parents who often use technology with their child are more confident that they can manage their child’s technology use.” Accordingly, it “strongly suggests that parents and children go online together from an early age to help develop ongoing conversations about appropriate use of technology.” 208
195.Ofcom highlighted four ways in which parents mediate their children’s internet use: regularly talking to their children about managing online risks; having rules about access and use (for example, on the amount of time spent online); using technical tools and controls; and supervising their children when online.
196.According to Ofcom’s research over 90 per cent of parents of 5–15s who go online mediate their internet access in one of these ways. 57 per cent of such parents use technical tools. FOSI agreed that the majority of parents have rules about their child’s technology use. According to research that FOSI carried out in 2015 in the USA, “75% of parents have specific rules about what their children can or cannot post publicly online.” 210
197.However, these figures conceal a variety of different levels of engagement. For example, Ofcom’s list of possible solutions includes “software to protect against junk email/spam or computer viruses”. While this is important, it does not address the risks posed by exposure to inappropriate content. Of the parents who do not use technical tools, around half say that they prefer to talk to their children and use other methods of mediation; others trust their children to be responsible. There is natural variation in how parents approach mediation: approaches “can range from quite intrusive to just sitting down and talking about it”.211
198.Wendy Grossman, a journalist, noted that some view parents who choose not to use filtering systems as “somehow negligent”. She repudiated this notion: “different people have different values and beliefs about educating their children, and there should be no stigma attached to electing a different path than the government of the day would like”212. We endorse her advice that parents should not be stigmatised on the basis of parenting style, but note that trusting children to be responsible cannot prevent them from seeing unsuitable content, given that much of that content is delivered to children without their actively looking for it.
199.Our witnesses identified several problems with relying on parental controls. Some parents simply lack the knowledge to mediate their children’s internet use effectively: “Almost a fifth (19%) of parents are worried their lack of tech skills could be putting their children at risk—44% say their children’s expertise outstrips their own.”213
200.David Miles, an online safety expert, suggested that these gaps are likely to be compounded as technology continues to become more sophisticated: “The growing encryption of browsers, websites, messaging services and many popular apps, is likely to make it increasingly difficult for parents to control or manage their children’s activities online. Accessibility through gaming devices, TVs and the inexorable move towards the Internet of Things, will only serve to compound the problem.”214
201.The Children’s Charities Coalition on Internet Safety agreed:
“What we are talking about, in essence, are the skills needed for 21st Century parenting. That repertoire of skills must now include a knowledge of how the internet fits into young people’s lives and how best to support children and young people in the use of the technology.”215
202.Research from Ofcom shows that many parents try to educate themselves using a range of different sources.216 The Children’s Media Foundation told us: “There is no doubt that parents need to be helped to play a bigger part in their children’s media literacy and media use. However, in our view the efforts to help adults understand their children’s digital lives are disjointed and piecemeal and therefore ineffective.”217
203.According to Internet Matters:
“In the past Government would have invested significant sums in public service broadcasting, to help drive home the message that parents need to get involved. However, pressure on budgets means that this is no longer an option … [despite] significant progress and investment in the range and availability of technical tools across networks, devices and platforms, there has not been a comparable investment in driving awareness, education and engagement of parents.”218
204.Professor Phippen was more critical of parents: “When working with charities who deliver parental information sessions around online safety, I have seen poorly attended (in some cases non-attended) sessions—it seems that for many parents their view is that schools should be attending to child development in this area.”219
205.Indeed some of our witnesses suggested that parents were deliberately acting contrary to the best interests of their children, in particular by posting content which affects their children’s privacy or data protection. FOSI carried out research in the US, which showed that “19% of parents who have social networking accounts, acknowledge having posted something online that their child may find embarrassing in the future. 13% of parents say that their child has already been embarrassed by something they have posted, and 10% say their child has asked them to remove an online post that relates to them.”220
206.Horizon corroborated this, citing a 2010 study by internet security firm AVG, which showed that “92% of children in the United States have an online presence (due to their parents’ disclosures) by the time they are two years old.”221
207.Relatedly, Simon Milner of Facebook told us that parents actively assist their children who are under 13 in setting up Facebook accounts, in breach of its rules, which are designed to protect children’s privacy online.222
208.On the other hand, some witnesses said that too much emphasis was being placed on parents. Parent Zone advocated the need for greater and more coherent support:
“Parents … are being overwhelmed with information about specific risks–often through the lens of the tabloid press–with very limited access to parenting support. Helping parents to develop parenting skills that are adequate to the task of raising digital citizens is vital. We have a crisis that should be dealt with as a public health issue and the response should involve multiple stakeholders.”223
209.Others argued that industry should have a greater share of the responsibility. The Children’s Media Foundation wrote:
“The main focus of industry efforts on safeguarding children has been levelled at better parental information. This is partially because of a lack of consensus about how to address the issues, but also because of lobbying from the main industry players that they are merely providing the ‘pipes’ for content providers and therefore not responsible for any digressions … In our opinion, this approach is not sufficient. And we would like to see the new distributors, gatekeepers and search providers make a 21st Century contract with parents and children that they will in future put the needs of children first and foremost, ahead of advertisers, data-miners and brands who all have a vested interest in [practices which] manipulate or influence younger audiences for commercial gain.”224
210.A policy response that relies on parenting also fails to account for parents who are by choice or circumstance neglectful. Dr Dickon Bevington told us that evidence indicates that children with pre-existing vulnerabilities are most likely to suffer actual harm from “exposure to extreme internet-mediated experience”.225 By relying only on parents to mediate their children’s online activity, there is a risk that the most vulnerable children in society, and those who are most likely to experience actual harm from the internet, are not being protected.
211.Parent Zone told us that there had been a government department leading on parenting work and “significant investment was made in the creation of the National Family and Parenting Institute. That infrastructure has now gone. Parenting has lost its voice in government at a time when it needs it most.”226
212.Baroness Howe of Idlicote argued that there should be a duty on the Government “to educate parents about the use of family friendly filtering, online safety tools and how to protect their children from risky behaviour online (e.g. bullying and sexual grooming). This could lead to leaflets being available in places parents regularly go, such as schools, libraries, doctors’ surgeries etc.”227
213.Internet safety experts Will Gardner and John Carr discussed the possibility of the Government issuing a public service broadcast or media campaign to give parents a message on online safety.228
214.Mr Gardner suggested that there might be scope for such a campaign on individual topics but he was sceptical that an online campaign could deliver all that was necessary:
“There have been big public awareness campaigns before, and the UK Council for Child Internet Safety has those. “Zip it, Block it, Flag it” was the message that was put out on bus stops, and there have been other attempts to do that. My sense is that it has to be more sustained than that, and the budget is not there to provide that in a sustainable way.”229
215.Mr Carr told us that the idea was good in principle but the Government has simply not put in enough money:
“I think a sustained public campaign, public health-type of approach, would benefit us greatly. The problem up to now is that the Government have not been willing to spend any money on this type of public education work. They have relied entirely on the industry to do it. The industry has stepped up to a degree; there is no question about that. They have done very well; they have got something called Internet Matters … But in relation to the total size of the problem and the challenge, it is nowhere near being enough, and it certainly does not match anything like you get in the public health field. So I would certainly welcome a shift in emphasis in that sort of way.”230
216.Parents and carers need clearly communicated information about the digital world. We recommend that the Government and industry should invest in regular public campaigns to promote information and tools that help parents and carers. In particular, a campaign with a short memorable message, similar to the Green Cross Code, should be developed. It should focus on creating confidence in online parenting.
217.We recommend that specific training modules be developed and made compulsory as part of qualifying in frontline public service roles, including but not limited to, police, social workers, general practitioners, accident and emergency practitioners, mental healthcare workers and teachers.
218.There are a number of public, private and voluntary bodies which contribute towards children’s outcomes online.
219.A number of bodies regulate specific aspects of the internet. Ofcom regulates video-on-demand programme services which include on-demand internet services but it does not have a general remit to regulate internet content.231 Indeed Tony Close, the Director of Content Standards, Licensing and Enforcement at Ofcom, made clear to us that Ofcom does not think that it would be well placed to do so.232 The Communications Act 2003 does, however, require Ofcom to promote media literacy, to monitor internet content and to advise the public on online safety.
220.The British Board of Film Classification (BBFC) is the UK’s regulator of film and video. It operates a classification regime and publishes Classification Guidelines, with a primary aim to classify content according to the age for which it is appropriate. It has put itself forward for the new role of regulating online pornography for the purposes of the Digital Economy Bill.
221.The Advertising Standards Authority (ASA) is the regulator of advertising across all media. It applies the Advertising Codes, which are written by the Committees of Advertising Practice, and its work includes acting on complaints and proactively checking the media to take action against misleading, harmful or offensive advertisements.
222.The Information Commissioner’s Office (ICO) is responsible for handling complaints in respect of data protection law and for encouraging good practice. It will be instrumental in the implementation of the GDPR, and it has advocated that the UK should maintain equivalent provisions following its departure from the EU. However, the Children’s Media Foundation criticised it on the grounds that “Potentially unsafe practices are unlikely to be addressed [by ICO] unless there is a problem”.233
223.The police and other law enforcement agencies are responsible for enforcing the criminal law online. In respect of child sexual abuse, law enforcement agencies are assisted by the Internet Watch Foundation (IWF), an independent body set up to identify and block images of the sexual abuse of children on the internet. While it has no formal powers of its own, once it finds child sexual abuse content, it notifies the National Crime Agency (NCA), which gives it permission to issue a take-down notice.
224.The NCA has a command for cybercrime and another for child sexual exploitation.234 These agencies also help provide resources:
“‘Thinkuknow’ is an education programme developed by the NCA with three strands: children, parents and professionals. It provides high quality education about sex, relationships and the internet aimed at reducing the vulnerability of children and young people to sexual abuse and exploitation. These messages are delivered through a network of over 140,000 professionals across the UK.”235
225.The position of Children’s Commissioner was established by statute to promote and protect the rights of all children in England. The Commissioner works with the Government and public bodies to improve policy and practice relating to the care system, and to ensure that children’s voices are heard. This involves consistent and systematic consultation with young people in all aspects of the Children’s Commissioner’s work. The current Commissioner, Anne Longfield, has set up a Digital Taskforce to explore children’s experiences online. We noted in the previous Chapter that its report called for clear terms and conditions of use. It also recommended that the Commissioner’s existing power to request information from public bodies be extended to cover aggregate data from social media services.
226.The UK Council for Child Internet Safety (UKCCIS) is a group of more than 200 organisations drawn from across government, industry, law, academia and charity sectors. The Council was established following a Government-sponsored report by Professor Tanya Byron in 2008. It discusses and takes action on topical issues concerning children’s use of the internet. It has five working groups set up to consider social media, education, evidence, technology and digital resilience. It is chaired by Ministers from the Department for Culture, Media and Sport, the Home Office, and the Department for Education.236
227.Barnardo’s recommended that the remit of UKCCIS should be expanded to include “child internet welfare as well as child internet safety”.237
228.According to a number of our witnesses, the lack of a joined-up and coherent regulatory framework has given rise to a gap between regulation in the online and offline worlds, in particular with regard to inappropriate content. The BBFC told us:
“The regulatory framework that has developed in the offline world to protect children from content - for example dangerous and imitable behaviour, self-harm, suicide, drug misuse and violence - that is likely to impair their development and wellbeing has not transferred to the online space. Pornography is of particular concern.”238
229.Independent research commissioned by the BBFC in 2015 found that “85% of parents consider it important to have consistent classifications off and online … As more viewing takes place online, consumers expect that the same level of regulation will apply online as currently applies offline.”239
230.Baroness Howe of Idlicote and CARE agreed that there should be a consistency of approach in regulating different types of media, whether it is accessed online or offline. They each gave the example that material which has been rated ‘18’ by the BBFC should be deemed inappropriate online and should only be accessible through age verification.240 The NSPCC recommend that all websites and other online services should clearly show a ‘site-rating’ stating the age range for which its content is suitable.241 Girlguiding also advocated a consistency of approach and recommended “bringing online media in line with the principles of the broadcast watershed.”242
231.User-generated content poses a serious practical impediment to this suggestion, however. Most users are not trying to market their content and so have no incentive to have it certified. Moreover, there is far too much user-generated content for a body such as the BBFC to review. The Audiovisual Media Services Directive (AVMSD) is an EU legislative instrument which regulates (among other things) the provision of “TV-like content” online. The European Commission has proposed amendments to the AVMSD seeking to harmonise standards and to extend the scope of the AVMSD to user-generated content. In written evidence, the BBFC told the Committee that it supported the extension of the scope of the AVMSD. When we later asked David Austin, the Chief Executive of the BBFC, to explain how this could work, he told us that a Dutch regulator was already piloting a programme to allow users to self-certify their own content and to certify other content which they see online.
232.Over the course of the inquiry the Committee heard evidence from many of the organisations above, and from four Government Ministers. We were struck by the number and fragmented nature of the organisations tasked with managing internet harms. We also noted that the evidence we received showed increasing levels of harm reported by young people. We therefore concluded that the current matrix of Government and regulatory responsibility was not working.
233.Businesses that operate online have a particular responsibility for ensuring children inhabit the online world in a way that is age-appropriate and empowering. Furthermore, industry is also best placed to create services and innovations which are child friendly.
234.Many of the largest online platforms have made clear that they are not interested in taking ‘editorial control’ over content posted on their site. With regard to the publication of false information on newsfeeds, Simon Milner of Facebook told us that his company did not wish to become “arbiters of truth”.243
235.Horizon, a research institute, explained that these sites rely on “protections afforded to communications service providers and prefer not to moderate content in advance, but rely on take-down requests for illegal or inappropriate content.” Horizon conceded that some do provide the means to label content as “adult”, but that there is no further granularity of advice according to different age groups.244
236.Alice Webb of BBC Children’s recognised that the BBC’s policy of moderating all content on its platforms was labour-intensive.245
237.Nonetheless, the Government told us that it expected social media and interactive services to have processes in place to address inappropriate or abusive content on their sites: “This includes having clear reporting channels, acting promptly to assess reports, and removing content that does not comply with their acceptable use policies or terms and conditions.”246 Most social media sites and online platforms publish ‘community standards’, but it has recently been reported that Facebook declined to remove 82 out of 100 images which appeared to BBC journalists to break its guidelines when they reported them.247 It is unclear to what extent they, or other social media and content host services, take active steps to search for and take down content which contravenes their own published rules.
238.The government of Australia has established the position of Children’s eSafety Commissioner to administer complaints from children about cyberbullying.248 The Children’s Commissioner has recommended that the UK go a step further in establishing a Children’s Digital Ombudsman to “mediate between under-18s and social media companies over the removal of content. It should operate in a similar way to the UK Financial Ombudsman Service and be funded by social media companies themselves but be completely independent of them.”249 This would enable children to challenge “any content that they have accessed via common social media platforms that they are able to report”,250 for example pornography or hate speech.
239.As we saw in the previous Chapter, the terms and conditions of social media companies are themselves often at odds with children’s right to privacy. The group of children we spoke to said that they were aware that, if they did not like content of themselves posted on social media, they could report it, but they acknowledged that normally it is for the uploader of the content to take it down. When asked, they all said that they would like the right to have content taken down.251
240.The Committee supports children’s right to have upsetting content that concerns themselves removed. All businesses operating online, particularly companies which provide social media and content-sharing platform services, such as Google and Facebook, should respond quickly to requests by children to take down content. Where inappropriate content that concerns a child is reported by third parties, similar processes should be followed.
241.Minimum standards should be adopted that specify maximum timeframes between the report of a complaint and the response to it. Companies should publish both their targets and their data concerning complaint resolution.
242.All platforms and businesses operating online should proactively remove content which does not comply with their own published standards.
243.We recommend that, as suggested by the Children’s Commissioner, her power to request information from public bodies should be expanded to include aggregated data from social media companies and online platforms.
244.We further recommend that there should be a mechanism for independently handling requests from children for social media companies to take down content. This might take the form of an Ombudsman, as suggested by the Children’s Commissioner, or a commitment from industry to build and fund an arbitration service for young people.
245.We call on the Government to give an undertaking that, irrespective of its membership of the EU, the UK should maintain legislation which incorporates the standards set by the General Data Protection Regulation in respect of children, including the right to be forgotten, as a minimum.
246.Businesses which provide internet access services, such as Internet Service Providers (ISPs), can play a key role in providing filtering systems which block websites containing inappropriate content for children. Such systems can apply, for example, to the whole network of a household.
247.The BBFC has made an arrangement with the UK’s four largest mobile network operators (EE, O2, Three and Vodafone) to act as an independent regulator of content. The BBFC explained: “Using the standards in the BBFC’s Classification Guidelines, content that would be age-rated 18 or R18 by the BBFC is placed behind access controls … In 2015, the BBFC and EE also adopted a Classification Framework for EE’s “Strict” parental setting, aimed at younger children, with filtering standards set at the BBFC’s PG level.”252
248.In 2013 the Government made arrangements with the four largest ISPs (BT, Virgin Media, TalkTalk and Sky) to present customers with an “unavoidable choice” to make as to whether they wanted family-friendly filters. This was done on a self-regulatory basis.253 The four ISPs cover 90 per cent of the broadband market.254
249.However, John Carr argued that this does not go far enough, as the remaining 10 per cent of the market is not covered by the undertaking to present filter options.255 Furthermore, while all four of the largest ISPs require an ‘active choice’ to be made, only Sky has the setting on by default.
250.Virgin argued against ‘default on’ settings, on the grounds that the user experience is improved if users take an active role in deciding the settings. However, in a discussion on the merits and disadvantages of opt-in systems, Dr David Halpern, a behavioural expert, told the House of Lords Committee on Trade Union Political Funds and Political Party Funding that there is evidence that people have “a very strong tendency to stick with whatever the default had been set at”.256
251.Evidence shows that the usage of Sky’s filter systems is far higher than that of the other ISPs, as would be expected with a default-on system. Customers are free to switch the filters off, but a significant number do not actively choose to do so.257
252.Children use multiple devices to access digital services and can connect from their home network, school, friends’ houses, or by using public Wi-Fi and mobile networks. According to the BBC, “These can all have different levels of filtering and present challenges to parents who want to try to control their child’s use of the internet.” Encryption of websites and the use of apps also limit the effectiveness of filters.258
253.Parent Zone was concerned by the risk that filtering might result in children “moving to encrypted services and less savoury parts of the web in attempts to bypass adult restrictions. We also need to ensure that in tackling the familiar risks and services, we are not ignoring new and emerging ones.”259
254.A large number of witnesses identified the need to teach digital literacy and parental mediation, rather than relying on filters alone. Virgin told us, “Supportive and enabling parenting does more to foster resilience than parents who restrict or monitor internet use. In fact, the research indicated that restricting internet access can have a deleterious effect on building resilience.”260
255.Professor Phippen cautioned: “In our rush to ensure children are “safe” online, we risk a dystopia where the young have limited access to relevant and valuable information (for example, sexual health, relationships advice, information about gender and sexuality), increasing erosion of their privacy, and a failure to meet their rights to an education that is fit for purpose and one they are calling for.”261
256.Wendy Grossman agreed, noting that “The Open Rights Group’s Blocked project262 has found that at least 19% of the top 100,000 sites as determined by Alexa are blocked on at least one network in the UK.”263
257.According to David Miles:
“Youth charities and advocates for freedom of expression were rightly worried about the impact of these new filters on online support services and resources. Vitally important to young people, there was a real danger that teenagers in particular, in seeking online confidential advice through web sites, would be blocked. The risks of over-blocking or under-blocking legitimate web sites could have significant consequences. The online gaming community, LGBT and sexual health charities felt particularly vulnerable.”264
258.We recommend that all ISPs and mobile network operators should be required not only to offer child-friendly content control filters, but also to set those filters to ‘on’ by default for all customers. Adult customers should be able to switch off such filters.
259.Those responsible for providing filtering and blocking services need to be transparent about which sites they block and why, and be open to complaints from websites to review their decisions within an agreed timeframe. Filter systems should be designed to an agreed minimum standard.
260.There is widespread flouting of rules concerning age, for example on social media sites and gaming.265 Barnardo’s suggested, “There is potentially scope for credit cards or similar to offer some form of age-verification.” However, it argued that this would be excessive for social media sites, and “given young people’s desire to socialise online may even drive them towards shadier sites with less moderation than platforms such as Facebook.”266
261.On the other hand, Barnardo’s accepted that more rigorous age-verification processes would be appropriate in the case of online pornography. The Government has sought to implement such a policy through the Digital Economy Bill, a wide-ranging bill which was going through Parliament at the time of our inquiry. During its passage many argued that it was a narrow provision, since it dealt only with pornography but left extreme violence, user-generated adult material, self-harm and anorexia sites all untouched.267 It was also suggested that social media sites that host user-generated pornographic content should be included in the scope of the Bill.268
262.All of our witnesses supported the principle of the age verification provisions (at least as initially drafted) in the Bill. However, some have criticised the proposed mechanism for penalising websites which do not comply with specified age verification requirements. Virgin Media cautioned:
“The decision to include ISPs within the scope of the Bill and to compel ISPs to site block on notification from the BBFC is without precedent, and carries risks … It is therefore imperative that in bringing forward this legislation the Government is alert to the need for robust checks and balances.”269
Virgin Media also called for greater oversight of the age verification regulator.
263.The Bill has also been criticised by the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, on the grounds that it gives too much power to the regulator to determine key terms, such as “ancillary service providers” (internet sites which enable or facilitate access to pornographic material) and the “guidelines” according to which the regulator will exercise its powers, and because insufficient detail is provided on the face of the Bill to allow Parliament to conduct effective scrutiny.
264.The Bill continues its progress through Parliament as we publish.
265.We support the age verification provision of the Digital Economy Bill. We hope that the Government will provide greater clarity about the powers of the regulator, and will include social media companies within the definition of ‘ancillary service providers’.
266.The technical solutions that we have considered so far concern preventing children from viewing unsuitable content. According to Dr Victoria Nash, of the Oxford Internet Institute, in recent years such measures have been the focus of policy debate with large tech companies expected to act as “sheriffs” in limiting content for children. She deprecated this focus which “risks obscuring the wider array of commercial actors whose products and services may pose risks to minors … It would be desirable therefore to widen the focus on the full range of commercial actors providing digital goods and services for children.” She also called for “data and privacy risks, as well as the more familiar content-based risks” to be addressed.270
267.Adam Kinsley of Sky agreed: “There is only so much that an internet access provider can do but, if you are talking about the end content applications, I think it is down to those companies—and it is often the big brands which are doing this—to do the right thing and build in the safety by design. If they stuck to the 5Rights principles,271 they would get there.”272
268.Baroness Shields told the Committee that safety by design, “was a concept that started to emerge in services where kids spent a considerable amount of time and there was concern that they would be exposed. Initially, that concern was primarily about grooming for sexual exploitation, but it became about exposure to all kinds of harms and criminals.” 273
269.However, many of our witnesses highlighted that internet services are not designed with children in mind. For example, Mary McHale, a teacher, argued that insufficient account of children’s need for privacy and data protection was taken in the design of digital products. For example, when an operating system is updated, default settings are restored automatically:
“We tell the students that their geolocations go on every time they have an update on the Apple phone devices, which a lot of students tend to have these days, and that it turns the geotagging or the geolocations back on. Therefore, every time a student takes a picture and posts it, you can actually find out the location of that. We have to keep saying to the students, “You must turn it off all the time”. We say that to our parents too.”274
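The mechanism Ms McHale describes is that each photograph taken with geotagging enabled carries GPS coordinates in its embedded EXIF metadata. By way of illustration only, and not drawn from the evidence, the following minimal Python sketch, assuming the third-party Pillow imaging library and a hypothetical file name, shows how readily that location data can be read back out of a shared image:

    from PIL import Image                 # third-party Pillow imaging library
    from PIL.ExifTags import GPSTAGS

    def extract_gps(path):
        """Return any GPS metadata embedded in a photo, or None."""
        exif = Image.open(path).getexif()
        gps_ifd = exif.get_ifd(0x8825)    # 0x8825 is the standard EXIF GPS block
        if not gps_ifd:
            return None                   # geotagging was off for this photo
        # Translate numeric EXIF keys into readable names such as GPSLatitude
        return {GPSTAGS.get(tag, tag): value for tag, value in gps_ifd.items()}

    print(extract_gps("photo.jpg"))       # "photo.jpg" is a hypothetical file

The point for designers is that a single default, geotagging on or off, determines whether every photograph a child shares carries this information; the recommendation later in this Chapter that geolocation should be off until actively switched on follows directly from this.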
270.Alice Webb told the Committee “One of the things that is hugely important in the digital space is about there being transparency about who is funding what, how things are paid for: are you advertising; do you have product placement?”275
271.The Children’s Media Foundation recommended that there should be “Rules against behavioural mechanics that try to draw children into addictive behaviours or exhortation.”276
272.Horizon told us, “Youth Juries participants also pointed out that removing personal online content should be easier and suggested a self-tracking tool to gain control over their own content, as well as screenshot blocking tools.”277
273.ICO recommended that:
“Social networking sites should explain their data collection practices in language that all users of their services are likely to understand and to invest in a high standard of security for all users. This should also include privacy settings by default (e.g. publication of data).”278
274.Better design can also promote children’s wellbeing in internet use, for example by discouraging habitual behaviours and enabling children to switch off. Dr Bush suggested it would “not be a big step to create your own rewards, or to have your own time limit built into a game. Could an app go on that said, “The half-hour is up. Why not reward yourself by walking round the garden or ringing a friend you have been putting off?” Young people could put their own rewards into those things. Young people have told us that they would really welcome that.”279
275.Dr Rudkin called for the “people who are creating these apps, websites and forums [to] be aware of child development informational research so that they know exactly what kinds of things are going on for kids who are going to be accessing this information, whether they are adolescents, three year-olds or seven year-olds, and to have some very clear classification for parents who are introducing their children to these different sites.”280
276.Dr Fossi stressed:
“Putting children at the forefront and putting their needs at the front, centre and heart of the designs in the online world would help us to develop a better internet for children. I am looking at tech developers and engineers, as well as at corporations, industry, schools, and parents and grandparents.”281
277.The NSPCC recommended that “Platforms that attract both adults and children should distinguish between their audiences by verifying the user’s age and providing specific features to under-18s”.
278.The Government told the Committee that “The UKCCIS guide ‘Child Safety Online. A Practical Guide for Providers of Social Media and Interactive Services’ includes examples of current good practice for services targeted at and attracting users who are under 18 years old. It describes for industry how different social media, interactive services and child safety charities are currently dealing with key challenges.”283 We note, however, that UKCCIS has no system for monitoring uptake or evaluating improvement.
279.BT said that the guidelines had “examples of good practice from leading technology companies, and advice from NGOs and other online child safety experts. Its purpose is to encourage businesses to think about “safety by design” to help make their platforms safer for children and young people under 18.”284 Ofcom confirmed this: “To accompany the (UKCCIS) guide the working group is supporting a 12 month outreach plan targeted at smaller and start-up social media companies to promote a culture in the online content industry of “safety by design”.”285
280.Facebook gave the Committee examples of child-friendly features that it already employs for users who have identified themselves as being under 18.
281.However, we did not receive detailed evidence of the extent to which the social media companies are using the guide in relation to their services, and many witnesses wanted to see a much broader set of standards, including respect for privacy and design for wellbeing.
282.By contrast, there appear to have been some advances in online content aimed at young children, from major corporations such as Sky (Sky Kids) and Google (YouTube Kids) to smaller start-ups such as Azoomee. These products demonstrate the demand for ring-fenced services for children that allow them an age-appropriate internet experience.
283.The toy manufacturer Lego launched a social network in February 2017 aimed at the under-13s, which it believes will be a “safe” social network. It lets children post photos of their Lego creations but does not allow text comments apart from pre-written responses. No personal information is requested and no tracking is enabled.287
284.Alice Webb of the BBC referenced Playtime, Storytime and Playtime Island, apps aimed specifically at the BBC’s youngest audience. She said the reasons for developing them were that “they are all touch screen and allow children to interact and play, so it gives us that opportunity. It also allows us to create standalone playgrounds, online playgrounds for children that they can go and play in and enjoy those things in.”288
285.She told the Committee that the BBC had developed iPlayer Kids to be an:
“environment that was child-centric, so our absolute focus was making sure that our design was child-centric … to give them an environment that was just for them, which they could feel at home in, to help them get content that way, and it has further safeguards against them wandering off into content that is not necessarily age appropriate.”289
286.The BBC told us that “digital products or content are often not user tested with a young audience, due to the complexity of reaching the broad range of ages with different abilities.”290
287.However, Adam Kinsley told us that children were at the heart of the design of the Sky Kids app:
“In the way we went through the design process, it was almost built by children, going through constant design refreshes with panels of children, which is just great to watch, seeing them trying to break the thing, and giving them something which they can really use and love.”
He concluded that “this has to be built into applications by responsible businesses by design” and that in order to achieve this Sky worked with 5Rights and was “driven by its principles to deliver a product specifically tailored for children.”291
288.While some companies and content providers have championed child-specific services, ICO was “sceptical about seeing an approach that seeks to differentiate between children’s and adults’ sites as being in itself a solution to the problem of children’s online protection.”
289.Alice Webb of the BBC argued that child-friendly design “is something that people will be demanding of commercial services, as these subjects are discussed more widely and there is greater awareness in the public about them.”292
290.The NSPCC recommended that “Minimum standards and best practice guidance must be established in these areas so … they can check that they are providing the requisite safety features to enable young people to participate in a safe environment.”293
291.The establishment of such standards is all the more pressing in the light of the projected growth of the Internet of Things (IoT), Artificial Intelligence (AI) and Machine Learning in the coming years. Yet the IoT has already been deemed to pose risks to cybersecurity and privacy. As Horizon told the Committee:
“Far too often security and privacy concerns are given too low a priority in the design process, resulting in easily hackable IoT devices. Particularly concerning are the examples, including connected baby monitors, voice controlled TVs and toy dolls (e.g. Hello Barbie), that continuously stream very personal video and audio information to data centres, often outside of the jurisdiction of the UK (and EU) data controllers.”294
292.Barnardo’s highlighted the issue that “most of the current literature by futurologists anticipating the effects of technological advance is not written with a specific narrative about the potential impact on children in mind.” It cited a study called Youth and the Internet which suggested “experts in the tech and children’s sectors should be brought together more regularly to conduct ‘Child Impact Assessments’ on tech developments perhaps similar to the way current policies across Government are already subject to Equality Impact Assessments.”295
293.BT recommended that “standards or best practice guidelines should be set as technology develops, especially with respect to children. Lack of reference to clear guidance and the fact that human rights impact assessments are not yet commonplace for new products and propositions are risks for businesses to mitigate. The civil society organisations Unicef and 5Rights are attempting to address this in the identification of online rights of children and young people.”296
294.Dr Fossi agreed: “Building on self-regulatory principles, and ensuring that children and child protection are at the heart of the designs in the online space. Having a code of practice and minimum standards would go an enormous way towards ensuring safety and providing age-appropriate filtered experiences for children and young people in the online world.”297 She also recommended that such a code of practice and minimum standards should be incorporated into university courses.
295.The NSPCC recommended that an “independent regulator, as proposed within the Digital Economy Bill, should be endowed with the power to set minimum standards of child safeguarding across all social networks, platforms and ISPs to ensure that child safeguarding is incorporated into the design, content and functionality of all online services.”298
296.The BBC told the Committee, “Clear guidelines and rules regarding design and content that apply beyond content explicitly aimed at children could help open up wider ranges of content to younger audiences.”299
297.The 5Rights framework agrees and states:
“There is no technological impediment to delivering children’s rights online - it is a choice. To support the presence of young people online, we must design and implement as standard, into every interaction of the digital world ALL the rights they enjoy offline.”300
298.We welcome the development of internet services which are specifically designed for very young children but regret that there are no such services for children as they grow older. We have found that there is resistance to providing services which incorporate the support and respect for rights that would enable a better internet experience for all children as they explore the wider internet.
299.We recommend that the Government should establish minimum standards of design in the best interests of the child for internet products. For the avoidance of doubt this is for all products that might reasonably be expected to attract a large proportion of children, not only those designed with children in mind.
300.The minimum standards should require that the strictest privacy settings should be ‘on’ by default, geolocation should be switched off until activated, and privacy and geolocation settings must not change during either manual or automatic system upgrades.
301.Minimum standards should incorporate the child’s best interests as a primary consideration, and in doing so require companies to forgo some of their current design norms to meet the needs of children.
302.All platforms and businesses operating online must explain their data collection practices, and other terms and conditions, in a form and language that children are likely to understand. Their explanations should not try to obfuscate the nature of the agreement.
303.All platforms and businesses operating online must not seek to commercially benefit or exploit value from the sharing or transfer of data gained from a child’s activities online, including data transferred between services that are owned by the same parent company. They should uphold a principle of minimum data gathering necessary for the delivery of a service when the end user is a child.
304.All platforms and businesses operating online which large numbers of children use should incorporate a ‘time out’ function into their design even if it is not in their best commercial interests. It is the view of the Committee that the wellbeing of the child is of paramount importance, and in our view there is sufficient evidence that time-outs or breaks contribute positively to the mental health and wellbeing of children.
305.Many of our witnesses pointed out that, whilst companies have a duty to support children by technological means, children themselves need to grow up digitally literate. As we have seen, the largest companies already contribute to this through sponsoring Internet Matters, which acts as a hub of information and resources. BT and Samsung also gave us evidence of their initiatives to educate children about technology and digital literacy.
306.Members of the Committee visited a school in Brixton where an online safety assembly was presented by Google in partnership with Parent Zone (see Appendix 7). We found the assembly to be a powerful learning tool, but it was notable that it would be delivered at only a small number of schools, and it was not clear that a message of a similar standard was being delivered at others.
307.We heard from a number of charities that the gap in provision for children, particularly on the digital literacy and resilience agenda, had been tackled by the third sector and industry. We found that this too was piecemeal and fragmented.
308.The NSPCC has developed a website, Net Aware, “where parents can find information about the top 50 sites that young people have told us they use … based on evidence collected from 1700 children and 500 parents about their experiences on the most popular platforms”.301
309.We recognise that education is a devolved issue and therefore our analysis and recommendations in this section apply to England but we feel that this is more widely applicable. The Government told us that it had “introduced the new computer science curriculum, which includes topics such as online safety and security, providing the computational thinking skills which will enable young people to adapt to emerging technologies.”302
310.On the other hand, a large number of witnesses said that computer science lessons did not go far enough in teaching digital literacy.
311.The Committee heard from teachers that computer science, including e-safety, was taught to students from year 7. Karl Hopwood recognised that when e-safety began in schools, it “sat in the lap of the IT department—I understand why—but for a lot of people it made them think it is a technical issue rather than a behavioural challenge.” However, both he and Mary McHale agreed that setting e-safety within the PSHE framework could be beneficial. Mr Hopwood said this was “because … safety pervades everything that young people are doing.”303
312.In addition to computer science standing at the heart of the curriculum, there remains an urgent need to consider the social and sexual aspects of digital use. Witnesses were almost universal in recommending the implementation of a much broader, better resourced PSHE programme taught by trained teachers. PSHE education is defined by the schools inspectorate Ofsted as a planned programme to help children and young people develop fully as individuals and as members of families and social and economic communities. Its goal is to equip young people with the knowledge, understanding, attitudes and practical skills to live healthily, safely, productively and responsibly.
313.The PSHE Association told the Committee:
“Worryingly, PSHE education lessons, through which these issues are taught, are increasingly being squeezed off school timetables. This means that many pupils miss out on education which could help to keep them safe online. The most recent Ofsted review of the subject has stated it is ‘not good enough’, pointing to the serious safeguarding implications of failure to teach many of these issues, while the Commons Education Committee says the situation is “deteriorating”.” The National Council of Women said: “the non-statutory status of much of PSHE education means that some schools are not prioritising the subject and not allocating sufficient curriculum time to it. Some schools are not delivering it at all.”304
314.The Children’s Commissioner told us that participants in its survey into online pornography called for “more education about pornography delivered in a relevant and engaging way. Young people wanted to be able to find out about sex, relationships and pornography in ways that were safe and credible.”305
315.The Government has announced that relationships and sex education (RSE) would be compulsory in all schools, and has left open the possibility that PSHE could be too.306
316.The benefits of PSHE go beyond sex and relationship education. Dr Sarah Marsden advocated “developing young people’s critical thinking and political consumption skills and their digital literacy so that they are better able to assess and interpret the content they find online”.307 With regard to online radicalisation, Dr Akil Awan told us that better education of digital citizenship and media literacy could also help counteract the negative influence of extremist content.308
317.We agree with the Digital Skills Committee that no child should leave school without an adequate standard of digital literacy. It is the view of this Committee that digital literacy should be the fourth pillar of a child’s education alongside reading, writing and mathematics, and be resourced and taught accordingly.
318.We recommend that the Government should make PSHE a statutory subject, inspected by Ofsted. The Committee further recommends that PSHE be mandatory in all schools whatever their status. The PSHE curriculum must be designed to look broadly at the issues young people are concerned about online, including compulsive use, data gathering, body image—rather than the current e-safety agenda of risk. Children need support in developing their critical thinking and understanding the veracity of online information. This should form part of the curriculum. We also note Ofcom’s duty under the Communications Act 2003 to promote media literacy.
319.It is the Government’s responsibility to reassess the resources needed to deliver computer science and PSHE in all UK schools and to ensure that teachers are adequately trained and resourced. But we note with interest that graduates currently entering teacher training are the first group of teachers who might be considered ‘digital natives’. We recommend that the Government harness and further upgrade the skills of this new generation in the course of teacher training so that UK schools are at the forefront of the digital revolution.
320.We commend the work of the voluntary sector and industry in delivering information and resources about online safety and digital literacy for parents and children, but note that it is currently fragmented and insufficient to meet the needs of all children. Once a truly rounded computer science education and fully resourced Personal, Social, Health and Economic education is established in schools, we believe that there will be a clearer role for the voluntary sector and industry.
321.Schools have a legal duty of care to protect children and staff from harm, and this duty extends to harm from electronic communications.309 But we heard that many are struggling to do so.
322.SWGfL, a non-profit organisation, published a report which showed that: “40% of primary schools only had a basic filtering system in place and 6% had none at all. It also highlighted that 55% of school governors and 50% of staff had received no online safety training. Policies around technology were also poor with 35% of primary schools having no policies around mobile phones.”310
323.The NSPCC recommended:
“Teachers need to have concrete risk assessments so as to be able to spot signs of online abuse, escalate and report cases appropriately and know how to signpost and support each child taking into consideration the additional impacts that online abuse has on the child.”311
324.The Department for Education has published statutory guidance, which recommends filtering and “appropriate monitoring”.312 However, it does not define what this means.
325.In oral evidence, Mark Donkersley, the Managing Director of e-Safe, explained that his company provides an early warning system for safeguarding risk, which monitors school devices and personal devices using the school network. It can detect images and video, in an attempt to find pornography. It also checks text against a list of “literally tens of thousands of terms, phrases, euphemisms, slang, in multiple languages associated with a range of behaviours”, to identify inappropriate content or harmful behaviour. When there is a match, a screenshot of the material is sent to a team of “multilingual behaviour specialists”, who review it. Depending upon how serious they deem it to be, the school is notified immediately or as part of a weekly or monthly report. It is then for the school to deal with the matter.313
326.Mr Donkersley told us that the monitoring goes beyond detecting inappropriate content: it can detect speech related to “paedophile grooming, child abuse, FGM, bullying, self-harm risk and so on and so forth.” The system can also detect “depression indicators”, which could be “either cries for help or low-level things going a bit awry at home”.314
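To make the shape of such a system concrete, the sketch below is a purely illustrative, minimal version of keyword-based triage written in Python. It is not e-Safe’s implementation; the watch list, severity levels and function names are all hypothetical:

    # Minimal illustration of keyword-based safeguarding triage (hypothetical).
    WATCH_LIST = {
        "example grooming phrase": "urgent",   # notify the school immediately
        "example self-harm term": "urgent",
        "example slang term": "routine",       # include in a weekly/monthly report
    }

    def triage(text):
        """Return (term, severity) pairs for any watch-list terms found in text."""
        lowered = text.lower()
        return [(term, severity)
                for term, severity in WATCH_LIST.items()
                if term in lowered]

    # In a real system a match would trigger a screenshot for human review;
    # here we simply print the matches.
    for term, severity in triage("… example slang term …"):
        print(f"flagged {term!r} ({severity})")

Even in this toy form, the approach makes plain why witnesses raised privacy concerns: every piece of text a child types must pass through the matcher before a human ever decides whether it matters.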
327.DefendDigitalme was highly critical of Department for Education data policy and practice, especially in regard to what it viewed as “statutory web surveillance, [which will] affect children across all State education, age 2-19 in England.” According to DefendDigitalme this and other policy changes were “characterised by lack of transparent due diligence, public engagement, or democratic debate before imposing significant policy with far reaching potential, and that encroach on children’s rights.”315
328.With regard to e-Safe Systems, Mr Donkersley explained that schools are responsible for developing and managing the process by which children or parents consent to monitoring: students, parents and staff sign a consent form which includes terms of use for electronic equipment.
329.Mr Donkersley told us that his system had detected “some horrible situations”, and that as a result “We have managed to alert somebody to intervene and protect, help and support an individual”.316 He also explained that his system was limited to school devices or personal devices used in the school environment, and that information about children was anonymised and encrypted. Nonetheless, we are concerned by a system under which any personal thought, once typed out, can potentially be scrutinised in this way.
330.In order for schools to undertake the responsibilities that we have outlined above, they need to allocate sufficient time, resources and personnel to meet the task.
331.The Government should ensure that schools are sufficiently resourced and directed to meet their child protection obligations, including the ability to train their teachers, to develop digital policies which are right for them, and to discern what sort of filtering and monitoring systems are appropriate, alongside pastoral care, education and support for parents.
332.We caution that internet safety systems should not undermine children’s rights to privacy, to learn about the world and to express themselves. The Government should require schools to obtain the informed consent of parents and students, and they should have the opportunity to opt out.
333.Dr Bush emphasised that children are fundamentally “active” users of the internet: “When we talk about the internet, sometimes we assume that children and young people can be protected from everything, yet frequently they are the people creating as well as using that content.”317
334.It is therefore also children’s responsibility to practise self-governance, but they must be supported by “community guidelines with clear ground rules of what is and is not acceptable, including hateful content, nudity or sexual content, and online harassment and bullying.”318
335.Dr Bush explained ways that children’s responsibility for one another can go beyond self-restraint. Peers can provide positive role models for one another and be a source of information to encourage digital literacy.
336.The Committee was struck by the repeated calls for research, particularly qualitative research with young people. Dr Dickon Bevington called on mental health professionals to ask child patients about their internet use. Dame Sue Bailey, chair of the Children and Young People’s Mental Health Coalition, described mental health services as a:
“Car crash waiting to happen … if we want a sustainable society that can help young people support themselves. We need to listen to them more about what their problems are, with all the risks that surround them, such as all [those] that come out of social media.”319
337.FOSI also argued that Government funding should be provided for research into the social aspects of the internet, “online behaviours and educational efforts that promote digital literacy and parental engagement”320.
338.Children are often the first to encounter problems online because they are digitally active, but they are often not consulted about the nature of those problems. We recommend that the Government should commission research based on in-depth consultation with children. We note that, because of the rapid nature of technological change, public policy may on occasion have to anticipate the conclusion of long-term research. Such research should include:
339.There is no comprehensive system of regulation of the internet in the UK.321 Instead, businesses operate under a system whereby they are expected to develop and employ good practice and self-governance. In specific areas where regulation exists, for example in respect of advertising and inappropriate content in video on demand, a system of ‘co-regulation’ has developed, whereby businesses work in partnership with regulators such as Ofcom to develop and enforce good practice. BT explained the advantages of the UK’s current approach:
“We believe that the multi-stakeholder self- and co-regulatory approach has made the UK a world leader in online child protection. It has allowed children to experience and reap the benefits of the internet whilst improving online safety via technical tools and providing the education, awareness and skills to allow children, parents and teachers to manage and avoid risks.”322
340.The Government agreed:
“Self-regulation also allows a broad range of interested parties to participate and can be an effective way of coming up with innovative and effective solutions to issues which, due to the nature of the internet, are often global.”323
341.However, John Carr of CCCIS argued that the current regulatory regime is not adequate in the face of fast-changing innovation:
“I think we should try to establish, either through law or culturally, that any and every company has a duty of care to children if it brings out a new product or a new service, just as it does in the physical world. If you bring out a new iron, a new toaster, a new motor car or a new anything, there is a whole set of hoops that you have to go through to prove that it is fit and proper to be put in the marketplace in which you are about to put it … That idea of establishing a duty of care would be a very big step.”324
342.Parent Zone agreed:
“It is rather like some playgrounds having play equipment that children routinely fall off. It is unfeasible in the offline world that such a playground would be allowed to continue without some warning information for parents.”
343.Parent Zone went on to argue for specific changes to increase child protection:
“It is time for the Children’s Act and Working Together to Safeguard Children guidance to be reviewed to consider whether a legal duty of care could be included to ensure that services that identify a child experiencing harm are required to report that child to the appropriate authority. Online services have a unique window into children’s lives. It cannot be right that they are allowed to look through that window, observe a child experiencing harm and have no legal duty to do anything with that information.”325
344.Wendy Grossman, on the other hand, was highly critical of this notion of a duty of care:
“Successive British Prime Ministers have sought to make Britain “the best place in the world for ecommerce”. Requiring advance permission for all such experiments, Carr’s idea here, would effectively void that long-held policy … I think policy makers would struggle to define such a “duty of care” for services and technologies that are still in the research phase.”326
345.Many witnesses argued that it is simply not possible for legislation to account for all possible future changes. Sky noted, “For legislation to be truly effective, it needs to be developed so that it can be applied and enforced globally. This is clearly a significant challenge … Care needs to be taken to avoid focusing on in-country operators subject to local regulation, but to ensure that the largest and most popular global platforms are part of any response.”327
346.Indeed, SWGfL were concerned that unilateral legislation on the part of the UK might cause internationally based companies to withdraw “both from UK geographical locations and from agreements shown above, returning to foreign territory and therefore not engaging as they currently do.”328
347.Dr Nash and Dr Slavtcheva-Petkova argued that any legislative burdens placed upon companies should be proportionate, particularly given the need for evidence to establish causal links “that translate particular risks into harms for specific children.” In the view of Dr Nash, “Policy intervention may still be justified in this area, however any resultant interventions should be understood as ‘precautionary’ only, implying a particular responsibility to ensure proportionality, frequent review of policy efficacy as well as reconsideration if new evidence emerges.”
348.However, YouthLink Scotland wrote, “An agile mind set is required in order to future proof legislation. Rather than referring to specific platforms or sites, legislation should focus on the common elements and principles.”329
349.It further recommended that “Child’s Rights Impact Assessments are carried out on future legislation in order to ensure that children and young people will not be adversely affected. This is in line with the recent UN Convention on the Rights of the Child Concluding Observation 9a.”330
350.The law firm Schillings undertook a review of existing English law to see whether the 5Rights Framework was adequately reflected in it. It concluded that while the laws are “broadly and theoretically sufficient to provide protection for children’s rights online, the key issue is that the laws are routinely ignored”. The author suggests that there is a lack of will and resources directed at applying rights on behalf of children online.331
351.The Children’s Media Foundation argued that “UK regulators need to have ‘teeth’ to ensure that regulation can be enforced”, while conceding that it is important not to stifle innovation. In its view, “the status quo which is based on self-regulation is not adequate to ensure”332 that an ethos of mutual trust between the digital sector and parents is maintained.
352.We find that the current regime of self-regulation is underperforming, and we believe that a step change, led from the highest level of Government, is needed to represent the needs of children online.
353.Any future policy should be based on principles which firmly place children’s rights, wellbeing and needs as the pre-eminent considerations at all points of the internet value chain where the end user is a child. This shared responsibility requires all stakeholders to play their part, and all parties to sustain their commitment to children’s wellbeing, in a rapidly changing landscape that will soon include the Internet of Things and Artificial Intelligence.
354.In its report, the House of Lords Digital Skills Committee concluded that “the Government should act as the ‘conductor of the orchestra’ and play an enabling role” in order to lead the UK through changes brought about by changing technologies.333 It also advocated that the Government’s efforts should be better coordinated and that it should develop a ‘Digital Agenda’ for the UK, which would include as an objective that “No child leaves the education system without basic numeracy, literacy and digital literacy”.334
355.According to BT, the current system of self-regulation requires multi-stakeholder collaboration in order to maximise the benefits and minimise the risks of children’s online activity. It recommends that “The Government would be an appropriate convener of academia, civil society and business”.335
356.Internet Matters further explained how this should work:
“It is the role of government to set clear direction and strategy that engages all parties and effectively uses the resources that already exist.”336
357.The NSPCC welcomed the work of the Government to help protect young people online, but noted “Some companies were more committed than others. In a self-regulatory regime, the Government is in a strong position to make decisions about uniform standards, guidance and best practice”.337
358.Three, a mobile network operator, concluded that it should be for Government to “adopt a strategy that empowers the parents, teachers, guardians and children alike - viewing filtering, age verification, and other technological methods as supporting tools for staying safe online, in addition to a critical skillset that prepares children and adults to deal with a wider variety of situations.”338
359.The Children’s Charities Coalition on Internet Safety emphasised the need for urgency:
“Finding ways to help parents to help their children get the most out of the internet while remaining safe is a major and urgent societal challenge. We cannot blithely assume it is a problem which will solve itself with the passage of time.”339
It advocated that the matter should be treated as a public health issue.
360.The Government has a key role in providing an appropriate framework for stakeholders to act in a concerted, joined-up way. It has a particular obligation to comply with the UN Convention on the Rights of the Child to ensure that children’s wellbeing is protected, to promote children’s right to be heard in matters that affect them, and to act in the best interests of the child in all cases.
361.Our inquiry showed that the subject of ‘Children and the Internet’ covers a number of Government Departments. We interviewed three ministers: Nicola Blackwood from the Department of Health, Edward Timpson from the Department for Education and Baroness Shields, Minister for Internet Safety and Security at the Department for Culture, Media and Sport as well as being a minister in the Home Office. Baroness Shields’ role was altered in December 2016 when she was appointed as the Prime Minister’s Special Representative on Internet Crime and Harms and became solely a Home Office minister. We are concerned that the apportionment of responsibility and accountability is still not clear, and we detect a danger of departmental segregation.
362.We note that the Secretary of State for Culture, Media and Sport has announced a new Internet Safety Strategy initiative involving ministers and officials from departments across Government including the Home Office, Department for Education, Department of Health and Ministry of Justice. We recognise the primary intention of this is “preventing children and young people from harm online”.340
363.While we commend this, we are concerned on two fronts. First, we are concerned that the focus of the Government’s policy is primarily danger and risk. We call on the Government to recognise that rights, literacy and education are as important in equipping children with the necessary tools to navigate the online world. Second, we note that there have been meetings and reviews in the past without sufficient progress. We are concerned that the recommendations of this report will not be implemented meaningfully in the long term without commitment to agree minimum standards and without a champion to advance them.
364.We recommend that the Government should establish the post of Children’s Digital Champion at the centre of the Government within the Cabinet Office, with a remit to advocate on behalf of children to industry, regulators and at ministerial level across all Government departments.
365.The remit of the Children’s Digital Champion should include:
366.We welcome the Government’s promotion of an Internet Safety Strategy and the intention to hold round table meetings with industry leaders. We see this as the opportunity for the Prime Minister to take forward the recommendations of this report culminating in a summit which would establish minimum standards for child-friendly design, filtering, privacy, data collection, terms and conditions of use, and report and response mechanisms for all businesses in the internet value chain, public bodies and the voluntary sector.
367.The standards should be set out in a code of conduct, which should also seek to promote digital literacy. If industry fails to implement the recommendations, then the Government should take action. The UK must be an exemplar in raising standards.
368.We further recommend that the Government should commission a version of the code of conduct written by children for children, building on the ‘in depth’ contributions of young people in existing research.
369.We note the NSPCC’s suggestion for creating a user-generated age rating system. We recommend that the Children’s Digital Champion work with others to investigate the potential of such a scheme.
370.The Committee believes that the role of the UK Council for Child Internet Safety in research and in convening stakeholders should continue, but that, in order to enhance its effectiveness, it should report to the Children’s Digital Champion, who has independence from industry and access at ministerial level. Its remit and membership should be extended to support a broader delivery that includes children’s rights, digital literacy and industry codes, as well as safety.
371.The Government should also involve further education providers as well as universities and encourage them to incorporate the standards and the code of practice in relevant courses.
372.It is the Committee’s view that this issue is of such critical importance for our children that the Government, civil society and all those in the internet value chain must work together to improve the opportunities and support where the end user is a child. Ultimately it is for the Government to ensure that this happens. We look forward to its response to this report.
207 Written evidence from Parent Zone (CHI0011), citing A shared responsibility, building children’s online resilience, Dr Andrew Przybylski, 2014: http://parentzone.org.uk/article/building-childrens-online-resilience, and New directions in socialization research, Diana Baumrind, 1980. The other factor was digital literacy.
210 Written evidence from Family Online Safety Institute (CHI0033). Family Online Safety Institute, Parents, Privacy & Technology Use (November 2015): https://www.fosi.org/policy-research/parents-privacy-technology-use/
213 Written evidence from Three (CHI0016). Three cited uSwitch, Parents’ grip slipping on youngest children as four-year-olds surf internet unsupervised (6 January 2016): https://www.uswitch.com/media-centre/2016/01/parents-grip-slipping-on-youngest-children-as-four-year-olds-surf-internet-unsupervised/
215 Written evidence from the Children’s Charities’ Coalition on Internet Safety (CHI0001), (CHI0057)
220 Family Online Safety Institute, Parents, Privacy & Technology Use (November 2015), p 22: https://www.fosi.org/policy-research/parents-privacy-technology-use/ [accessed 13 March 2017]
231 See Appendix 6.
234 This is the Child Exploitation and Online Protection Centre (CEOP).
236 Department for Culture, Media and Sport, UK Council for Child Internet Safety (UKCCIS): https://www.gov.uk/government/groups/uk-council-for-child-internet-safety-ukccis [accessed 13 March 2017]
247 BBC, ‘Facebook failed to remove sexualised images of children’ (7 March 2017): http://www.bbc.co.uk/news/technology-39187929
248 Australian Government, Office of the Children’s eSafety Commissioner: https://esafety.gov.au/about-the-office/role-of-the-office [accessed 17 March 2017]
249 Children’s Commissioner, Growing Up Digital: A report of the Growing Up Digital Taskforce (January 2017): http://www.childrenscommissioner.gov.uk/sites/default/files/publications/Growing%20Up%20Digital%20Taskforce%20Report%20January%202017_0.pdf [accessed 23 February 2017]
250 Children’s Commissioner, Growing up Digital: A report of the Growing Up Digital Taskforce (January 2017), p 13: https://www.childrenscommissioner.gov.uk/news/children-left-fend-themselves-digital-world [accessed 22 February 2017]
251 See Appendix 7.
253 Ofcom, Report on Internet Safety Measures: Strategies of parental protection for children online (16 December 2015), Executive Summary, pp 3–5: http://stakeholders.ofcom.org.uk/internet/internet-safety-dec-2015
256 Select Committee on Trade Union Political Funds and Political Party Funding, Report (Report of Session 2015–16, HL Paper 106), p 18
262 Open Rights Group, Blocked: http://www.blocked.org.uk [accessed 13 March 2017]
265 See, for example, Q 117 (Simon Milner) and written evidence from South West Grid for Learning (CHI0009)
271 The 5Rights principles are the Right to Remove, the Right to Know, the Right to Safety and Support, the Right to Informed and Conscious Use, and the Right to Digital Literacy.
287 BBC, ‘Lego launches “safe” social network for under-13s’ (31 January 2017): http://www.bbc.co.uk/newsbeat/article/38806540/lego-launches-safe-social-network-for-under-13s [accessed 2 March 2017]
300 5 Rights, http://5rightsframework.com/ [accessed 24 January 2017]
306 Department for Education, Schools to teach 21st century relationships and sex education (1 March 2017): https://www.gov.uk/government/news/schools-to-teach-21st-century-relationships-and-sex-education [accessed 13 March 2017]
319 ‘Act on children’s mental ill health or risk national crisis, warns expert’, The Guardian (1 October 2016): https://www.theguardian.com/society/2016/oct/01/fund-nhs-child-mental-health-services-to-avoid-crisis [accessed 24 January 2017]
333 Select Committee on Digital Skills, Make or Break: The UK’s Digital Future (Report of Session 2014–15, HL Paper 111)
334 Ibid.
339 Written evidence from the Children’s Charities’ Coalition on Internet Safety (CHI0001); see also written evidence from Parent Zone (CHI0011).
340 Department for Culture, Media and Sport, ‘Government launches major new drive on internet safety’: https://www.gov.uk/government/news/government-launches-major-new-drive-on-internet-safety [accessed 13 March 2017]