15. In the UK, an estimated 32 million people play video games. For the vast majority, playing games is a positive hobby and form of entertainment enjoyed as part of a balanced lifestyle. As trade body Ukie identifies, gaming offers many “educational, physiological, psychological, recreational and social benefits” including the ability to connect with other players. For some, including those with physical disabilities or forms of neurodiversity, gaming can be a profound source of enablement and connection. However, there is a minority of players who experience significant challenges related to gaming, such as struggling to maintain control over how much they are playing. This was articulated by Dr Daria Kuss of Nottingham Trent University, who told us:
If you take the whole population of gamers, involving millions of people all around the world, only a very small percentage are developing problems that may be associated with addictions […] Although we do not want to over-pathologise something that is a very enjoyable pastime activity for the large majority of gamers, we do need to be aware of the significant problems that a small minority do experience.
16. We heard from two members of the Game Quitters network—a global community that offers support to people who are, in their own words, suffering from “video game addiction”. James Good told us that during his first year of university he played for “32 hours straight” without eating, sleeping or leaving his room. It was only when his studies started to suffer that he realised the detrimental impact that his gaming was having. Similarly, Matus Mikuš told us about the impact that playing up to 12 hours a day had on his social life and relationships:
I was very comfortable online, I was very good at the games I would play. Then I would go outside for a lecture and would not know how to interact with my friends or people around me. It is a feedback world. You go online, you are doing well and feel good. You then go outside, it is stressful and you do not know how to relate to people. You go back and play a bit more and feel happy. It is self-perpetuating in that way but I think it made me much less open to people in the real world.
17. James and Matus’s experiences were echoed by Dr Henrietta Bowden-Jones, the spokesperson on behavioural addictions for the Royal College of Psychiatrists. She told us that from a behavioural perspective disordered gaming is accompanied by “a withdrawal from real life” in which:
The individual spends more time online and invests in online relationships. The fact that these are often global relationships means that across multiple time zones it is possible never to want to go to bed. A lot of people I have treated dropped out of university or school because they were compelled—and some of them did have slightly obsessional traits—not to let go because they feared that in doing so they would be letting their new friends and new teams down, having already isolated themselves from their real-life cohorts who used to drive them and give them pleasure.
18. In 2018, the World Health Organisation formally included ‘gaming disorder’ in its International Classification of Diseases. The WHO characterises gaming disorder as a “pattern of persistent or recurrent gaming behaviour”, either online or offline, that is manifested by:
(1) impaired control over gaming (e.g., onset, frequency, intensity, duration, termination, context);
(2) increasing priority given to gaming to the extent that gaming takes precedence over other life interests and daily activities; and
(3) continuation or escalation of gaming despite the occurrence of negative consequences.
It goes on to say that this behavioural pattern is “of sufficient severity to result in significant impairment in personal, family, social, educational, occupational or other important areas of functioning.” This was again echoed in evidence to us by Dr Bowden-Jones, who summarised the negative consequences of disordered gaming as “a loss of control over what matters most to you”.
19. We heard that the WHO’s decision was not uncontroversial. Academics from 34 international institutions cautioned the WHO against including the term, on the basis that formally diagnosing disorders requires a “stronger evidence base than we currently have.” However, supporters pointed to the growing numbers of people around the world who are seeking treatment “because they are suffering from functional impairment related to GD [gaming disorder] symptoms”, and the lack of suitable services for them, as evidence of an “unmet need”.
20. This debate was reflected in the evidence we received. On the one hand, it is argued that high engagement with gaming is not inherently harmful, and may in fact be a coping mechanism for other underlying conditions. Moreover, as evidence to us stated, it is held that misuse of the term “addiction” in this context implies:
that video games, or digital technologies, are inherently harmful, such that excessive use is a cause for concern—in the same way that excessive substance use or gambling is. This assumption is incorrect, however: such technologies are, by their very nature, immersive and interactive hobbies. Therefore, a clear distinction needs to be made between high-engagement (yet unharmful) use, and problematic or addictive use.
On the other hand, written evidence from Nottingham Trent University’s International Gaming Research Unit told us about the associations between excessive gaming and “many psychosocial and physical health problems” including:
poorer response-inhibition and emotion regulation, impaired brain functioning and cognitive control, poorer working memory and decision-making, lower visual and auditory functioning, and a reward system deficiency, akin to that found in substance addiction.
21. Overall, the evidence we received acknowledged that there is a spectrum of behaviour in relation to gaming, ranging from positive to harmful, and that this can vary significantly between individuals and over the course of their lives. Indeed, there is evidence to suggest that gaming disorder develops as a response to pre-existing life stress. A study of World of Warcraft players found that:
less stressed individuals manage to play WoW so as to enhance their offline lives. By contrast, more highly stressed players further magnify the stress and suffering in their lives by playing problematically the online game within which they sought refuge from their offline problems.
This evidence aligns with that presented by Matus Mikuš and James Good who both described to us how their problems developed when they left home and went to university. The transition to more independent living at university can be a stressful time for young people and a trigger for mental health issues. Therefore, even if specific games have innocuous effects on most players, for others, events may conspire to create a situation in which they develop problematic behaviours related to gaming.
22. Dr Bowden-Jones told us that she is alarmed by much of the public rhetoric around there being an “‘epidemic’ of […] ‘gaming addiction’” because that is not an accurate reflection of prevalence, especially when compared to other addictions and behaviours. We were also told that a large-scale cross-national study put the prevalence of disordered gaming at 0.3% to 1.0% of players. Applied to the UK’s estimated 32 million players, this suggests that between 96,000 and 320,000 people in the UK experience such significant problems with video gaming.
23. There is broad consensus that research on gaming disorder is “scarce” and more high-quality “longitudinal and epidemiological” studies are needed to understand it. Dr Kuss identified that “within the UK the research base is relatively poor”, especially when compared to southeast Asian countries that have recognised the term for much longer. Professor Andrew Przybylski of the Oxford Internet Institute also told us that many of the studies in this area are of poor quality and lack academic transparency, or are based on self-reporting or diagnostic tools that are inherently flawed. As the then Minister for Digital and the Creative Industries, Margot James MP, observed in her evidence to us:
there is a paucity of research done in this country, which holds the field back and holds treatment of people back.
24. Very little is known about what sort of games or game mechanics are more or less associated with harmful levels of gaming, and what the most effective interventions or treatments are. The high levels of comorbidity—the presentation of two or more disorders in the same individual—related to gaming disorder can also present challenges to those establishing the causal links between gaming and problem behaviours. It is therefore unsurprising that we have heard consistent calls for further data and research on gaming disorder. A group of psychiatrists with an interest in the links between mental health and gaming told us that:
there should be appropriate funding and encouragement of further research into gaming disorder, of a high quality (emphasising pre-registered studies using open data), covering both general and clinical populations.
25. The role of games companies in supporting research into gaming disorder is crucial, as the data they hold on player behaviour could be highly valuable to researchers. We have been told that “governmental pressure should be exerted on the gaming industry to share their data with academic researchers and other interested parties for independent third party analysis.” Furthermore, we would argue that a global industry that generates billions in revenue should contribute financially to research on potential harms associated with its products, as the gambling industry is already expected to do. However, we have also heard strong calls for any resulting research to be independent of the games industry in the interests of legitimacy and impartiality. The then Minister told us that when it comes to understanding the potential harms of an online product such as games, “there is a role there for public funding of research.”
26. When we put to games makers the proposal of sharing aggregated data on player behaviour with researchers, they expressed willingness in principle; however, none were able to point to examples of doing so in practice. Moreover, in its evidence to us the industry has been highly reluctant to acknowledge that it might have a role in understanding gaming disorder. For example, neither Epic Games, which makes Fortnite, nor Electronic Arts, which makes the FIFA series, has commissioned any research into potentially harmful engagement with its games. In response to this, the then Minister told us that if games companies are not sharing their player data, “perhaps it is time they did or at least learned from it themselves.”
27. In its written evidence, the Department for Digital, Culture, Media and Sport told us that the Government is “looking closely at studies and research around the potentially addictive nature of some technologies”. However, it also acknowledged that evidence:
is still emerging, and at this stage it can sometimes provide a conflicting picture. It is important that we take steps to better understand both the positive and negative impacts of new technologies.
It was therefore disappointing that the then Minister was unable to tell us what the Government is doing to facilitate or commission research on gaming disorder, and was unfamiliar with the Department’s own ‘areas of research interest’, which makes no mention of it.
28. While the academic evidence base may take time to establish, we have heard calls for improved support for people with gaming disorder in the UK. Dr Bowden-Jones told us that she does not “know of any free, evidence-based, high-quality intervention that uses the latest research-based techniques” and that she has:
a list of about 40 letters from desperate people […] waiting for me to see their children, and I cannot see them because I do not have the funding or the commissioning stream to see gamers.
However, she also told us that she was “hopeful” that the NHS would fund gaming disorder treatment in the near future, which in turn could contribute to building the evidence base on prevalence and effective interventions.
29. Dr Bowden-Jones also made a persuasive case that preventing disordered gaming is preferable to treating problems once they have become too severe. She told us:
It is integral for us to implement, in my opinion, campaigns and sessions within the school context for students, teachers and parents to raise awareness of potential problems and to stop the problems from occurring in the first place.
She was also clear that the games industry needs to “play its part” and “take on the responsibility of making sure it is not polluting the world out there with stuff that is harmful.”
30. In direct contrast, representatives of games companies appearing before Parliament for the first time told us that they consider it neither possible nor their responsibility to define what counts as normal or excessive engagement with their games, even though such a definition could help them to intervene to protect players from the potential harms of excessive play. Canon Pence, Epic Games’s General Counsel, told us that he does not consider it the company’s “primary responsibility to determine how much individual players should play Fortnite” because engagement “varies from person to person and it varies from time to time, even, on a person-by-person basis”. Similarly, Shaun Campbell, UK Country Manager of Electronic Arts, observed that harmful levels of play are “what feels out of balance for the individual.” Furthermore, in their evidence to us some industry representatives have disputed the significance and basis of the WHO’s decision, which chimes with Professor Przybylski’s observation that “entrenched interests in industry will interpret an absence of evidence with evidence of absence.”
31. The industry’s evidence to us has instead focused on the positive impacts that games can have. For example, British games company Jagex told us that it invests significantly in mental health awareness and fundraising campaigns, including putting characters from mental health charities in its games so that players can “ask questions and interact with them around issues of mental health.” While such best practice is to be welcomed, it is far from the norm among the companies we have heard from, and by no means all games companies consider it their responsibility to acknowledge or tackle players’ concerns around gaming disorder or other gaming-related harms.
32. Games companies repeatedly stressed the significance of parental responsibility in promoting responsible gaming. For example, Epic Games’s Director of Marketing, Matthew Weissinger, told us:
Parents can monitor play time through things like our weekly play usage report and then take advantage of some of these parental controls around screen time, access and purchasing access, in order to make decisions based on how they would like either their child or somebody else who they share an account with to play the game.
However, as James Good observed, parental controls can be easily subverted when “most young people playing videogames know more about their computers than their parents do.” In an attempt to tackle this, trade body Ukie provides advice to parents through resources such as Askaboutgames.com, and advocates for parents to play games with their children, claiming that “parents who talk to their children, for instance, or play games themselves, have a far better understanding of how to protect them from excessive play time.”
33. Yet we believe that some responsibility still lies with games providers to protect users from disordered levels of engagement, including where parents are unwilling or unable to do so. Moreover, with the average age of gamers in the UK being the mid-30s, and evidence we have heard of disordered gaming being linked to stressful life situations such as starting university, it is clear that problem gaming can occur at any age. The industry’s focus on parental controls does not address the needs of vulnerable adults who may struggle to maintain control over how much they are playing. We find this worrying, especially given that the Royal College of Psychiatrists’ written evidence states that “adults may be more vulnerable to excessive internet use if they have pre-existing depression or anxiety.”
34. Although the vast majority of people who play games find it a positive experience, the minority who struggle to maintain control over how much they are playing experience serious consequences for themselves and their loved ones. At present, the games industry has not sufficiently accepted responsibility for either understanding or preventing this harm. Moreover, both policy-making and potential industry interventions are being hindered by a lack of robust evidence, which in part stems from companies’ unwillingness to share data about patterns of play.
35. The Department should immediately update its areas of research interest to include gaming disorder, working with researchers to identify the key questions that need to be addressed and develop a strategy to support high-quality, independent research into the long-term effects of gaming.
36. The Government should also require games companies to share aggregated player data with researchers and to contribute financially to independent research. We believe that the industry should pay a levy to fund an independent body, formed of academics and representatives of the industry, to oversee research into online gaming and to ensure that the relevant data is made available by the industry to enable that research to be effective.
37. According to Ofcom, 80% of adults, 70% of 12–15-year-olds and 20% of 8–11-year-olds who use the internet have a social media profile. Like gaming, social networking enables people to communicate widely and express themselves creatively. Most users will do so happily; however, there is a growing awareness that for some people maintaining control over social media use is a cause of concern. The Diana Award says that 70% of young people participating in its digital resilience programme “had seen their peers affected by excessive tech/internet use or feeling ‘hooked’ to a device”.
38. The evidence on the links between social media use and mental health is limited, as our colleagues on the Science and Technology Committee found in their recent inquiry on the impact of social media and screen-use on young people’s health. However, a University of Pennsylvania study on the causal link between social media use and mental wellbeing found that limiting use of Facebook, Instagram and Snapchat to 10 minutes per day led to “significant reductions in loneliness and depression”. Moreover, it suggested that simply being more aware of one’s own social media use led to a decrease in “fear of missing out” and anxiety. Dr Jacob Johanssen of the University of Westminster states in written evidence:
The widespread discussion of terms like ‘digital detox’ in the media points to a trend that suggests that many individuals use digital technologies, like social media, to an extent that feels overwhelming and unhealthy. It can lead to difficulties in controlling their online habits. Members of the public report that it is difficult for them to disconnect, unplug and distance themselves from the platforms and apps they use.
39. Jack Edwards, a lifestyle blogger whose YouTube channel has more than 140,000 subscribers, said that although he would not recognise his social media use:
as an addiction in the traditional sense, when I wake up I do check it straight away, it is the first thing I do. When I am at university, I come out of a lecture and the first thing I do is open my phone and check social media. I suppose those are essentially addictive tendencies. Maybe the vocabulary we have to talk about social media does not think of it in that particular way because it is so accessible, it is in your pocket all the time.
He also drew a comparison between social media and online gaming, which offers an “endless universe of possibilities” by enabling people to play with others in different time zones. He observed that:
because there are so many people on the planet and a vast majority will be on social media now there is always someone to talk to, there is always someone awake, there is always someone sharing something and there is always something to discuss.
In particular, YouTube was highlighted as a “rabbit hole”, with the seemingly infinite stream of content enabling people to watch one video after another without making a conscious decision to do so. YouTube’s Marketing Director, Rich Waterworth, confirmed to us that around 70% of the time people spend on YouTube is spent watching videos that have been ‘recommended’ to them by the platform’s algorithms, rather than content they have actively searched for.
40. All the major social media platforms use ‘engagement’ metrics that quantify people’s use and encourage them to extend it. For example, within the image-messaging platform Snapchat, a ‘streak’ is a number icon that counts how many consecutive days two friends have contacted each other. Both Facebook and Instagram detail how many friends or followers a user has, and alert them to the number of times someone has ‘liked’ their post or image. As we shall explore in Chapter 4, these metrics are one of the ways platforms reward users psychologically, which in turn incentivises them to keep using the platforms; however, it also raises concerns about how such metrics affect people’s ability to maintain meaningful control over their use of technology.
41. In order to maintain a Snapchat streak, both users have to send an image within every 24-hour window—if they miss one, the running total is lost. The 5Rights Foundation, founded by Baroness Beeban Kidron, states that this means “children feel unable to disengage” and Tristan Harris told us that “Snapchat has taken over the currency of whether or not kids believe that they are friends with each other.” Indeed, teenagers themselves say that streaks are considered such a measure of friendship and sign of popularity that losing them can cause stress, and maintaining them takes priority over other things, including sleep. Moreover, streaks appear by default and cannot be switched off by users, which Harris describes as an example of the platform using “agency-inhibiting or disempowering architectures.”
42. We invited Snapchat to give evidence to a Select Committee for the first time, and put those concerns to them. Snapchat’s Senior Director of Public Policy, Stephen Collins, replied that the idea behind streaks “was to deepen individual friendships. It was not meant to create extra time on the application”, and committed to reviewing their use on the platform. We will be monitoring the platform’s ongoing design strategies as evidence of whether Snapchat has taken our comments on board.
43. On Instagram, the popularity of a posted image is measured through the number of people who signal they ‘like’ it, and such reinforcement systems may understandably have an impact on self-esteem. At a recent developer conference, Instagram announced that it would test hiding the ‘like’ and ‘view’ count from some users’ photos and videos—although the company would still hold the data about those users’ engagement with content on the platform for advertising targeting purposes. At the platform’s first ever public appearance in Parliament, Instagram’s Head of Product, Vishal Shah, told us that this is to lessen the influence of the engagement metrics that may cause people to compare themselves to others:
The idea is to reduce the pressure around feeling like you are not only competing with yourself on your previous posts but with the rest of Instagram, frankly, and feeling that every time you have to always be perfect and achieve a certain number of likes in order to feel like your expression was worthwhile. That is something that we are looking at more broadly, not just for young people.
44. The effects of engagement metrics sit alongside the fact that platforms such as Snapchat and Instagram use augmented reality technologies to provide users with image-enhancing filters that can be applied to photos before they are shared. There are concerns that ‘beautifying’ filters can have a potentially negative impact on self-esteem, and Instagram was ranked worst by young people surveyed by the Royal Society of Public Health for its effect on body image. These filters allow users to make their lips appear fuller, hips rounder or waist narrower in order to conform to others’ pre-conceived ideas of physical beauty. It is also argued that this can lead to body dysmorphic disorder, with cosmetic surgeons reporting that patients are increasingly bringing ‘filtered’ pictures of themselves to consultations, despite such images being unobtainable using surgical procedures. Vishal Shah told us that the company takes the issue of body dysmorphia “seriously”.
45. Jack Edwards also told us that seeing others share a curated version of their lives on social media can impact self-esteem:
If you are having a bad day and click onto Instagram, instantly, at the touch of a button, you have everyone’s best day of their life right in front of you and it can be really harmful.
Therefore he argued that social media influencers—those with large followings, whose posts influence others’ behaviours or purchasing habits—have a responsibility to present a more realistic image. He told us:
I think it is really important for us as people who portray a lifestyle that is really motivated, productive and proactive to also show this is not every second of every day and we can talk about when we trip up and things do not go to plan.
46. We have heard clear evidence that games, social media and other immersive technologies can expose players to other online harms explicitly in scope of the Government’s proposed legislation. Online games enable players to connect and collaborate in a game remotely by means of voice and/or text chat. Often, players use them to connect with friends, which led Dr David Zendle of York St John University to describe games “as a playground or a social space”. However, Matus Mikuš told us that he found playing League of Legends with strangers “very stressful” because of the verbal abuse he received. He told us that:
After every game my heartbeat would be elevated, and I would be shaking because of the things people say to you. Let us say you die in the game, people would tell you to kill yourself because you are so terrible at the game. This was a very common occurrence, it was almost every game.
Such harassment can be particularly acute for female players, who may find that their “mistakes are amplified” or that they are perceived “not as a person but as a girl”.
47. We have heard that such experiences are by no means new or isolated cases. Marie-Claire Isaaman from Women in Games told us that “the industry recognised that it is a problem” following a 2014 harassment campaign against several women in the video game industry, colloquially known as ‘Gamergate’. Yet Jodie Azhar from diversity initiative POC in Play told us that the industry has been “very good at self-regulating”, including introducing community management and moderators. Electronic Arts told us about work it is doing to combat what it calls “toxicity”, or players behaving badly in games, including developing a machine learning natural language tool to monitor voice chat.
48. Similar challenges around bullying and harassment are seen in other immersive technologies. Some virtual reality environments facilitate social interaction with other VR users—this is known as ‘social VR’. However, concerns have been expressed about the safety of players in social VR, and Sarah Jones told us that recent research found that 49% of female VR users had felt sexually harassed within an immersive experience. She described how “it is the same as harassment that you would feel in everyday life”; however, it still raises the ethical and potentially legal question:
If you feel like you have been harassed within virtual reality, is that the same as being harassed in a non-virtual world?
Yet we again heard that the industry has introduced measures to tackle harassment. For example, social VR has seen the introduction of personal ‘safety bubbles’ that surround virtual avatars and prevent other users from invading virtual personal space, as well as the ability to mute users.
49. Ofcom’s research indicates that 38% of 8-to-11-year-olds and 58% of 12-to-15-year-olds use chat features within online games to talk to others. Yet this behaviour, and the popularity of livestreaming platforms such as Twitch, which enables players to stream video of themselves playing games, fuels concerns about grooming. For example, an NSPCC survey of nearly 40,000 school children found that 6% of young people who had livestreamed had been “asked to change or remove their clothes”. We also heard from James Good and Matus Mikuš that it can be “hard to tell” how old a person communicating through in-game chat is, especially as it is “really easy” to “download software to change your voice” and pretend to be someone you are not. This is of particular concern to us as Ofcom found that a quarter of 12-to-15-year-olds chat in games to people they only know online, with boys of that age twice as likely as girls to chat to people they only know through gaming.
50. We felt that the games industry’s response to the challenge of protecting users from the risk of grooming was mixed. Electronic Arts’s Vice President for Legal and Government Affairs, Kerry Hopkins, told us that the company does “not have a monitoring policy” for its in-game chat. When we raised concerns that children have been groomed through Fortnite, Matthew Weissinger admitted that Epic Games is not able to protect players before an act of harm is committed. However, evidence our Sub-committee on Disinformation has heard, in particular from the Finnish company Utopia Analytics, demonstrates that natural language tools already exist to enable companies to monitor chat for patterns of harmful behaviour. Epic’s General Counsel Canon Pence acknowledged that “there is room for growth and sophistication on our side” in the company’s use of technology and policies around safeguarding. More encouraging was the evidence from British games company Jagex, whose Director of Player Experience, Kelvin Plomer, described the company as “among the leaders in our space in chat moderation and screening”. As well as identifying references to suicide and self-harm, the company says that it reviews “all chat 24/7 for additional triggers, particularly around areas of sex and minors”, which are reviewed manually and escalated to law enforcement if necessary.
51. Immersive technologies including online games and virtual reality facilitate interaction between users and user-generated content. Given the technological sophistication of the games industry, and the popularity of its products among children, there is more that companies could be doing to safeguard players, and the future online harms regulator will need to pay due attention to monitoring the industry’s efforts in this regard.
52. In the UK, games are age-rated using the PEGI (Pan European Game Information) system. This includes an age-rating system and eight content labels warning consumers that games contain features such as bad language, violence or drug-use. To obtain a rating, game developers complete an assessment questionnaire about a game’s features and content. The Video Standards Council—the statutory body responsible for the age rating of video games in the UK using the PEGI system—then compares that assessment to video footage and examples of the game play. Finally, the full game is tested and the company receives a licence stating which rating labels the game must display. Trade body Ukie states that “only 4% of games across Europe are rated 18+, and 49% of all games are rated suitable for all (rated 3).”
53.Yet the legal status of age-ratings differs across the games industry’s various distribution methods, which does not provide clarity for consumers. In the UK, the physical distribution of games is regulated under the Video Recordings Act 1984 and its subsequent amendments. Under the Act, PEGI 12, 16 and 18 rated games cannot legally be sold in a physical format to anyone under those ages. However, the 1984 Act has not kept pace with the changing technology of video games, especially given the wholesale move towards online distribution.
54.At present, games that are published and played online are not subject to a legally enforceable age-rating system—instead, PEGI ratings and the new International Age Ratings Coalition (IARC) system serve a voluntary, consumer-advisory role. Ian Rice of the Video Standards Council explained that this is because “the Video Recordings Act applies to devices capable of storing media”—in essence, “a physical disk rather than online distribution.” He went on to explain that while games companies may voluntarily display the PEGI age-rating for their games:
Legally speaking, there are no restrictions as to who those storefronts can sell those products to.
This has led to a situation where Epic Games can make Fortnite: Battle Royale, a PEGI 12 rated game, available to any player through its website without asking their age—although as we shall explore in Chapter 4, this approach presents separate problems regarding data protection. As the British Esports Association observed in written evidence:
There are some competitive games whose digital download versions do not carry an age rating, and that is troubling.
55.The plethora of distribution methods for online games also highlights the challenge of enforcing age-ratings across the distribution of games. The games studios that develop games are rarely the only ones who distribute them. Instead, there is a range of what Epic’s Canon Pence referred to as “delivery mechanisms”, which can have different voluntary approaches to displaying or enforcing age-ratings. For example, the British Esports Association told us that “platforms like Steam do not have to adopt enforced age ratings on their games. This is because every country’s law is different.”
56.Representatives of the games industry have also expressed concern that not all parents take age-ratings, or their responsibilities to observe them, seriously. Timea Tabori told us that:
A lot of games are still viewed as children’s toys, even though a lot of video games are rated 18-plus. If you go into a physical game shop, as much as they are on the decline, you will see adults, parents, purchasing 18-plus rated games for their 12 year-old kids. I have seen it happen.
She went on to argue that parents have a responsibility to understand the impact that games might have and to talk about that with their children:
If you do not recognise it as something that can have a powerful impact on you or the way you see the world, then you are surprised when it changes the way you see the world or the way you interact with the world.
57.There are inconsistencies in the games industry’s self-regulation around the distribution of games. If companies hold that it is not their responsibility, but that of parents, to enforce age ratings, and parents themselves are not willing or able to do so, further legislation may be needed to protect children from playing games that are not appropriate for their age. This could include extending the statutory duties that apply to physical distribution to the online distribution of games. Likewise, games companies should not assume that the responsibility to enforce age-ratings applies exclusively to the main delivery platforms: all companies and platforms that are making games available online should uphold the highest standards of enforcing age-ratings. The Video Recordings Act should be amended to ensure that online games are covered by the same enforceable age restrictions as games sold on disks.
58.Through our previous inquiry on disinformation and ‘fake news’, and our work as part of the International Grand Committee, we have repeatedly questioned online platforms, including YouTube, on their failure to remove harmful content in a timely fashion. Moreover, we have attempted to understand how these companies’ revenues compare to their expenditure on content moderation, which we believe is illustrative of the lack of priority and attention paid to the issue.
59.We have heard commitments from platforms that they will do more; however, that does not seem to stop violent or otherwise distressing content from being shared on their platforms, such as occurred with the attack in Christchurch, New Zealand. Following that tragic incident, YouTube’s Public Policy Director, Marco Pancini, told us that:
We learn from the failure of some of the systems that we put in place, and that is why […] changing our policy is a way to address the risk that something like this can happen in the future, and make sure that the content that is related to a video uploaded of this nature is blocked as soon as possible.
60.Instagram has also been criticised for the time it has taken to remove violent images, and content related to self-harm, from its platform. We asked what more the platform could do to use the data it holds on its users to identify and support people who may be at risk of self-harm. While Instagram uses language recognition tools to offer support to users whose behaviour on the platform indicates they may need it, Vishal Shah said that the company will:
think about how we can use all of the signals that we have on the platform, all the ways that people interact with content, and the accounts that they follow, to understand if there is additional risk, and from a product perspective, see if we can get ahead of some of these issues.
15 Ukie ()
16 Ukie ()
17 Dr Paul Cairns and Dr Christopher Power ()
19 , accessed 22 July 2019
24 World Health Organisation, , (Version: 04/2019), 6C51
28 Antonius J van Rooij, et al. “” PsyArXiv, (8 February 2018), Web
29 Hans-Jürgen Rumpf, et al., “,” Journal of Behavioral Addictions, vol. 7.3, (16 July 2018)
30 Professor Andrew Przybylski, Netta Weinstein, Pete Etchells and Amy Orben ()
31 Professor Andrew Przybylski, Netta Weinstein, Pete Etchells and Amy Orben ()
32 The International Gaming Research Unit/ Cyberpsychology Research Group, Nottingham Trent University ()
33 The International Gaming Research Unit/ Cyberpsychology Research Group, Nottingham Trent University ()
34 Jeffrey G. Snodgrass, et al., “”, Computers in Human Behavior, vol. 38, (2014), pp 248–260
36 The International Gaming Research Unit/ Cyberpsychology Research Group, Nottingham Trent University ()
37 The International Gaming Research Unit/ Cyberpsychology Research Group, Nottingham Trent University ()
39 Qq29, 31
41 The International Gaming Research Unit/ Cyberpsychology Research Group, Nottingham Trent University ()
42 Gaming the Mind ()
43 The International Gaming Research Unit/ Cyberpsychology Research Group, Nottingham Trent University ()
44 Qq68, 1536
46 Qq539, 755
47 Q1078 ff.
49 Department for Digital, Culture, Media and Sport ()
50 Qq1528, 1545
55 Qq1140, 1078
57 Qq25, 1126 ff., 1391
62 Qq1404, 129
63 Royal College of Psychiatrists ()
64 Ofcom, , (May 2019), p 9 and , (January 2019), p 8
65 The Diana Award ()
66 Science and Technology Committee, Fourteenth Report of Session 2017–19, , HC 822
67 Melissa G Hunt, et al. “”, Journal of Social and Clinical Psychology, vol 37.10, (2018), pp 751–768
69 Dr Jacob Johanssen ()
74 5Rights Foundation () and oral evidence taken on 22 May 2018, HC (2017–19) 363,
75 “”, Business Insider, 14 April 2017
76 Oral evidence taken on 22 May 2018, HC (2017–19) 363,
78 “”, CNN, 30 April 2019
80 Royal Society for Public Health, , (May 2017), p 23
81 Susruthi Rajanala, Mayra B.C. Maymone, and Neelam A. Vashi, “”, JAMA facial plastic surgery, vol. 20.6 (August 2018), pp 443–444
83 We will pursue the role that social media’s influencer culture plays in promoting unhealthy or unrealistic expectations through our inquiry on reality TV.
91 Qq44, 47
94 Ofcom, , (29 January 2019), p 7
95 NSPCC ()
96 Qq151 [Mr Mikuš], 251 [Mr Good]
97 Ofcom, , (29 January 2019), p 7
99 Q1209 ff.
100 Oral evidence taken before the Sub-Committee on Disinformation on 20 June 2019, HC (2017–19) 2204,
104 Ukie ()
105 “”, The Telegraph, (8 December 2017)
109 British Esports Association ()
111 British Esports Association ()
114 Oral evidence taken on 28 May 2019, House of Commons Canada, Standing Committee on Access to Information, Privacy and Ethics,
115 Oral evidence taken on 8 February 2018, HC (2017–19) 363, ff.
117 Q927 ff.
Published: 12 September 2019