Changing the perfect picture: an inquiry into body image

6 Body image harms online

Social media

92. Social media’s impact on body image was raised repeatedly throughout the inquiry and in our survey. We found that the majority of young people spend over two hours on social media on a typical day, and that social media had the biggest influence on their body image of all media. Most adults also reported spending two hours a day on social media and said that it was one of the biggest influences on how they felt about their appearance.175

93. The NHS reported that daily social media use was more common in young people with a mental health disorder: 87% of 11–19-year-olds with a mental health disorder used social media every day, compared to 77% of those without a disorder. 24% of daily users with a disorder were on social media for over four hours a day, compared with 12% of those without a disorder. Additionally, young girls with a mental health disorder were more likely to compare themselves to others on social media.176

94. Professor Widdows told the Committee that social media can fuel body image anxiety because our peer comparison group expands from the people we know in our own communities to almost everyone in the virtual world.177 The Centre for Appearance Research told us that research into the dramatic increase in social media use over the last decade has highlighted that engagement with social media is associated with poor body image, as well as with the desire to undergo cosmetic surgery.178

95. We heard that social media is a space in which people can face appearance-based bullying and harassment, something particularly experienced by groups protected by the Equality Act. Changing Faces told us that 40% of people with a visible difference have had negative experiences online, including significant trolling and online abuse. One in ten said they had been repeatedly harassed on social media, and that these negative behaviours had stopped them using it completely.179 Stonewall told us that 40% of LGBT young people, including 58% of trans young people, had personally been the target of homophobic, biphobic and transphobic abuse online, and that 65% of LGBT people think that online platforms are unlikely to do anything about tackling such abuse when it is reported to them.180

Our inquiry sought to determine the impact of different elements of social media on body image. We found that:

Posts from friends. Social media users are likely to encounter a stream of highly idealised images that portray narrow appearance and body ideals. While in traditional media these images would typically be of celebrities and models, social media presents the additional opportunity for users to view content generated by their peers. Research shows that women are more likely to make appearance comparisons through social media than traditional media, which is associated with poor body image.181

Posts from influencers and celebrities. Many people follow celebrities, and some will form strong connections with them. Where individuals feel they have a special relationship with a celebrity, known as ‘celebrity worship’, researchers have found a significant association with negative body image.182

Content promoting eating disorders and diet culture. Pro-eating disorder content can be found on all social media platforms and can include ‘thinspiration’ (images and messages idolising thinness), ‘bonespiration’ (images and messages idolising emaciation, in which bones such as hipbones, ribs and the spine are clearly visible), and tips to maximise weight loss. This can cause people to develop or exacerbate negative body image and disordered eating. Women who follow “health food”, “clean eating” or “fitness” accounts on Instagram demonstrate higher levels of disordered eating, such as extreme dietary restriction and preoccupation with health, than the general population.183

Content promoting cosmetic surgery/interventions. Studies have found that cosmetic surgery adverts elevate body dissatisfaction. There have also been reports that consistent filtering of pictures is leading to ‘Snapchat dysmorphia’, where people seek surgery to look more like they do in their edited photos on social media.184

96. We did, however, hear evidence that social media can have a positive impact on body image. A witness with lived experience of poor body image told us that social media allowed them to connect positively with other users who have burns and scarring, to share their story, and gave them the confidence to engage with people on appearance issues.185 A respondent to our body image survey also told us that social media allowed them to see people like themselves, particularly when they have been underrepresented in other forms of media.186

Advertising via social media

97. Adverts on social media can take a variety of forms, including user-generated adverts. These adverts can come from fake accounts which look like ‘normal people’ rather than influencers; one example is the ‘Keto Activator’ adverts that Which? found in 2019 to be using false claims to sell products promising miraculous weight loss.187 The ASA noted that complaints about ‘influencers’ dominate its caseload.188

98. We heard from Girlguiding UK that girls viewed adverts online as more harmful in their use of gender stereotypes, with a particular emphasis on the harm caused by body and appearance issues. Additionally, they felt adverts on social media were more invasive because they are embedded in the place where individuals engage with friends, and are harder to disengage from than adverts on TV. We also heard that social media adverts were more exploitative, as they are targeted based on previous search history and on stereotypical ideas of what young women might be interested in, such as beauty or weight loss products.189

99. The ASA’s gender stereotyping research in 2017 showed that user-generated content, advertisements and paid-for online content contribute to a culture of idealised appearance, which can in turn lead to self-esteem issues and body dissatisfaction.190

Are social media companies protecting their users from body image harms?

100. During our inquiry we received evidence from Facebook and Instagram, Twitter, TikTok and Snapchat on their policies to protect users from developing or worsening poor body image. These measures include community standards or guidelines that set out what is acceptable for users to post. In relation to body image harms, social media platforms regulate content relating to self-harm, suicide and eating disorders.191 For example, Instagram’s tool for reducing the availability of eating disorder content uses machine learning to automatically identify hashtags that are being used to share such content, and will either remove them or add a sensitive content screen before users view the related material.192 Social media platforms also have advertising policies with restrictions on certain goods and services that can be promoted, such as unsafe supplements, weight loss products and plans, and cosmetic procedures that should be restricted to over-18s.193 Adverts on Facebook and Instagram are also prohibited from containing ‘before’ and ‘after’ images.194 TikTok has also banned ads for fasting apps and weight loss supplements in an effort to protect users.195
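To make the Instagram mechanism described above concrete, the sketch below shows the general shape of a two-threshold hashtag-screening rule: a classifier scores each hashtag, and the score determines whether it is removed, placed behind a sensitive content screen, or allowed. This is a minimal illustration only; all names, scores and thresholds here are hypothetical, as the platforms have not published their actual models or cut-offs.

```python
# Illustrative sketch of a two-threshold hashtag-screening rule.
# All names and thresholds are hypothetical, for demonstration only.

from dataclasses import dataclass

@dataclass
class HashtagDecision:
    hashtag: str
    action: str  # "allow", "sensitive_screen", or "remove"

def score_hashtag(hashtag: str) -> float:
    """Stand-in for a trained classifier returning the probability that
    a hashtag is being used to share eating disorder content."""
    # A real system would use a machine-learned model; this placeholder
    # flags a few known terms purely for demonstration.
    known_harmful = {"thinspiration": 0.95, "bonespiration": 0.97}
    return known_harmful.get(hashtag.lower(), 0.05)

def moderate_hashtag(hashtag: str,
                     remove_threshold: float = 0.9,
                     screen_threshold: float = 0.6) -> HashtagDecision:
    """Apply the remove / sensitive-screen / allow decision rule."""
    score = score_hashtag(hashtag)
    if score >= remove_threshold:
        return HashtagDecision(hashtag, "remove")
    if score >= screen_threshold:
        return HashtagDecision(hashtag, "sensitive_screen")
    return HashtagDecision(hashtag, "allow")

print(moderate_hashtag("thinspiration"))  # -> remove
print(moderate_hashtag("gardening"))      # -> allow
```

The two-threshold design mirrors the policy distinction in the evidence we received: the most clearly harmful tags are removed outright, while borderline material is placed behind a warning screen rather than deleted.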

101. Despite the number of safeguarding and advertising policies social media companies have in place, we heard extensive evidence that there is a gap between those policies and users’ real-life experience. Our body image survey demonstrated how damaging social media use can be for an individual’s body image. Respondents told us that how they felt about their appearance was influenced by what they saw on social media, whether from ‘influencers’ or adverts. People reported struggling with their body image as a result of repeatedly looking at edited pictures, or at content and advertising which encouraged their bodies to look a certain way.196

102. Both adults and young people told us that they felt pressured to make changes to their bodies and their appearance due to persistent advertising, most commonly adverts promoting weight loss. People reported this despite curating their social media feeds to be ‘body positive’ and free from ‘diet culture’. This was also true for people who currently have, or have previously had, an eating disorder.197 We also received testimonials from the public highlighting the difficulty of avoiding adverts around ‘diet culture’ on social media.198

Evidence to our inquiry shows that, despite good intentions from social media platforms, their safeguarding and advertising policies are simply not protecting people from body image harms, and they need to do better. The Mental Health Foundation agreed that social media companies need to be more active in promoting good body image by generating positive exposure to a diverse range of body images, and in protecting against the promotion of unrealistic and unobtainable body ideals. Social media companies cannot claim to be passive in this: their provision of filters, their advertising guidelines and their algorithms all contribute to the promotion of unobtainable body ideals.199

Research

103. The Centre for Appearance Research and the Nuffield Council on Bioethics told us that a first step for social media companies in tackling body image pressures is to fund research so that they can understand the impact their platforms are having.200 Social media companies informed us of the research they had undertaken in partnership with organisations such as Beat, Samaritans, the Diana Award and Childnet.201 The Government acknowledged in its evidence to the Committee that it was important for social media companies to take more responsibility for the content on their platforms.202 The Committee heard from social media companies in December 2020, and they committed to working with the Government to do more research into the relationship between social media use and body image.203

Online Harms legislation

104. The full Government response to the consultation on the Online Harms White Paper204 was published in December 2020; the majority of the evidence we received on Online Harms was based on the Government’s initial response from earlier that year. The resulting legislation is expected this spring.205 The Online Harms legislation will set out a new regulatory framework establishing a duty of care on companies to improve the safety of their users online, overseen and enforced by Ofcom in its role as the independent regulator.206 As well as setting out how the Government will tackle illegal content and activity online, the upcoming legislation will also address increasing levels of public concern about online content which is lawful but potentially harmful, such as online bullying and abuse, the advocacy of self-harm, and misinformation.207 The legislation will apply to all companies whose services host user-generated content which can be accessed by users in the UK, including the social media companies from which the Committee has received evidence.

105. The Government has said that the legislation will set out a general definition of what can be considered a ‘harm’ online. A limited number of priority categories of harmful content, posing the greatest risk to users, will be set out in secondary legislation. The legislation will provide that online content and activity should be considered harmful, and therefore in scope of the regime, where it gives rise to a reasonably foreseeable risk of a significant adverse physical or psychological impact on individuals.208 Given the extensive evidence gathered in this inquiry, as well as in academic and Government research, it is clear that there is a link between social media use and poor body image. As such, the Committee is minded to recognise any online content that contributes to the proliferation of negative body image as a ‘harm’.

106. The evidence we received on Online Harms legislation and body dissatisfaction noted that body image was not explicitly listed as a ‘harm’ and so might ‘fall between the gaps’.209 Professor Widdows told us that the White Paper has far too little to say about ‘body image anxiety’ or ‘body dissatisfaction’ and the related harms.210 This is despite the psychological harms that we know can result from negative body image, such as disordered eating, damaged self-esteem, and young girls reporting that body image anxiety holds them back, stopping them from speaking up in class or engaging in physical activity.

107. Online abuse and bullying are mentioned throughout the White Paper, but appearance-based bullying is not mentioned anywhere, despite it being the most prevalent form of bullying.211 The Mental Health Foundation suggested that the promotion of images, products and games that present idealised body appearance, such as those that endorse diet products, or apps that encourage young people to ‘play’ at cosmetic procedures, should fall within scope of the legislation.212 Since a central objective of the Online Harms White Paper is to prevent harm to individuals arising on social media platforms, there is arguably potential for the legislation to tackle content that is likely to cause physical, mental or moral harm to individuals, including by adversely affecting body image.

108. We sought to understand how the legislation will function, and the roles that Parliament, Ofcom and UK Research and Innovation (UKRI) will play in identifying which online harms users need protection from. Oliver Dowden MP, Secretary of State for Digital, Culture, Media and Sport, appeared before the DCMS Committee and informed it that “the regulator and others” would give advice on specific harms as they are identified; Ministers would then recommend that they be added to the legislation via the statutory instrument procedure. This will ensure that Parliament has a say over what is and is not regarded as an ‘online harm’.213

109. The Nuffield Council on Bioethics told us that online harms legislation will be most effective if social media companies are fully engaged in the process, as platforms have the potential to deliver innovative solutions to body image harms encountered online, separately from the application of sanctions. It stated that social media companies must recognise their duty of care and should investigate positive and innovative ways of promoting healthy body image and protecting their users from body-image-related harm.214 The Centre for Appearance Research supported this, and we heard that proposals in the online harms consultation could be useful strategies to apply to body image; however, further consultation with experts and the scientific evidence base is necessary to ensure any proposals are specific and targeted enough to foster positive body image.215 It is important that Ofcom, as the regulator, works closely with UKRI to ensure support for identifying online harms, including, in the view of this Committee, online harms relating to appearance dissatisfaction.216

110. Changing Faces told us that the proposals in the Online Harms White Paper will only have the potential to protect people from body image harms caused by social media content if Ofcom commits to working closely with people with a visible difference to understand their experiences.217 We agree with this assessment and would like to see Ofcom work closely with the groups described throughout this Report who are at high risk of poor body image. More generally, for Ofcom to maximise its impact and effectiveness, its work must be user-driven throughout its design and practices; this could include bringing in people affected by negative body image for co-production workshops when creating processes such as a code of conduct.218

Age verification

111. There is currently no robust age verification process in use on social media platforms or in app stores.219 The British Board of Film Classification told us that the majority of social media platforms state that they require users to be at least 13 years old. However, a 2016 survey by the BBC found that more than three-quarters of children aged 10 to 12 in the UK have social media accounts, despite the notional age limit. In principle, existing age-verification solutions could be adapted, or new solutions developed, to verify that users are aged 13 or over at the point of registration.220 When we heard from social media companies in December 2020, they acknowledged this and emphasised that they are keen to engage on the issue of children under the age of 13 accessing their apps. They noted the potential to use app stores, or the devices from which apps are accessed, as an option for introducing some form of age verification, but highlighted that this issue needs international cooperation.221 The Mental Health Foundation also highlighted to us that image-editing apps, which, as described above, contribute to body image anxieties, are available in app stores and are often labelled as appropriate for people aged 4+ with no checks whatsoever. Apps such as ‘Body Editor’ and ‘Facetune 2’ have been downloaded over 10 million times, and both allow users to alter photos with features including whitening teeth, removing blemishes and pimples, and contouring faces.222

112. We questioned the Government on the provisions for age verification in the Online Harms legislation, which are intended to protect children from accessing content that is inappropriate for their age and which could cause negative body image. Minister Dinenage told us that age assurance is the umbrella term for technology that assesses a user’s age, and that age verification is the most stringent measure as it checks against officially provided data; less stringent measures are known as age estimation. The Minister informed us that the Government is legislating for companies within the scope of regulation to use age assurance or age verification technology to prevent children from accessing services that pose the highest risk of harm. The legislation acknowledges that suitable technology may not currently be available, but states that it must be used once that technology is more refined and easily accessible. Examples of age assurance technology include behavioural analytics (where the way someone types on a keyboard can indicate how old they are) as well as machine learning which estimates the age of users to check they are old enough to access the platform.223
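As an illustration of the ‘behavioural analytics’ approach the Minister described, the sketch below shows how typing-rhythm features might feed a crude adult/child decision. Everything here, the features, the cutoff and the decision rule, is hypothetical; real age-assurance systems rely on trained models and far richer signals.

```python
# Purely illustrative sketch of keystroke-based age estimation.
# The features, cutoff and decision rule are hypothetical.

from statistics import mean, stdev
from typing import List

def keystroke_features(inter_key_ms: List[float]) -> dict:
    """Summarise the gaps (in milliseconds) between successive keypresses."""
    return {
        "mean_gap": mean(inter_key_ms),
        "gap_variability": stdev(inter_key_ms),
    }

def estimate_is_adult(inter_key_ms: List[float],
                      mean_gap_cutoff: float = 180.0) -> bool:
    """Toy decision rule: treat fast, consistent typists as likely adults.
    A production system would use a trained model, not a single cutoff."""
    features = keystroke_features(inter_key_ms)
    return features["mean_gap"] < mean_gap_cutoff

# Example: timings recorded while a user typed a short phrase.
print(estimate_is_adult([120, 140, 110, 130, 125]))  # -> True (likely adult)
print(estimate_is_adult([260, 310, 240, 280, 300]))  # -> False (flag for checks)
```

In practice such a signal would be one input among many to a machine learning age-estimation model, which is why the Minister described these approaches as ‘age assurance’ rather than strict verification.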

113. The Committee was pleased to see some progress on the Government’s Online Harms legislation during our inquiry. We are of the view that any online content and activity that contributes to the proliferation of negative body image is a ‘harm’. The Online Harms Bill should be a legislative priority and the Government should inform us of its proposed timetable within two months. We recommend that harms related to body image and appearance-related bullying are included within the scope of the Online Harms legislation due to the foreseeable risk of a significant adverse physical or psychological impact on individuals who are at risk of developing negative body image.

114. Despite the number of controls in place on social media platforms, users continue to encounter content that, by the platforms’ own admission, should not be accessible. We recommend that the Government ensure that social media companies enforce their advertising rules and community guidelines, and introduce strong sanctions for failing to do so, including but not limited to significant fines.

115. We were pleased to hear that the Government recognises the impact social media can have on body image and that it is encouraging social media companies to take more responsibility for the content on their platforms. We are also pleased that social media companies are committed to working with the Government to do more research into the relationship between social media use and body image. We recommend that the Government work closely with social media companies and academics to ensure that research on social media use and body image is up to date, evidence-based, and sufficiently funded.

116. We welcome Ofcom’s role in regulating online harms and Parliament’s role in identifying harms. We recommend that the Government work closely with UKRI and Ofcom to ensure that online harms legislation sufficiently encompasses protections from harms caused by body image pressures. We also ask that the Government engage with social media companies on developing innovative solutions to protect users from body image harms encountered online, and that Ofcom work with groups at high risk of developing poor body image to ensure the new regulatory system works for them. We ask that the Government take this recommendation into account before the Online Harms Bill passes into law.

117. Young people are particularly at risk of developing poor body image, and access to social media and other online content is linked with negative feelings about appearance. We recommend that the Government ensures that any age verification or assurance processes used by online companies are effective and protect young people from harmful content. We ask the Government to respond to us within 12 months on how effectively age controls have restricted access to harmful content for young people.

187 Which?, Keto diet pill scam targets Facebook users, [accessed 18 March 2021]

195 TikTok for Business, Coming together to support body positivity on TikTok, 23 September 2020

204 Department for Digital, Culture, Media & Sport and Home Office, Consultation outcome: Online Harms White Paper, [accessed 18 March 2021]

206 Department for Digital, Culture, Media & Sport and Home Office, Online Harms White Paper: Full government response to the consultation, [accessed 18 March 2021]

207 Department for Digital, Culture, Media & Sport and Home Office, Online Harms White Paper: Full government response to the consultation, [accessed 18 March 2021]

208 Department for Digital, Culture, Media & Sport and Home Office, Online Harms White Paper: Full government response to the consultation, [accessed 18 March 2021]

213 Digital, Culture, Media and Sport Committee, Oral evidence: The work of the Department for Digital, Culture, Media and Sport, 14 October 2020, Q179

222 Mental Health Foundation, Image-editing apps and mental health, 2020




Published: 9 April 2021