Online abuse and the experience of disabled people

2 Neither considered nor consulted

Accessibility and inclusion

59. We heard that disabled people are marginalised not only because hostile language and imagery towards disabled people and disability are widely tolerated, but because disabled people themselves are ignored. Disabled people told us that they are often not consulted, or even considered, when policy or practice is developed.

Government

60. The Government response to the Internet Safety Strategy green paper, which “looks at how we can ensure Britain is the safest place in the world to be online”, was published in 2018.47 It almost entirely ignored the needs of disabled people, in both its creation and its content. In response to our questions about the consultation on the strategy, Margot James MP, Minister for Digital and the Creative Industries, told us that the consultation did not record whether respondents had a disability, but that “A roundtable will shortly be held with disability groups, along with social media companies, to discuss what more platforms can do to tackle online abuse.”48 That letter was dated 10 April 2018; the consultation had closed on 7 December 2017. As examples of consultation, the letter mentioned that “A link to request an accessible format was supplied” and that the Government had held roundtables with “teachers representing mainstream and specialist provision schools”. Neither of these represents consultation with disabled people—most disabled people in the UK are not school children who need teachers to speak for them. The Minister went on to say that “I understand therefore the need to ensure people with disabilities need a high level of protection against abuse online.”49 Disabled people were clear that they are not inherently vulnerable or in need of a “high level of protection”; they are adults asking for access to the same level of protection as other internet users. Although disability is included in a list of characteristics in both the green paper and the Government’s response, neither mentions the experiences of disabled people or any specific needs they may have.50

61. We heard that failing to consider or consult disabled people was sadly the norm rather than the exception. That is particularly worrying given that disabled people are statistically more likely to be unemployed, to live in poverty and to have left education early.51 The Government is bound by equalities legislation and commitments, including the Public Sector Equality Duty. However, we heard that dealing with inequality for disabled people often seems to extend only to thinking about physical or technological changes, such as screen readers. The point was repeatedly made to us that creating an inclusive environment for disabled people is not only a matter of providing alternative formats or making necessary and welcome changes to the physical environment, but also of ensuring that the toxic environment caused by abuse is tackled.52 If the Government does not adequately consider or consult disabled people in developing its Internet Safety Strategy, it is difficult to see how it can tackle the online abuse of disabled people.

Social media

62. Like the Government Minister, social media companies responded to questions about disabled adults with answers about children. For example, in response to questions about Easy Read terms and conditions, Karim Palant of Facebook responded:

Many of the programmes that we run—for example, the digital safety ambassadors programme, which we run with the Diana award and Childnet—work with children with disabilities and particular vulnerabilities. They operate in those contexts, but we believe we can do more in that space to provide extra guidance and support for those young people and for people supporting them. It is something that we’re actively looking to do.53

63. The social media companies we heard from admitted that they had not done enough to engage with disabled people. Karim Palant of Facebook told us:

I have to say that, certainly in the UK, we haven’t been as good as we could have been at dealing with disability NGOs specifically. A lot of the NGOs that we deal with will address and deal with disability issues, but it is not their main focus. We could do more work with those NGOs specifically to understand these issues a bit better.54

Nick Pickles of Twitter also told us that his company needed to do more:

I’m planning on going to the trust and safety council and asking them, “How could we hear more from groups that work with disabilities?” because it’s an audience that perhaps hasn’t had the same level of engagement as other areas.55

Listening to disabled people

64. A concern raised on numerous occasions by disabled people was an apparent unwillingness to engage directly with them. Disabled people told us that they were not hard for government, in particular, to reach.56 Many use multiple public services, so they can be easy to identify and approach. Many of those we spoke to felt that government and social media companies already had the data and the means to contact them—for example, through emails from the DWP or leaflets in doctors’ surgeries. The disabled people we heard from were well informed and eager to be heard. Many felt that they had been shut out of the conversation by groups who claimed to speak for them but were not led by them. As our consultation shows, there are multiple ways to hear directly from disabled people.

Box 21: Attendee at Swansea consultation event

We’re not hard to reach, only easy to ignore.

65. Our own consultation has shown that there are many disabled people around the country with the skills and experience to help the Government to consult properly in the future. The diverse views we heard reflect the diversity among disabled people, but almost all agreed that consultation must be with disabled people themselves, not intermediaries. Although the temptation might be to consult disability charities, the message we heard was that if disabled people are not in the room, they are not being consulted.57

66. Both the Government Minister and the social media companies responded to questions about disabled adults with answers about children. This is sadly evidence of the problem that disabled people repeatedly described to us: they are not considered capable of controlling or understanding their own lives. Disabled people are not inherently vulnerable or in need of a “high level of protection”; they are adults asking for access to the same level of protection as other internet users.

67. Disabled people are not hard to reach, only easy to ignore. We recommend that the Government include disabled people explicitly and directly in all consultations, including on digital strategy. If disabled people are not in the room, they are not being consulted. All consultations must be accessible to, and directly involve, disabled people, including people with physical, neurological, developmental, sensory and learning disabilities. We recommend that the Government report to Parliament on how it has consulted disabled people and what changes that consultation has led to. We recommend that the Government set out in its response to this report how, and how often, it will make such reports to Parliament.

Accessibility of social media policies and reporting mechanisms

68. Disabled people are powerful advocates for the benefits of social media. It is mystifying that social media companies have failed to recognise the benefits of engaging fully with disabled people. From the widespread sharing of abusive images and messages to Twitter only recently adding “disability” to its list of reasons why someone may be targeted for abuse, we have been told that social media companies have overseen a toxic environment for disabled people. While we heard that the industry was taking online safety and abuse more seriously, it was clear from our witnesses that these were recent developments. Nick Pickles of Twitter told us:

I think it is fair to say that as an industry, we have stepped up our efforts on safety more broadly in recent years. [ … ] I think that this hearing is highlighting an important area where we can do more, and where perhaps the response you see in other areas hasn’t been mirrored fully in regards to this. Certainly we are looking to do more. You may be aware that in April this year, we changed our reporting function to explicitly call it out. Disability was covered under our behaviour and conduct policy. That was based on feedback from groups of disabled people. So there is more to do, and we are starting to make progress where we can.58

Karim Palant of Facebook told us:

It is a new step on Instagram. For some of the most egregious, really personally degrading comments attacking people’s images—direct attacks on somebody’s appearance, for example—we are starting to filter out the bullying comments, as we call them.59

Katie O’Donovan, Public Policy Manager, Google, told us:

Most recently, we’ve also made a video that describes what happens to anyone who flags content—what happens on that journey—to help make that a little bit more accessible.60

From what disabled people told us, it was clear that there is much further to go.

69. We had the pleasure of meeting the Royal Mencap Society digital champions. They were self-assured and well-informed internet users. Although many of them had experienced online abuse, they were confident that they knew how to report negative experiences and had strategies to keep themselves safe online.61

Box 22: Participant at Royal Mencap Society’s digital champions roundtable

Some people kept adding me as a friend. They looked like fake accounts. I reported it. Facebook said thank you and their accounts were deleted. I can understand the system, but I worry about other people with learning disabilities.

70. However, it was clear from our consultation that many people do not know what to do when they feel unsafe online. It was worrying how many people with learning disabilities and neurological or developmental impairments told us that the first thing they would do if they felt worried online was dial 999.62 Some behaviour, such as threats and harassment, may cross the line into criminal behaviour that requires police involvement. However, many concerns about abusive behaviour online are more appropriately dealt with by the social media platform itself and do not require an emergency police response. The police service should not bear the costs of social media companies’ failure to communicate effectively with their users. We look at this point in more detail in the subchapter Rules and the law below.

71. We welcome the different ways that responsible social media companies have tried to engage their users, particularly with simple “how to” videos. However, what we heard from disabled people demonstrates that it is not enough. At our London consultation event, Karim Palant, UK Public Policy Manager, Facebook, shared his view that the simple explanatory videos Facebook used were sufficient to explain Facebook policies to adults with learning disabilities, and that Easy Read versions were therefore unnecessary. The disabled people attending the event strongly disagreed.63 In written evidence, Google told us:

We have examined the ‘Easy Read Guidelines’ and do believe our community guidelines, including the use of short sentences, pictures and videos fit with these standards.64

We put this statement to representatives from Dimensions and Mencap, who both disagreed. Dimensions told us:

Many people with learning disabilities will find the community guidelines, which do not conform to established good practice in easyread, difficult to understand. For example, there is too much complicated information about each section. Some of the words are hard to understand. And the current use of imagery does not help an easyread user to contextualise each individual point.65

Mencap told us in reference to social media terms and conditions in general:

The examples we have seen would leave many people with a learning disability struggling to understand them due to complicated words, jargon and abstract language.66

72. When asked about making terms and conditions more accessible, Nick Pickles of Twitter told us:

One of the biggest challenges we have is that sometimes simplifying our policies makes them harder to understand. So there is a tension between adding more detail, so that people can understand, and making it simpler.67

However, Easy Read versions of complex documents are regularly produced, including Select Committee reports,68 NHS consultations69 and tenancy agreements.70

73. The language used in policies, rules and guidance was not the only concern we heard. What the policies are called was also seen as a problem. We were told that people find it difficult to understand what to look for when seeking guidance, and difficult to know the status of different policies. Behaviour policies go by different names on different sites. For Facebook, “Community standards” explain the limits of acceptable behaviour.71 Twitter has a “Hateful conduct policy”.72 LinkedIn has “LinkedIn Professional Community Policies”.73 Instagram has “Learn how to address abuse” in its “Privacy and Safety Centre”.74

74. Google, Facebook and Twitter also told us about their work to make it easier to report harmful, extreme or abusive content, with Twitter, for example, reducing the process from fifteen clicks to five.75 However, there has not been enough focus on making reporting more accessible for disabled people. Disabled people are a diverse group with diverse requirements. Many of the disabled people we spoke to told us that reporting mechanisms were still difficult to understand and not accessible enough. We were also told that it is difficult for some people, particularly those with learning disabilities, to recognise when content is unacceptable and to know how and where to report it. We also heard concerns about what happened to abusive content, particularly content that may cross the line into criminal activity. We heard repeated complaints that people were not updated on whether a user had been warned or punished for their behaviour or whether content had been removed.

Box 23: Respondent to online survey

While laws against cyberbullying need to be tightened, I think it is much more important at this stage to force social media platforms to actually apply their T&Cs in practice. Even when hate speech is reported, nothing is done in most cases, because social media providers do not want to spend money on employing enough people to deal with thousands of reported comments every day.

Rules and the law

75. We heard that many people think that social media companies, not the criminal justice system, control and police online spaces. We also heard that users, and adults with learning disabilities in particular, find it difficult to judge whether something they see online should be a police matter or a matter for the social media company. It is not the role of social media companies to enforce or interpret the law, and we look at the difficulties with the law on online abuse in chapter 3. However, social media companies are responsible for some of the confusion their users feel. Social media companies do not, on the whole, distinguish between company policy and the criminal law. The result is confusion over where responsibility lies and where to go with concerns. For example, Twitter’s rules76 state:

You may not make specific threats of violence or wish for the serious physical harm, death, or disease of an individual or group of people.

The potential punishment is described thus:

Accounts found to be posting violent threats will be permanently suspended.

However:

Given the severity of this penalty, rare exceptions for permanent suspension may be made, based on a limited number of factors. In such a situation, the account will still be required to remove the violating Tweet.

There is no mention that making threats to kill is a serious criminal offence in the UK with a maximum penalty of 10 years in prison.

76. Conflating breaking terms and conditions with serious criminal behaviour only adds to the confusion about what is acceptable behaviour online. In written evidence, Dr Loretta Trickett suggested that people often believe that online behaviour is dealt with by the terms and conditions of social media sites and is therefore outside the criminal law.77 It is not acceptable for social media companies to allow that perception to stand.

77. Ensuring that terms and conditions and reporting mechanisms are compatible with specialist equipment, such as screen readers, screen magnifiers and refreshable Braille displays, was raised in evidence. However, accessibility is not only about welcome physical changes and compatibility with assistive technology; it also requires guidance and terms and conditions that people with learning disabilities can understand. The vast majority of adults with learning disabilities we spoke to wanted policies and guidance in an Easy Read format. When we suggested short films, people reacted positively, but were clear that such films should be in addition to, not instead of, Easy Read versions.78 If adults with learning disabilities are to take a full and active part in public life, they need to be online and able to make informed decisions. Disabled people have diverse needs and experiences; establishing what accessibility looks like will require genuine consultation.

Privacy and photo sharing

78. The use of photos of disabled people, particularly disabled children, to create “jokes” seems to be a form of abuse to which disabled people are disproportionately subjected.79 At our face-to-face events we heard that people were afraid that the photos they posted on their social media accounts would be copied and used for this purpose. Although we expect the law to be reviewed to consider whether creating and sharing such “jokes” needs to be taken more seriously as a form of abuse, there is more that social media companies can do now. Given what we have heard about the use of images of disabled people, it is essential that Facebook and similar platforms make clear, including in an Easy Read format, what “sharing” images means for a person’s ability to control how that image is used. They must review privacy settings to ensure they are fully accessible to disabled people. They must ensure that moderators recognise this use of images as abusive behaviour.

79. Disabled people need to be able to manage their settings, report abusive content and see action taken, and make informed decisions about how they use social media. All policies must be fully accessible, including to those with learning disabilities. We were told that social media companies felt that simplifying policies and legal documents could cause greater confusion and potentially lead to needlessly complex explanations of what is and is not acceptable behaviour online. However, Easy Read versions of complex documents are regularly produced, including Select Committee reports, NHS consultations and legal contracts, such as tenancy agreements. In our inquiry, we have met experienced disabled experts ready and willing to assist with such work. We believe that appropriate consultation with disabled people will help social media companies overcome this perceived problem.

80. We recommend that the Government require social media companies to have policies, mechanisms and settings that are accessible to all disabled people. That must include Easy Read versions of all relevant policies. Policies may include, but are not limited to:

Mechanisms and settings may include, but are not limited to:

81. To ensure that the particular concerns of disabled people are recognised, we recommend that social media companies be required to demonstrate that they have consulted and worked in partnership with disabled people themselves when developing policies and processes.

82. The rules for social media platforms should be easy to identify, find and understand. It should be clear what behaviour is offensive and how to report abuse. It is unacceptable that police services are bearing the costs of social media companies’ failure to communicate the difference between unacceptable behaviour and criminal behaviour and how to report abuse appropriately. We recommend that social media companies be required to be more proactive, not only in searching for abusive and extreme content, but in ensuring their users understand the limits of acceptable behaviour, including the use of images and hashtags, and in actively reporting potentially criminal behaviour. We recommend that this cover the use of images of disabled people, particularly disabled children, to create “jokes”.

Regulating social media?

83. Multiple Select Committee inquiries have examined concerns about how social media companies operate.80 The United Nations has named Facebook as bearing responsibility for hatred incited against the Rohingya Muslim minority in Myanmar.81 Facebook has also been repeatedly criticised for hosting videos and images of child sexual abuse and violence against children.82 Whether the issue is data sharing and fake news, or extreme abusive content and alleged complicity in the spread of violent hatred, the impression is of repeated problems from data misuse, harmful content and hate speech. Social media giants seem to wait to see where the next outcry will arise before turning only a small proportion of their huge revenues to tackling that problem. That is not responsible self-regulation.

84. When talking about the rise in “internet-based hostility”, Paul Giannasi, cross-government hate crime programme manager, stated:

Part of that came about because of the ease of being anonymous in that sphere, and part of it was the lack of editorial control. Before that, if I wanted to post on your website, I needed you to approve it or to give me a password to do it, or I had to have my own space, whereas Web 2.0, through social media, allowed instant interaction and it changed everything.83

Anonymity and lack of editorial control are business decisions, not necessary features of the technology. We should not accept that the online space is inevitably more dangerous and abusive than the offline space.

85. This inquiry was never intended to look into the different potential models for regulating social media, but it is obvious that the current model has failed disabled people. Participants in our events were clear that something has to change. We heard suggestions ranging from using anti-social behaviour orders to prohibit home internet connections to prosecuting internet companies under joint enterprise laws. The Government needs to be realistic about how much can be achieved without formal controls. Technology moves at such a speed that agreements with companies that are currently popular quickly become meaningless as users move on to other platforms.

86. The Government must accept its responsibility for ensuring disabled people’s safety online. We recommend that the Government acknowledge that the current model of self-regulation of social media has failed—and is still failing—disabled people. We recommend that it take steps to ensure that social media companies accept their responsibility for allowing illegal and abusive content on their sites and for the toxic environment this creates for users. We recommend that the Government ensure that social media companies accept their responsibility to make sure that disabled people can make use of online tools as other users can.


47 Department for Digital, Culture, Media and Sport, “Internet Safety Strategy green paper”, 11 October 2017, last updated 22 May 2018

48 Department for Digital, Culture, Media and Sport (ONL0006)

49 Department for Digital, Culture, Media and Sport (ONL0006)

50 Department for Digital, Culture, Media and Sport (ONL0006)

51 Equality and Human Rights Commission, “Being disabled in Britain: A journey less equal”, April 2017, p 65

52 Summary of online engagement and consultation events

56 Summary of consultation events

57 Summary of consultation events

61 Summary of roundtable meeting with Mencap digital champions

62 Summary of consultation events

63 Summary of consultation events

64 Google UK (ONL0010)

65 Dimensions Supplementary Written Evidence

66 Mencap Supplementary Written Evidence

68 Women and Equalities Committee, Disability and the Built Environment, Ninth Report of Session 2017–19

70 Dimensions, Housing

77 Dr Loretta Trickett and Nottingham Civic Exchange (ONL0007)

78 Summary of consultation events and roundtable with Mencap’s digital champions

79 “People With Disabilities Are Having Their Photos Stolen And Facebook Isn’t Helping Them”, BuzzFeed News, 23 February 2017, last updated 24 February 2017

80 Home Affairs Committee, Fourteenth Report of Session 2016–17, Hate Crime: abuse, hate and extremism online, HC 609
Digital, Culture, Media and Sport Committee, Fake News inquiry
House of Lords Communications Committee, The Internet: to regulate or not to regulate? inquiry

81 Digital, Culture, Media and Sport Committee, Disinformation and “fake news”: Interim Report, Fifth Report of Session 2017–19, para 27




Published: 22 January 2019