Culture, Media and Sport Committee

Written evidence submitted by Facebook

Summary

1. This submission sets out Facebook’s policies and actions around safety to help inform the Committee’s inquiry into these matters.

2. Facebook’s mission is to make the world more open and connected and to give people the power to share. Facebook is a global community of more than 1.15 billion people and hundreds of thousands of organizations. Each person and organization that uses Facebook represents unique opinions, ideals and cultural values. With this immense diversity, we work to foster a safe and open environment where everyone can freely discuss issues and express their views, while respecting the rights of others.

3. The policies Facebook has adopted are designed to reflect real world interactions. While ignorance still exists both on and off Facebook, we believe that ignorance will not be defeated by covering up its existence, but rather by confronting it head on.

4. We have learned that requiring people to engage in conversations and share their views using their real names and identities promotes an environment of accountability, where contributors must take responsibility for their own thoughts and actions.

5. Facebook’s detailed Statement of Rights and Responsibilities (“SRR”) describes the content and behaviour that is and is not permitted on our service. With respect to safety, our SRR specifically prohibits the following types of behaviours:

(a) Bullying, intimidating, or harassing any user.

(b) Posting content that: is hate speech, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence.

(c) Using Facebook to do anything unlawful, misleading, malicious, or discriminatory.

6. Further, Facebook encourages people to report content that they believe violates our terms. We have “Report” buttons on every piece of content on our site. When we receive a report, a dedicated team of professionals investigates the content in question. If the content is found to violate our terms, we remove it; if it does not, it remains on the site. Where necessary, we also take further action, such as disabling entire accounts (eg those of trolls) or unpublishing Pages.

7. We want everyone on Facebook to feel well-equipped to keep themselves safe when using the service. In particular, we have focused on educating our teenage users, their parents and teachers. We understand that younger users warrant additional protection, and we work in partnership with external organisations to educate these audiences about our safety tools.

How we Combat Child Sexual Exploitation

8. Facebook has a zero tolerance policy for child exploitation and abuse and we fight against these activities aggressively. We employ advanced technology to protect minors on our site, and we deploy innovative, industry-leading measures to prevent the dissemination of child exploitation material. We have also built complex technical systems that either block the creation of this content altogether, including in private groups, or flag it for immediate review by our safety team.

9. For instance, in collaboration with Microsoft and the National Center for Missing and Exploited Children (“NCMEC”), we utilize a technology called PhotoDNA that allows us to instantaneously identify, remove and report known child exploitation images to NCMEC. PhotoDNA is a game-changing technology that has helped enormously in preventing the sharing of abusive materials on Facebook and other services. Once a piece of content is reported to NCMEC, NCMEC coordinates with law enforcement authorities around the world to investigate and prosecute the people who create and share these images online.
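Conceptually, this kind of matching works by computing a robust fingerprint (“hash”) of each uploaded image and comparing it against a database of hashes of known abusive images. The sketch below illustrates the general idea in Python using a simple average-hash; PhotoDNA’s actual algorithm is proprietary and far more resistant to image alterations, and the function and database names here are illustrative assumptions only.

    # Minimal sketch of hash-based known-image matching. This is NOT
    # PhotoDNA, which is proprietary; it only illustrates the concept.
    from PIL import Image

    def average_hash(path, size=8):
        """Compute a simple 64-bit average hash of an image."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > avg else 0)
        return bits

    # Hypothetical database of hashes of known abusive images, of the
    # kind maintained in practice by bodies such as NCMEC.
    KNOWN_HASHES = set()

    def matches_known_image(path, max_distance=5):
        """Flag an upload whose hash is within a small Hamming
        distance of any known hash."""
        h = average_hash(path)
        return any(bin(h ^ known).count("1") <= max_distance
                   for known in KNOWN_HASHES)

A match would then trigger removal and a report to NCMEC, as described above.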

10. In the rare instance where child exploitation content is reported or identified by our users on Facebook (ie, where the content is a new image of abuse that is not already in our database of known images), we similarly take it down as soon as possible and report it to NCMEC.

11. All related reports for the UK are referred to the Child Exploitation and Online Protection Centre (“CEOP”). CEOP is then able to evaluate each case and, where appropriate, engage local police forces for further investigation and subsequent prosecution.

12. Additionally, we provide a dedicated escalation channel for the Internet Watch Foundation (“IWF”) and other Internet hotlines to inform us of illegal images being shared on Facebook. It is a real testament to the effectiveness of our countermeasures, and particularly PhotoDNA, that very few reports of such content are received from these hotlines in any 12-month period.

13. We also work hard to identify, investigate and address grooming behaviour—direct communication by an adult with a minor with the objective of illegal sexual contact. We work closely with law enforcement, external partners, expert academics and the Facebook community itself, to spot grooming and to combat it.

How we Keep People Safe

Handling reports

14. We have a comprehensive and well-resourced User Operations (“UO”) team that services all of our users twenty-four hours each day, seven days a week. This team handles every report that is submitted to Facebook (including by people who do not have a Facebook account). Hundreds of employees work on the User Operations team, which is located in four offices across the globe, namely Hyderabad (India), Dublin (Ireland), Austin (US) and Menlo Park (US). They handle reports in over twenty-four languages and cover every time zone. Structuring the teams in this manner allows us to maintain constant coverage of our support queues for all our users, no matter where they are in the world. 

15. In order to effectively review such reports, UO is separated into four specific teams, which review different report types—(1) the Safety team, (2) the Hate and Harassment team, (3) the Access team, and (4) the Abusive Content team. When a person reports a piece of content, depending on the nature of the report, it will be directed to the appropriate team. For example, if you are reporting content that you believe contains graphic violence, the Safety Team will receive and assess the report.
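As a purely illustrative sketch of this kind of routing, a report tagged with a category can be dispatched to the queue of the relevant team. The category names below mirror the four teams described above, but the mapping and code are assumptions, not Facebook’s actual implementation.

    # Hypothetical sketch of routing user reports to review teams.
    REPORT_ROUTING = {
        "graphic_violence": "Safety",
        "self_harm": "Safety",
        "hate_speech": "Hate and Harassment",
        "bullying": "Hate and Harassment",
        "hacked_account": "Access",
        "spam": "Abusive Content",
    }

    def route_report(report_type):
        """Return the review team for a report, falling back to a
        general queue for unrecognised report types."""
        return REPORT_ROUTING.get(report_type, "General Review")

    # Example: a graphic violence report is assessed by the Safety team.
    assert route_report("graphic_violence") == "Safety"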

16. If one of these teams determines that a reported piece of content violates our policies or our SRR, we will remove it and warn the person who posted it. In addition, we may revoke a user’s ability to share particular types of content or use certain features, disable a user’s account or, if need be, refer issues to law enforcement. We also have special teams dedicated to handling user appeals in instances where users feel Facebook might have made a mistake in taking a specific action. We recently published an infographic showing the processes involved in Facebook’s user operations for handling reports.1

17. Further, we provide all users with a tool, the “Support Dashboard”, which is designed to give users much better visibility and insight into the reports they make on Facebook. The Support Dashboard enables people to track their submitted reports and informs them of the actions taken by our review team. We think this helps people better understand the reporting process and educates them about how to resolve their issues in the future. As people see which of their reports result in the removal of content, we believe they will be better equipped to make actionable reports. We posted about the Support Dashboard on our Safety Page when we unveiled it in April 2012.2

Keeping young people safe

18. We offer educational materials through our Family Safety Centre,3 which provides information for anyone interested in keeping children safe online. The Centre includes specific guidance for teenagers and also provides information about our global Safety Advisory Board.

19. Our Safety Advisory Board comprises five leading Internet safety organizations from North America and Europe. These organizations serve in a consultative capacity to Facebook on issues related specifically to online safety. The members of our Safety Advisory Board are the following:

Childnet International.

National Network to End Domestic Violence.

Connect Safely.

The Family Online Safety Institute.

WiredSafety.

20. We maintain a Facebook Safety Page,4 which has been “liked” by over one million people. All these fans can therefore see the latest information on safety education directly in their Facebook News Feeds. We regularly post information, tips and articles about safety on Facebook and highlight debates on the topic of digital citizenship, as well as links to useful content from third-party experts.

Under 13s

21. Like many other online services, Facebook requires users to be at least 13 to sign up for an account; this requirement is set out in our SRR. We therefore require all users to provide their date of birth when signing up for Facebook, to help ensure that people under 13 are not able to open an account.
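A minimal sketch of such a date-of-birth gate is shown below. The exact signup logic is Facebook’s own and not public, so the constant and function here are illustrative assumptions.

    # Hypothetical sketch of a date-of-birth age gate at signup.
    from datetime import date

    MINIMUM_AGE = 13  # minimum age required by the SRR

    def is_old_enough(dob, today=None):
        """Return True if a person born on `dob` has turned
        MINIMUM_AGE by `today`."""
        today = today or date.today()
        age = today.year - dob.year - (
            (today.month, today.day) < (dob.month, dob.day))
        return age >= MINIMUM_AGE

    # Example: a 12-year-old applicant would be rejected.
    assert not is_old_enough(date(2001, 6, 1), today=date(2013, 9, 1))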

22. However, we are well aware of studies in the UK and elsewhere showing that many under-13s falsify their ages to open accounts, often with their parents’ knowledge and even their help. There is no fail-safe way of addressing the issue of under-age users, and we continue to explore the best approaches to the issue with policymakers, our Safety Advisory Board and our safety partners.

UK partnerships and schools

23. We work with a range of expert partners in the UK who go into schools to provide training and guidance for teachers, pupils and parents. We take a partnership approach because we have found it to be particularly effective. Organisations like Childnet, the South West Grid for Learning, the Diana Award (Anti-Bullying), Childline and Parentzone are trusted safety and advice brands that are able to help schools and parents navigate the great variety of online services used by children, including Facebook: from online games to mobile devices, and from Twitter to Snapchat and Ask.fm.

24. We particularly support the work of Childnet and the South West Grid for Learning, which run the InSafe hotline for professionals in education and work directly in schools. In recent years, we have funded the production of thousands of Facebook guide booklets, which these organizations distribute widely. In November 2012 we launched an anti-bullying initiative in partnership with Childline, which encouraged bystanders to take action when they saw others being bullied.

25. We participate in and support the annual “Safer Internet Day” run by the UK Safer Internet Centre, which comprises these two organisations and the IWF.

26. Facebook was the principal sponsor of The Festival of Education at Wellington College in June 2013. The Festival is the biggest gathering of its kind in the UK, with over 2,500 delegates attending. Over 500 different schools, colleges and universities were represented at the Festival. In addition to these delegates, more than 600 pupils from over 50 schools took part. At the Festival, the Education Foundation partnered with us to launch an updated Facebook Guide for Educators.5 This guide provides up-to-date advice on privacy and safety topics, as well as a general introduction to Facebook. The guide features two pilot programmes in which Facebook has been used for learning in two different schools, Wellington College and the Nautical School in London. A short film summarising the work was premiered at the Festival (see link on the Education Foundation website). A copy of the Facebook Guide was provided to every attendee at the Festival and is also available to download for free from the Education Foundation website.

27. We will continue to partner with these and other relevant organisations over the coming months and years to help equip young people, teachers and parents with the best information and tools they need to promote online safety and healthy digital citizenship.

September 2013

1 https://www.facebook.com/notes/facebook-safety/what-happens-after-you-click-report/432670926753695

2 https://www.facebook.com/notes/facebook-safety/more-transparency-in-reporting/397890383565083

3 http://www.facebook.com/safety

4 http://www.facebook.com/fbsafety

5 http://www.ednfoundation.org/2013/06/21/facebook-guide-for-educators/
