Digital Economy Bill




Digital Policy Alliance

The Digital Policy Alliance (DPA, formerly EURIM) is the independent, not‐for‐profit, politically neutral, cross‐party policy voice of the internet and technology sector. The DPA alerts EU and UK Parliamentarians and policy makers to the potential impacts, implications, and unintended consequences of policies that interact with and leverage online and digital technologies.

As a broad‐based membership organisation, informing policy for a competitive, inclusive, networked society, we focus on digitally‐related issues affecting the regulation of industry and consumer experience. Our working groups produce plain language briefings for EU and UK Parliamentarians, Commissioners, Ministers, civil servants and industry members, and we promote our findings to policy decision‐makers and influencers through a programme of targeted events.

The DPA is committed to supporting Her Majesty’s Government, and the Department for Culture, Media and Sport, in bringing forward the provisions related to age verification in the Digital Economy Bill. Moreover, the DPA wants to ensure that the Bill will be effective in practice, and will deliver the Conservative Party Manifesto pledge with tangible results. To this end, we have identified a number of loopholes in the Bill as currently drafted, and we suggest some solutions as to how these may be tightened up so that child safety online may be demonstrably improved.

Notes for Suggested Amendments to Part 3 (Child Safety Online: Age Verification for Pornography)


The changes below are recommended to ensure:

- An enforceable law;

- An effective law;

- A law with minimal loopholes;

- A law which matches the expectations of the electorate;

- A law which reflects the Conservative Manifesto Pledge;

- A law which robustly protects children.

We note that some of these concerns have already been raised by MPs during the second reading of the Bill.

1) Scope Loophole

a. The law should apply to all pornographic material made available on the internet, not just that which is made available on a commercial basis.

i. The Conservative Manifesto pledge states "All pornographic material"; the Bill waters down that pledge.

ii. Sites which may fall outside the scope as it currently stands:

1. Social Media platforms – e.g. Twitter is awash with pornographic images and videos.

2. Dating sites – Pornographic images are currently available without age verification (AV) controls. The pornography itself is not being made commercially available.

3. Forums – There are many discussion forums where users post and share pornographic images; these are not made commercially available.

4. Free tubesites – A good lawyer could possibly argue that free tubesites making pornography available are not doing so on a commercial basis.

iii. Such is the extent of pornography provided "non-commercially" that when the media inevitably test the law, pornographic material will be found just as easily as it is now.

b. What would the electorate think? – The government is reneging on its Manifesto pledge, watering down the law, leaving known loopholes, and lacks the will or courage to tackle the real problem. Most importantly, it leaves children unprotected.

c. Proposed Amendments:

i. Remove "on a commercial basis" from S15(1), S15(2), S19(3), and S23(1)(a).

ii. Remove "which is operated on a commercial basis" from S15(2)(a).

iii. Remove "which is operated or provided on a commercial basis" from S15(2)(b).

iv. Remove "as operated or provided on a commercial basis" from S15(3)(b).

v. Remove "as done on a commercial basis" from S13(3)(c).

2) Enforcement Loophole

a. Non-compliant sites must be blocked at ISP level: without such enforcement they will continue to flourish.

i. The regulator must be given the necessary tools with a mandate to enforce the law.

ii. The power to block non-compliant sites is fundamental to child protection.

iii. A substantial number of sites hosting pornography will not respond to financial penalties, if they can be contacted at all.

iv. Many sites hide their contact details behind anonymous registration services, so their ownership and contact information are not listed on the WHOIS database (the publicly available database recording which person or company owns a particular site/domain).

v. ISP blocking is already used for Child Sex Abuse Images, for enforcing Intellectual Property Rights (e.g. geo-blocking of copyrighted material), and for conditions of service (e.g. usage caps).

vi. The Bill proposes that Ancillary Service Providers be only asked to withdraw their services; what if they decline? What power will the regulator then have to enforce the law?

vii. Non-compliant sites could take payments via anonymous currencies such as Bitcoin, which no payment-services provider can withdraw.

b. What would the electorate think? - It would be a very poor message to send that the government prioritises Intellectual Property Rights over child protection, and is not serious about enforcement.

c. Proposed amendments:

i. Either amend S22(1) to read "it may give notice of that fact to any payment-service provider, ancillary service provider, or internet service provider".

ii. Or include "internet service provider" in the definition of "ancillary service provider" (see point 3 below).

3) There is no clear definition of "Ancillary Service Provider"

a. The Bill should clearly define "Ancillary Service Provider" and use this definition to capture all sectors involved in distributing pornographic material, giving greater power to the enforcement agency.

i. The Bill allows the withdrawal of Ancillary Services, in addition to Payment Service Providers, from non-compliant websites.

ii. Would "Ancillary Service Provider" include:

1. Social Media platforms used to advertise pornographic material?

2. Domain name registrars such as Nominet?

3. Internet Service Providers providing connectivity to the sites?

4. Search engines, such as Google, linking to the sites?

b. Proposed amendment: Add to S22(6) "For the avoidance of doubt, ‘ancillary service provider’ includes but is not limited to: domain name registrars, social media platforms, internet service providers, and search engines."

4) Speed of Enforcement Loophole

a. The enforcement timeline is not defined within the Bill. We believe it is such an important aspect of the Bill's success that it should be clearly defined.

b. The time from identifying a non-compliant site to enforcement action taken against it must be swift. The sequence identification > issue of notice > deadline > block (pending appeal) should be a matter of days, not weeks or months.

i. Non-compliant sites will use the time gained in the enforcement process to move their site to another domain, starting the process again.

ii. Non-compliant sites must not be permitted to continue flouting the law, whilst compliant sites are penalised.

iii. Any delay between identification of a site and enforcement against it leaves children unprotected.

c. Once the law achieves Royal Assent, existing sites should be given 6-12 months to comply. Notices can be issued by the regulator within that timeframe. This acts as a long notice period for the first wave of enforcement and will attract worldwide awareness. It also gives a clear enforcement date, leaving no excuse for non-compliance on the first day of enforcement.

d. What would the electorate think? – Not enough resources are being put into enforcing the law. Does the government really care about protecting children?

e. Proposed Amendments:

i. Amend S20(8)(b) to read: "during the ‘Initial determination period’ fix the date for ending the contravention of section 15(1), as the initial enforcement date".

ii. Add S20(8)(c) to read "after the ‘Initial determination period’ fix a period of 1 week for ending the contravention of section 15(1)".

iii. Add S20(14) to read: "The ‘Initial determination period’ is a period of 12 months from the date of the law achieving Royal Assent to the initial enforcement date".

5) Inconsistency with other media standards

a. The Bill should not create a divergent standard for the regulation of adult content, nor redefine "pornography" as it is currently defined in other regulatory codes.

i. Establishing a new standard for online content goes against the DCMS strategy paper from July 2013, which called for a "Common Framework for Media Standards".

ii. A key reason for Ofcom to subsume ATVOD was to ensure a common standard across platforms, aligning on-demand with linear broadcasting.

iii. Inconsistencies:

1. The Bill creates a new definition of "Pornographic material" – i.e. it covers 18-rated material, R18-rated material, and any material which is "produced solely or principally for the purposes of sexual arousal".

2. Currently the Ofcom Broadcasting Code and the ASA BCAP Code allow material to be broadcast free-to-air after 10pm for the purpose of sexual arousal (e.g. the so-called "babe channels", such as "Babestation").

3. Customers are not currently permitted to purchase R18-rated DVDs via mail order, but 18-rated DVDs are permitted.

iv. Either the Bill should apply only to R18 material, or it should be expanded to apply to other platforms, or the Codes pertaining to other platforms should be rewritten to apply the new definition of "pornographic material", thereby requiring age verification to access adult content on TV.

b. What would the electorate think? – Without any age checking, free-to-air TV is permitted to be more explicit than the internet.

c. Proposed amendments: Remove all instances of "internet" and "online" from Part 3 of the Bill.

6) The On Demand Programme Service (ODPS) Loophole

a. The Bill should not exclude sites already registered as ODPSs via ATVOD/Ofcom.

b. Section 15(5)(a) appears to exclude ODPSs. This means that ODPSs will be subject to a much looser AV requirement.

i. ODPSs (such as UK based pornography sites) are currently only required to add AV controls to R18 content.

ii. The Bill requires AV controls added to 18, R18, and any material which is "produced solely or principally for the purposes of sexual arousal".

iii. This could result in overseas sites circumventing the Bill by registering in the UK as an ODPS in order to show 18-rated content ahead of their AV controls.

c. What would the electorate think? – The Bill creates confusion about what children should be protected from. Surely there should be a single content standard defined within the Bill.

d. Proposed Amendment: Remove S15(5)(a) and S15(6).

October 2016


Prepared 18th October 2016