HC 772 Defence Committee
Written evidence from Research Councils UK (RCUK)

1. Research Councils UK is a strategic partnership set up to champion research supported by the seven UK Research Councils. RCUK was established in 2002 to enable the Councils to work together more effectively to enhance the overall impact and effectiveness of their research, training and innovation activities, contributing to the delivery of the Government’s objectives for science and innovation. Further details are available at www.rcuk.ac.uk.

2. This evidence is submitted by RCUK and represents its independent views. It does not include, nor necessarily reflect, the views of the Knowledge and Innovation Group in the Department for Business, Innovation and Skills (BIS). The submission is made on behalf of the following Councils:

Engineering and Physical Sciences Research Council (EPSRC).

Economic and Social Research Council (ESRC).

Natural Environment Research Council (NERC).

3. NERC’s contributions come primarily from the British Antarctic Survey (BAS) and the NERC National Centre for Atmospheric Science (NCAS).

Nomenclature
Defining the Terms RPAS, UAS and “Drone”; More than just a Flying Drone

4. “Drone” is a term often associated with 1960s, and earlier, platforms or their use in military applications. The most commonly used term in scientific research and other civil applications is UAS, or unmanned aerial system. Remotely piloted aircraft systems (RPAS) and unmanned aerial vehicles (UAVs) are also accepted terms.

5. These can be remotely piloted from the ground and/or operate autonomously and have a number of potential scientific applications (see response to question 2).

6. Autonomous and intelligent systems capable of independent or semi-independent action in dynamic, unpredictable environments present a challenge in a number of scenarios. A proportion of industrial applications maintain a human-in-the-loop model of working, even as machines become more integrated and automated. Some relevant areas1 of research are:

software architectures;

sensor exploitation;

situational awareness;

use in battle and other contested spaces;

operational efficacy;

decision making and planning;

information management;

verification of autonomous systems; and

model building and learning.

Current Utility and Dispersal
For what purposes are RPAS used currently?

7. Within the science community they are used for a number of applications, including species surveys, terrain mapping and geophysics surveys, with researchers also working on applications in areas such as remote inspection in hostile environments, autonomous driving, defence, logistics, security and environmental research (eg atmospheric and climate studies).

8. Unmanned aerial vehicles (UAVs) have important future roles to play in support of civil contingency events such as industrial accidents, natural hazards such as earthquakes and volcanoes, and CBRN-related events.

9. The advantages of UAVs relate to their endurance, their ability to enter hazardous environments and their capacity to perform repetitive activities with high precision. UAVs in themselves provide only the platform for supporting civil and defence sensing activities; they are simply the means by which instruments and sensors may be brought into theatre.

10. The British Antarctic Survey (BAS) uses UAVs during the Antarctic winter to acquire meteorological data and to undertake ice flux measurements over sea ice. Unmanned quadcopters are used for species surveys of seals and penguins.

11. NERC is currently supporting a major research programme examining upper atmospheric interactions, in a collaborative project using the Northrop Grumman-built NASA Global Hawk UAV platform. The programme enables UK participation in high-impact, UAV-enabled science through collaboration with NASA partners. It is studying the chemical and physical properties of the tropical tropopause layer (TTL) and the role of the TTL in controlling the composition of the upper troposphere and lower stratosphere2.

12. UAVs have ISTAR (Intelligence, Surveillance, Target Acquisition and Reconnaissance) applications in battle and other contested spaces. This contributes to situational awareness.

13. Where offensive operations are concerned, UCAVs (Unmanned Combat Aerial Vehicles), such as the Predator and Reaper, are deployed and used.

Lessons Learned from Operations in Afghanistan

14. No response from RCUK, though Professor Wheeler’s Project will cover this area in its plan of work (see para 43).

Tomorrow’s Potential
What Additional Capabilities will the UK Seek to Develop from now to 2020?

15. The Research Councils have provided support for a balanced community by creating a number of opportunities for a healthy research base in this area of research that has encouraged academic excellence, collaboration with industry, training for future leaders and responsible innovation by those working in the field.

16. The development of instrumentation for UAVs is a complex and challenging activity, requiring substantial reductions in mass and power consumption, together with greater autonomy, compared with hand-portable or manned aircraft-mounted sensors. The specialised nature of such devices has, for many applications, resulted in UAV platform capability running some considerable way ahead of the sensors themselves.

17. Autonomous and intelligent systems represent a key potential growth area for UK industry. The Research Councils are committed to understanding the research and training needs of industry and connecting industry and academia to ensure maximum impact from the research we fund.

18. The development of greater payload and endurance platforms and enhanced platform sensors are of key interest for future development.

19. UAVs have a role in environmental research as part of an integrated approach to remote sensing and Earth Observation that also includes space based observations. This was highlighted at a recent Challenge Workshop held by the UKSA Centre for Earth Observation Instrumentation.3

20. EPSRC focuses its efforts on nationally important industrial sectors; of particular relevance in this area are the Aerospace, Defence and Marine sector and the Electronics, Communications and IT sector. EPSRC currently supports a portfolio of approximately £22.1 million of research directly relevant to Unmanned Air Vehicles.

Examples of the key activities supported by EPSRC are given below.

Autonomous and Intelligent Systems Academic Network

21. EPSRC partnered with BAE Systems, Schlumberger, the National Nuclear Laboratory (NNL), Sellafield Ltd, Network Rail, SCISYS, Dstl and the UK Space Agency to fund robotics research and the development of intelligent autonomous systems, such as unmanned aircraft, which are vital to many major UK companies, emerging industries and SMEs, from advanced manufacturing to oil and gas exploration, nuclear energy to railways and automotive, and healthcare to defence.

22. The £16 million partnership between government and industry has helped to bring together a virtual network for Autonomous and Intelligent Systems research in the UK, known for its academic excellence, industrial teaming and exploitation. It has created a broad cross-sector community with a shared understanding of intelligent systems technology and of the challenges common to different sectors. The partnership has established a dynamic and collaborative model between industry and academia, with the promise of real exploitation in the near future. The funded proposals include research into safe ways of monitoring dangerous environments such as deep-sea installations and nuclear power plants; “nursebots” that assist patients in hospitals; and aerial vehicles that can monitor national borders or detect pollution.

23. Examples of the funded projects:4

24. Accessing Hazardous Environments: The University of Oxford will explore how multiple unmanned vehicles can be coordinated to act together to perform different tasks and navigate intelligently without access to aids like GPS. This work can have applications in areas such as remote inspection in hostile environments, autonomous urban driving, defence, logistics, security and space robotics.

25. Improving human-autonomous systems interaction: The University of Bath is to test different models of information gathering, communication and decision-making between humans and autonomous systems, with the aim of improving reaction speed, safety and reliability.

26. The Self-drive submarine: King’s College London plans to demonstrate how Autonomous Underwater Vehicles (AUVs), performing inspection and investigation missions, can cooperate and pool information to achieve success when communications are intermittent and external control is restricted; this could apply to space or other hostile environments. The team will focus on finding ways to address uncertainty and changing conditions, how plans can be modified, and how sensor data are perceived and interpreted. The Committee’s questions relate to platforms in the airborne environment, so this section, which relates to marine vehicles, may not be directly relevant.

27. Improving automated, intelligent maintenance: Cranfield University is extending research in novel sensing, e-maintenance systems and decision-making strategies. The integration of sensor-based information in geographically dispersed and less structured environments poses challenges of technology and cost justification, which will be addressed for rail, aerospace and industrial applications.

Centres for Doctoral Training

28. EPSRC aims to deliver “a balanced skills portfolio to avoid systemic skills shortages in the UK and increase the satisfaction of both students and their employers.”5 Centres for Doctoral Training (CDTs) are one of the three main ways in which EPSRC provides support for doctoral training. Quality (of student, training and research environment) is our first priority, over and above academic excellence, which is a prerequisite. The new round of centres, in which we expect to invest £350 million, is currently being assessed and is due to be announced in December 2013.

29. These flagship investments create Centres that train up to five cohorts of 10 or more students over a period of up to eight years and are a key part of EPSRC’s strategy for supporting the next generation of scientists and engineers. One of the priority areas chosen is Autonomous Systems and Robotics. This priority area covers training in the fundamental skills underpinning the requirements of the autonomous systems and robotics industrial and research capability, and the development of platform combinations of sub-disciplines to create flexible self-functioning systems.

30. Highly skilled researchers are required to maximise the wide-ranging UK strengths and future opportunities of this area, which include surgical and service robotics, manufacturing, and the broad applications of autonomous systems within leading UK sectors such as automotive, oil and gas, and aerospace and defence. Proposals should include a cross-sector approach, with training in underpinning aspects of autonomous systems research spanning the breadth of engineering and ICT. Links to users and clients should be tangible and relevant, and not confined to a single sector. Centre-based cohort training is essential to develop deep fundamental knowledge of individual aspects of autonomous systems, to consider ethical implications, and to apply that knowledge to catalyse vital cross-sector advancement in the field.

Capital for Great Technologies: Robotics and Autonomous Systems

31. The Chancellor of the Exchequer announced additional capital funding for the eight great technologies in the pre-budget statement. The EPSRC has invested the £25 million funding it received to strengthen research capability by supporting requests for capital equipment in the area of Robotics and Autonomous systems research. The main objectives of the call were to:

Strategically grow the research base in this area for the UK’s benefit, focussed on areas of strength or areas where the UK has the potential to lead.

Deliver advanced technology with the potential for translation and take-up.

Give talented people access to state-of-the-art equipment.

32. The £25 million awarded to 10 UK Universities by EPSRC attracted additional funding contributions of £8.4 million from higher education institutions and £6 million from industrial partners.6

33. NERC is investing £10 million for research and technology development of marine autonomous robotics. The UK has a strong track record in development of autonomous underwater vehicles. NERC, particularly through its National Oceanography Centre (NOC), has over several decades led the world in use of autonomous technologies in the exploration of our oceans, which poses significant technical challenges whilst offering enormous scientific opportunity.

34. Marine autonomous robotic vehicles also pave the way for the development of new UK industries like offshore carbon capture and storage, where in situ monitoring of storage sites in the North Sea will need high levels of technical assurance and innovation.7


35. Researchers at the universities of Southampton, Oxford and Nottingham have been awarded £5.5 million in EPSRC funding to establish the ORCHID programme. The ORCHID programme aims to develop a detailed understanding of the collaborative relationships that can be formed between humans and computational agents and ensure that these “human-agent collectives” are able to operate in a robust, reliable and trustworthy manner.

36. The ORCHID team are working with a number of industrial partners to apply their findings to a range of real-world applications, including the development of a disaster response system that is designed to bring together unmanned autonomous vehicles and first responders.

Summer school

37. EPSRC support has contributed to Cambridge hosting an international summer school on Human-Robot Interaction (HRI) in August 2013. The summer school brought together students and early career researchers with internationally recognised experts in the emerging field of HRI from academia and industry. HRI research offers the potential to improve the capability of unmanned autonomous systems by developing new and improved control mechanisms. For example, EPSRC-funded researchers at the University of Bath are investigating new technologies to support collective decision making between humans and autonomous agents.

Special Interest Group in Robotics and Autonomous Systems

38. This Special Interest Group in Robotics and Autonomous Systems (RAS SIG) has been established to support the development of a new industrial RAS sector in the UK in order to realise increased productivity and growth. Members include BIS, the Technology Strategy Board and EPSRC.

39. Currently much of the required capability in this area is fragmented across universities and companies, and across market sectors, though it is world-class in many respects. The SIG will support the development of a more coherent community, using existing Knowledge Transfer Network infrastructure to enable the relevant knowledge to be shared more readily. The RAS SIG will develop a strategy for Robotics and Autonomous Systems that will identify UK resources and capabilities to respond to market drivers, with the specific aims of creating jobs and wealth, and of promoting public understanding of the role that RAS can play in growing the UK economy and in meeting societal needs.

Constraints on the use of RPAS in the UK and Overseas

40. In the UK, restrictions on operating in controlled airspace are a significant constraint. In Antarctica these constraints do not exist. For science, the term UAS is most appropriate, as the unmanned platform is not effective without the requisite sensors; the combination of sensors and platform is what matters most.

41. The costs and timescales of developing CBRN sensors for UAVs are potentially very high; however, such devices share many common technologies with environmental sensors. Environmental science has been an early adopter of UAV technologies, and many devices being developed for environmental UAV sensing have potential dual uses in a defence context, although the exchange of innovation between these sectors is less than perfect. Accepting the limitations imposed by security, substantial gains can be made by improving the flow of technological knowledge for UAV sensing between sectors and by a greater appreciation of common requirements. Sharing of this kind is now beginning: for example, NERC’s National Centre for Atmospheric Science and the University of York are working with AWE plc on mapping environmental airborne sensing capability to support national security applications.

42. The UK has a dynamic instrument development sector, both commercial and within the university base. A barrier to the acceleration of new UAV-borne sensors, and by extension to the application of UAVs in new roles, has been the lack of accessible large-UAV test-bed facilities within the UK. The national position would be greatly enhanced by the provision of a shared development and test UAV platform for new sensor technologies, not only for the defence and environmental sectors but also potentially in areas such as communications, precision agriculture and aviation research.

43. In terms of political constraints, the ESRC has funded Professor Nick Wheeler, University of Birmingham, to pursue a project on The Political Effects of Unmanned Aerial Vehicles on Conflict and Co-operation within and between States. The primary objective of the project is twofold: first, to contribute to building the evidentiary base for informed policy-making on the use of US/UK drones in overseas theatres of operation; and second, to explore how far different values, belief systems, narratives and historical contexts lead to radically different interpretations of whether US/UK drone strikes are increasing or decreasing the security of both the intervening and targeted actors. To this end the project addresses the research question of whether the use of drones by a state on the territory of another actor increases or decreases the propensities for conflict and cooperation both within and between those actors. The project started in September 2013.

Ethical and Legal Issues Arising from the use of RPAS

Principles of Robotics: regulating robots in the real world8

44. In September 2010, experts drawn from industry, technology, the arts, law and social sciences met at the joint EPSRC and AHRC Robotics Retreat to discuss robotics, its applications in the real world and the huge amount of promise it offers to benefit society.

45. Robots have left the research lab and are now in use all over the globe, in homes and in industry. We expect robots in the short, medium and long term to impact our lives at home, our experience in institutions, our national and our global economy, and possibly our global security.

46. However, the realities of robotics are still relatively little known to the public, for whom science fiction and media images of robots have dominated. One of the aims of the meeting was to explore what steps should be taken to ensure that robotics research engages with the public, so that this technology is integrated into our society to the maximum benefit of all of its citizens. As with all technological innovation, we need to try to ensure that robots are introduced from the beginning in a way that is likely to engage public trust and confidence; maximise the gains for the public and commerce; and proactively head off any potential unintended consequences.

47. The rules for real robots, in real life, must be transformed into rules advising those who design, sell and use robots about how they should act. The meeting delegates devised such a set of “rules” with the aim of provoking a wider, more open discussion of the issues. They highlight the general principles of concern expressed by the Group, with the intent that they could inform designers and users of robots in specific situations. These new rules for robotics (not robots) are outlined below; please see the appendices for Tables 2 and 3.

48. The five ethical rules for robotics are intended as a living document. They are not intended as hard-and-fast laws, but rather to inform debate and for future reference. Obviously a great deal of thinking has been done around these issues and this document does not seek to undermine any of that work but to serve as a focal point for useful discussion.

49. Professor Wheeler’s project will look at these issues with regard to Afghanistan, Pakistan, Yemen and Somalia. Of particular importance here are considerations relating to International Humanitarian Law, International Human Rights Law and, where applicable, US Domestic Law. This is particularly contested with regard to intervention under the former. There are also legal questions on intelligence sharing which the project will cover.

Trusted Autonomous Systems Fellowship, Professor AR Lomuscio

50. EPSRC funded Leadership Fellow Professor AR Lomuscio, Imperial College London, to develop the scientific and engineering underpinnings for autonomous systems (AS) to become part of our everyday lives. To achieve this, Professor Lomuscio is researching the formulation of logic-based languages for the principled specification of AS; the development of efficient model checking techniques; the construction and open-source release of a state-of-the-art model checker for autonomous systems, to be used for use-case certification; and the validation of these techniques in three key areas of immediate and mid-term societal importance: autonomous vehicles, services and e-health. The fellowship pursues novel techniques in computational logic to answer these technical challenges. Success in these areas will open the way for the verification of AS, and thereby to their certification for mainstream use in society.

Responsible innovation project

51. The increasing use and capability of autonomous systems raises complex ethical issues for researchers. An EPSRC-funded research network led by the Oxford e-Research Centre aims to raise the profile of ethical issues within the academic research base by bringing researchers together to discuss ethical challenges and possible responses. In addition to raising awareness, the network aims to develop recommendations and ensure that best practice is disseminated to researchers.

September 2013


Table 1


1.0 Software Architectures

This area of research addresses software architectures for housing intelligent autonomous systems software. Software architectures are essential for organising the component software functions that form an autonomous system. A good architectural approach offers modularity, traceability, certifiability and robustness.


2.0 Sensor Exploitation

Autonomous systems are data driven and rely on stored and/or sensed data to create plans and make decisions. Effective exploitation of sensor and stored data improves the performance of an autonomous system through inferring useful information from potentially large volumes of sensor data.
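The principle described above can be made concrete with a small, purely illustrative sketch (not part of the original submission; the instruments and noise figures are invented): combining two noisy readings of the same quantity by inverse-variance weighting, so that the fused estimate is less noisy than either sensor alone.

```python
# Purely illustrative sketch: the sensors and noise figures are invented.
def fuse_readings(readings):
    """Combine (value, variance) pairs into a single estimate and its variance."""
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    estimate = sum(w * value for w, (value, _) in zip(weights, readings)) / total
    return estimate, 1.0 / total

# Two instruments measure the same air temperature with different noise levels.
noisy_sensor = (14.8, 4.0)      # reading in degrees C, error variance
precise_sensor = (15.4, 1.0)

estimate, variance = fuse_readings([noisy_sensor, precise_sensor])
print(round(estimate, 2), round(variance, 2))   # → 15.28 0.8
```

The fused variance (0.8) is lower than that of either individual sensor, which is the sense in which exploiting several data sources together "infers useful information" beyond any single feed.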


3.0 Situational Awareness and Information Abstraction

Autonomous systems have to create an internal model (world model) of the external environment if they are to simulate a degree of human awareness.

Autonomous systems also need awareness of their own current internal state and predicted future state, so that their own capabilities and limitations are understood. This should help to make systems more robust to failures and capable of self-sustainment.

A combination of information inferred from real-time sensor data feeds together with stored or communicated reference information provides means, through reasoning, to raise the abstraction level to provide inferred contextualisation and beliefs to inform planning and decision making processes.


4.0 Planning

Processes, algorithms and tools for creating, maintaining and metricating plans. Plan formats in this sense are generic, and could be seen as a set of simple instructions or steps for a machine to follow. Plans may be expressed at varying levels, including at the level of high-level goals which inform detailed planning within the autonomous system.
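The generic plan format described above can be sketched, purely for illustration (the goal names and the decomposition table are invented, not drawn from this submission), as a high-level goal expanded recursively into a flat list of simple steps for a machine to follow:

```python
# Purely illustrative sketch: goal names and decompositions are invented.
DECOMPOSITIONS = {
    "survey_area": ["take_off", "fly_grid", "collect_imagery", "return_home"],
    "fly_grid": ["fly_leg_north", "turn", "fly_leg_south"],
}

def expand(goal):
    """Recursively expand a high-level goal into a flat list of primitive steps."""
    if goal not in DECOMPOSITIONS:
        return [goal]                      # already a simple instruction
    steps = []
    for subgoal in DECOMPOSITIONS[goal]:
        steps.extend(expand(subgoal))
    return steps

plan = expand("survey_area")
print(plan)
```

Here the high-level goal "survey_area" informs detailed planning within the system, exactly in the sense of plans being expressed at varying levels.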


5.0 Trusted Decision Making and Human Machine Interaction

Methods, techniques and algorithms for machine decision making, and for human interaction with the decision-making system. Solutions should consider varying degrees of software integrity levels in the chosen solution.


6.0 Information Management

Intelligent methods of managing information transfer from an autonomous system, including managing bandwidth, prioritising communications, observing operating constraints and optimising the transfer of information.
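As a purely illustrative sketch of prioritised, bandwidth-aware transfer (the message names, sizes and priorities are invented for the example), a priority queue can decide which messages to send when the available bandwidth is limited:

```python
# Purely illustrative sketch: message names, sizes and priorities are invented.
import heapq

def select_transmissions(messages, budget_kb):
    """Choose the highest-priority messages that fit within a bandwidth budget.

    messages: (priority, size_kb, payload) tuples; a lower number means a
    higher priority.
    """
    heap = list(messages)
    heapq.heapify(heap)                    # orders by priority, then size
    sent, used = [], 0
    while heap:
        _priority, size, payload = heapq.heappop(heap)
        if used + size <= budget_kb:
            sent.append(payload)
            used += size
    return sent

queue = [(2, 500, "full_image"), (1, 5, "position_fix"),
         (3, 50, "telemetry_log"), (1, 10, "fault_alert")]
print(select_transmissions(queue, 100))
```

The large, lower-priority image is deferred while the small, urgent position fix and fault alert are sent, illustrating prioritisation under an operating constraint.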


7.0 Verification and Validation of Autonomous Systems

Autonomous systems can be non-deterministic and existing methods of testing systems software may not provide sufficient evidence to support certification. New techniques for the verification and validation of complex decision making and planning software are needed.
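One family of such techniques is explicit-state model checking, as pursued in Professor Lomuscio's fellowship (para 50). The following minimal sketch (a toy, invented example, not any system described in this evidence) exhaustively explores the reachable states of a non-deterministic system and reports a trace that violates a safety property:

```python
# Purely illustrative sketch: the toy "altitude controller" below is invented.
from collections import deque

def transitions(alt):
    """Non-deterministic successors: the controller may climb or descend."""
    return [alt + 1, alt - 1]

def check_safety(initial, safe, max_depth=10):
    """Breadth-first search over all reachable states.

    Returns a counterexample trace if some reachable state violates the safety
    property, otherwise None (within the bounded depth).
    """
    frontier = deque([(initial, [initial])])
    seen = {initial}
    while frontier:
        state, trace = frontier.popleft()
        if not safe(state):
            return trace                   # property violated: report the path
        if len(trace) > max_depth:
            continue
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None

# Safety property: altitude never goes below zero.
counterexample = check_safety(initial=2, safe=lambda alt: alt >= 0)
print(counterexample)   # → [2, 1, 0, -1]
```

Unlike testing, which samples behaviours, this style of verification enumerates every reachable state, which is why it can supply the stronger evidence certification requires; scaling it to realistic systems is the open research challenge.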


8.0 Model Building and Learning

Autonomous systems will carry out their activities by reference to governing reference models of various types, which will place hard and soft constraints on the operation of the system, as well as more general guidance. Over time, such reference models need to be developed and to adapt to new external influences and to learning about effective operation.

Table 2


Note: each rule is presented in three forms: a semi-legal version; a looser, but easier to express, version that captures the sense for a non-specialist audience; and a commentary on the issues being addressed and why the rule is important.



Robots are multi-use tools. Robots should not be designed solely or primarily to kill or harm humans, except in the interests of national security.

Robots should not be designed as weapons, except for national security reasons.

Tools have more than one use. We allow guns to be designed which farmers use to kill pests and vermin but killing human beings with them (outside warfare) is clearly wrong. Knives can be used to spread butter or to stab people. In most societies, neither guns nor knives are banned but controls may be imposed if necessary (eg gun laws) to secure public safety. Robots also have multiple uses. Although a creative end-user could probably use any robot for violent ends, just as with a blunt instrument, we are saying that robots should never be designed solely or even principally, to be used as weapons with deadly or other offensive capability. This law, if adopted, limits the commercial capacities of robots, but we view it as an essential principle for their acceptance as safe in civil society.



Humans, not robots, are responsible agents. Robots should be designed & operated as far as is practicable to comply with existing laws & fundamental rights & freedoms, including privacy.

Robots should be designed and operated to comply with existing law, including privacy.

We can make sure that robot actions are designed to obey the laws humans have made.

There are two important points here. First, of course no one is likely to set out deliberately to build a robot which breaks the law. But designers are not lawyers and need to be reminded that building robots which do their tasks as well as possible will sometimes need to be balanced against protective laws and accepted human rights standards. Privacy is a particularly difficult issue, which is why it is mentioned. For example, a robot used in the care of a vulnerable individual may well be usefully designed to collect information about that person 24/7 and transmit it to hospitals for medical purposes. But the benefit of this must be balanced against that person’s right to privacy and to control their own life, eg by refusing treatment. Data collected should only be kept for a limited time; again the law puts certain safeguards in place. Robot designers have to think about how laws like these can be respected during the design process (eg by providing off-switches).

Secondly, this law is designed to make it clear that robots are just tools, designed to achieve goals and desires that humans specify. Users and owners have responsibilities as well as designers and manufacturers. Sometimes it is up to designers to think ahead because robots may have the ability to learn and adapt their behaviour. But users may also make robots do things their designers did not foresee. Sometimes it is the owner’s job to supervise the user (eg if a parent bought a robot to play with a child). But if a robot’s actions do turn out to break the law, it will always be the responsibility, legal and moral, of one or more human beings, not of the robot (We consider how to find out who is responsible in law 5, below).



Robots are products. They should be designed using processes which assure their safety and security.

Robots are products: as with other products, they should be designed to be safe and secure.

Robots are simply not people. They are pieces of technology that their owners may certainly want to protect (just as we have alarms for our houses and cars, and security guards for our factories), but we will always value human safety over that of machines. Our principal aim here was to make sure that the safety and security of robots in society would be assured, so that people can trust and have confidence in them.

This is not a new problem in technology. We already have rules and processes that guarantee that, eg household appliances and children’s toys are safe to buy and use. There are well worked out existing consumer safety regimes to assure this: eg industry kite-marks, British and international standards, testing methodologies for software to make sure the bugs are out, etc. We are also aware that the public knows that software and computers can be “hacked” by outsiders, and processes also need to be developed to show that robots are secure as far as possible from such attacks. We think that such rules, standards and tests should be publicly adopted or developed for the robotics industry as soon as possible to assure the public that every safeguard has been taken before a robot is ever released to market. Such a process will also clarify for industry exactly what they have to do.

This still leaves a debate open about how far those who own or operate robots should be allowed to protect them from eg theft or vandalism, say by built-in taser shocks. The group chose to delete a phrase that had ensured the right of manufacturers or owners to include “self-defence” capability into a robot. In other words we do not think a robot should ever be “armed” to protect itself. This actually goes further than existing law, where the general question would be whether the owner of the appliance had committed a criminal act like assault without reasonable excuse.


Robots are manufactured artefacts. They should not be designed in a deceptive way to exploit vulnerable users; instead their machine nature should be transparent.

Robots are manufactured artefacts: the illusion of emotions and intent should not be used to exploit vulnerable users.

One of the great promises of robotics is that robot toys may give pleasure, comfort and even a form of companionship to people who are not able to care for pets, whether due to rules of their homes, physical capacity, time or money. However, once a user becomes attached to such a toy, it would be possible for manufacturers to claim the robot has needs or desires that could unfairly cost the owners or their families more money. The legal version of this rule was designed to say that although it is permissible, and even sometimes desirable, for a robot to give the impression of real intelligence, anyone who owns or interacts with a robot should be able to find out what it really is and perhaps what it was really manufactured to do. Robot intelligence is artificial, and we thought that the best way to protect consumers was to remind them of that by guaranteeing a way for them to “lift the curtain” (to use the metaphor from The Wizard of Oz).

This was the most difficult law to express clearly and we spent a great deal of time debating the phrasing used. Achieving it in practice will need still more thought. Should all robots have visible bar-codes or similar? Should the user or owner (eg a parent who buys a robot for a child) always be able to look up a database or register where the robot’s functionality is specified? See also rule 5 below.



The person with legal responsibility for a robot should be attributed.

It should be possible to find out who is responsible for any robot.

In this rule we try to provide a practical framework for what all the rules above already implicitly depend on: a robot is never legally responsible for anything. It is a tool. If it malfunctions and causes damage, a human will be to blame. Finding out who the responsible person is may not, however, be easy. In the UK, a register of who is responsible for a car (the “registered keeper”) is held by the DVLA; by contrast, no one needs to register as the official owner of a dog or cat. We felt the first model was more appropriate for robots, as there will be an interest not just in stopping a robot whose actions are causing harm: people affected may also wish to seek financial compensation from the person responsible.

Responsibility might be practically addressed in a number of ways. For example, one way forward would be a licence and register (just as there is for cars) that records who is responsible for any robot. This might apply to all robots, or only where ownership is not obvious (eg for a robot that might roam outside a house or operate in a public institution such as a school or hospital). Alternatively, every robot could be released with a searchable online licence which records the name of the designer/manufacturer and the responsible human who acquired it (such a licence could also specify the details we discussed in rule 4 above). There is clearly more debate and consultation required.

Importantly, it should still remain possible for legal liability to be shared or transferred eg both designer and user might share fault where a robot malfunctions during use due to a mixture of design problems and user modifications. In such circumstances, legal rules already exist to allocate liability (although we might wish to clarify these, or require insurance). But a register would always allow an aggrieved person a place to start, by finding out who was, on first principles, responsible for the robot in question.

Table 3


In addition to the above principles, the group also developed an overarching set of messages designed to encourage responsibility within the robotics research and industrial community, and thereby build trust in the work it does. The spirit of responsible innovation is, for the most part, already present, but we felt it worthwhile to make this explicit. The following commentary explains the principles.




We believe robots have the potential to provide immense positive impact to society. We want to encourage responsible robot research.

This was originally the “0th” rule, which we came up with midway through the exercise. But we want to emphasize that its entire point is positive, even though some of the rules can be seen as negative, restrictive or even fear-mongering. We think fear-mongering has already happened, and, further, that there are legitimate concerns about the use of robots. We think the work here is the best way to ensure the potential of robotics for all is realised while avoiding the pitfalls.



Bad practice hurts us all.

It’s easy to overlook the work of people who seem determined to be extremist or irresponsible, but doing this could easily put us in the position that GM scientists are in now, where nothing they say in the press has any consequence. We need to engage with the public and take responsibility for our public image.



Addressing obvious public concerns will help us all make progress.

The previous note applies also to concerns raised by the general public and science fiction writers, not only our colleagues.



It is important to demonstrate that we, as roboticists, are committed to the best possible standards of practice.

As above.



To understand the context and consequences of our research we should work with experts from other disciplines including: social sciences, law, philosophy and the arts.

We should understand how others perceive our work, and what its legal and social consequences may be. We must work out how best to integrate our robots into the social, legal and cultural framework of our society, and how to engage in conversation about the real abilities of our research with people from a variety of cultural backgrounds, who will look at our work with a wide range of assumptions, myths and narratives behind them.



We should consider the ethics of transparency: are there limits to what should be openly available?

This point was illustrated by an interesting discussion about open-source software and operating systems, in a context where the systems that can exploit this software have the additional capacities that robots have. What do you get when you give “script kiddies” robots? We are all very much in favour of the open-source movement, but we think we should seek help in thinking about this particular issue, and about the broader issues around open science generally.



When we see erroneous accounts in the press, we commit to take the time to contact the reporting journalists.

Many people are frustrated when they see outrageous claims in the press. But in fact science reporters do not really want to be made fools of, and in general such claims can be corrected, and sources discredited, by a quiet and simple word to the reporters on the byline. A campaign like this was already run successfully once in the late 1990s.

1 Please see appendices for further information on the EPSRC definitions of these (Table 1).

2 See http://www.nerc.ac.uk/research/programmes/tropopause/ and http://www.nerc.ac.uk/press/releases/2013/02-airborne.asp

3 http://www.ceoi.ac.uk/index.php?option=com_content&view=article&id=109:future-platforms-for-eo&catid=10:latest&Itemid=5

4 Full details of the projects funded from this call can be found here http://gow.epsrc.ac.uk/NGBOViewPanelROL.aspx?PanelId=1-25869652&RankingListId=1-FEH5L

5 See EPSRC Strategic Plan 2010: http://www.epsrc.ac.uk/SiteCollectionDocuments/Publications/corporate/EPSRC_strategic_plan_2010.pdf

6 http://www.epsrc.ac.uk/newsevents/news/2013/Pages/85million.aspx

7 http://www.nerc.ac.uk/press/briefings/2013/03-investment.asp

8 http://www.epsrc.ac.uk/research/ourportfolio/themes/engineering/activities/Pages/principlesofrobotics.aspx

Prepared 24th March 2014