This is a House of Commons committee report, with recommendations to government. The Government has two months to respond.
Developing AI capacity and expertise in UK Defence
Date Published: Friday 10 January 2025
Artificial Intelligence has the potential to transform Defence in fundamental ways, from back office functions to the front line, and to provide a decisive advantage in military competition and conflict. AI is already being deployed effectively in Ukraine, showing that AI is no longer something on the horizon but a reality with which Defence must engage. Given the breadth of potential applications of AI there are few areas of Defence which would not benefit from being AI-enabled or AI-enhanced, and so MOD needs to start thinking about AI as an integral part of how it solves problems and meets its objectives. The UK has the potential to be home to a first-class defence AI sector, but at present the sector is under-developed and requires cultivation by MOD. This entails both practical and cultural change.
Developing a thriving defence AI sector will require improvements in digital infrastructure, data management and the AI skills base, and we urge MOD to identify where gaps exist so that work can begin to address these issues. We suggest some specific actions it can take, such as making AI a greater part of military education and making it easier for AI specialists to move between the civilian and defence sectors. MOD is also likely to need to work with smaller and non-traditional defence suppliers who currently face barriers to working with Defence, and the department needs to adapt its ways of working to make itself a more appealing and effective partner for the sector. It needs to become more comfortable with risk-taking, rapid development cycles and working with suppliers from outside the traditional defence industry. We urge MOD to overcome the barriers that currently prevent such companies from working with Defence, such as complex procurement processes and difficulties obtaining security clearance for staff.
Beyond these practical changes, MOD needs to undergo a wider cultural change to adapt to a world where military advantage is increasingly delivered by digital capabilities and cheaper platforms that can be rapidly developed, deployed and iterated. MOD policy documents recognise this, but there is a gap between the department’s rhetoric and the reality, and too often AI is still treated as a novelty rather than as something that will soon be a core part of defence’s toolkit. The ongoing Strategic Defence Review is an ideal moment for Defence to accelerate the required cultural transformation and to modernise both its capabilities and its mindset for the new AI-enabled era.
AI-enabled systems will be most effective if they are interoperable with those of the UK’s allies. The UK and allies would benefit from a mutual understanding of their objectives in developing and deploying defence AI, and shared standards and practices where appropriate. Pillar 2 of the AUKUS Partnership is a generational opportunity for the UK Defence AI sector to work with Australian and US allies at the frontier of AI. We urge the Government to encourage closer partnership between industry in the AUKUS partner nations, including through the establishment of multinational centres of excellence.
We recognise that the use of AI in defence raises important ethical questions. The House of Lords AI in Weapon Systems Committee published a comprehensive report on Lethal Autonomous Weapon Systems in December 2023, and the decision was taken not to duplicate that work. Our report therefore focuses on the UK’s ability to develop and deploy AI in defence.
1. Rapid advances in Artificial Intelligence have the potential to dramatically transform many aspects of our world, and Defence is no exception. Recent conflicts have demonstrated that the use of AI in defence and security is not a theoretical concern or a problem for the future, but a reality with which Defence must engage. In the war in Ukraine both sides have used AI for various purposes: for example, Ukraine uses AI to analyse battlefield data for command and control and damage assessment, and to analyse intelligence and open-source data from social media to geolocate Russian forces and combat disinformation.1 Some Ukrainian drones also use AI for targeting and flight control, enabling them to identify and attack targets in areas where electronic jamming disrupts communication with their human pilots.2 Russian use of AI is harder to verify, but there is evidence that it has been used for target identification in loitering munitions, among other purposes.3 Much of the AI used by Ukraine is still deployed in a limited or supporting capacity, but it is nonetheless a cornerstone of the country’s war effort.4 The experience of Ukraine shows that AI is delivering military advantage in modern conflict, and it can be assumed that it will continue to do so. Defence must recognise this new reality and react accordingly, or risk ceding military advantage to the UK’s adversaries.
2. In view of the above, we believe it is essential that the UK is able to develop, adopt and deploy AI effectively if it is to meet the defence challenges it faces today and will confront in the years ahead. In the previous Parliament, the Defence Sub-Committee launched an inquiry to examine the UK’s approach to developing and adopting Defence AI. The terms of reference for this inquiry were as follows:
The inquiry’s work was interrupted by the July 2024 general election before the Sub-Committee could report. The UK’s approach to Defence AI needs closer scrutiny, and so we decided to continue the work of that Sub-Committee, drawing on the written and oral evidence it received.5
3. The use of Artificial Intelligence for Defence raises important questions about how the technology can be deployed in a safe, reliable and accountable way, particularly when AI would be used to deliver lethal force. The House of Lords AI in Weapon Systems Committee published a report on Lethal Autonomous Weapon Systems in December 2023, which considered those questions in great detail. The previous Sub-Committee agreed that its inquiry should not duplicate the work of the Lords Committee, and as such did not invite evidence on those matters. We therefore do not examine those questions in this Report.
4. In total the previous Sub-Committee received twenty-two submissions from a range of stakeholders including academics, defence primes, think tanks and AI companies. We are grateful to all those who took the time to share their views and insights.
5. In February and March 2024 the Sub-Committee held a series of oral evidence sessions, as follows:
We are grateful to all our witnesses for their time and for their helpful contributions to this work.
6. To support this inquiry we and the Sub-Committee appointed a Specialist Adviser, Kenneth Payne, Professor of Strategy at King’s College London.6 We are hugely grateful to Professor Payne for his advice and support throughout the inquiry.
7. Our report is structured as follows:
8. There is no generally agreed definition of Artificial Intelligence. The UK’s National AI Strategy offers: “Machines that perform tasks normally requiring human intelligence, especially when the machines learn from data how to do those tasks”. However, the Strategy acknowledges that this may not be applicable in all cases.7 AI encompasses, but is not limited to, software which uses ‘machine learning’ to improve performance. The Ministry of Defence’s Defence AI Strategy further describes AI as “a family of general-purpose technologies” capable of tasks such as language processing, data classification and predictive analysis.8 MOD gives examples of ways in which these technologies can be applied to defence, including such varied applications as object detection, analysing radio frequency signals, career management, last-mile resupply, and autonomous mine hunting.9 The above are appropriate definitions for the purposes of our inquiry.
9. There is little publicly available research on the size and characteristics of the UK’s Defence AI sector. In 2023 research commissioned by the UK Government identified 3,713 AI companies in the UK, of which 2,204 have AI products or infrastructure at the core of their business models.10 It is unknown how many of the UK’s AI companies work on defence: 33 per cent of firms work on computer vision and image processing and a further 29 per cent on autonomous systems—fields which the think tank RAND Europe noted in their written evidence are “highly relevant to defence”11—but this part of the sector will encompass many non-defence companies as well. AI is a rapidly growing sector which is expected to expand significantly in the coming years: evidence from KBR and Frazer-Nash Consultancy estimates that the UK’s military AI sector was worth approximately £285 million in 2023, and projects this will grow to £1.2 billion by 2028.12
10. There is limited data on the characteristics of the UK’s Defence AI companies, but our evidence indicates that they range from established defence primes for whom AI forms a small part of their operations to start-ups dedicated specifically to Defence AI. Most companies involved in AI development are relatively small: Dr Simona Soare, Professor of Innovation, Technology and Strategy at Lancaster University, described the sector as a “maturing ecosystem” in which 75–80 per cent of companies were small enterprises or start-ups.13 In Defence, this category includes Adarga, AdvAI, Skyral, Ripjar and Mind Foundry.14 AI development crosses international boundaries, and multinational companies that are global leaders in AI, such as Microsoft and Amazon, have a presence in the UK.15 There are also international companies with a presence in the UK which specialise in defence applications of AI, such as Helsing and Anduril.16
11. The AI sector is relatively nascent, and it is too early to tell how the UK’s AI and Defence AI sectors will develop. However, our evidence is clear that the UK has advantages that could encourage the growth of a successful sector, including excellent universities and a strong research sector with existing strengths in relevant disciplines like computing and mathematical sciences.17 The UK also possesses significant computing power (compute) capability, an important asset for the development of advanced AI, and has a substantial financial sector that can attract investment into advanced research.18 The UK also has institutional advantages that make it a good environment for AI companies and which can attract investors, including a strong regulatory regime and effective rule of law.19 All of these mean the UK is equipped with what James Black of RAND Europe called “quite good generic strengths” to support a successful AI sector.20
12. At the same time, some witnesses pointed to weaknesses in the UK sector at present. Dr Simona Soare highlighted that while there is certainly an AI ‘ecosystem’ in the UK, “there is no particular separate defence AI ecosystem.” Furthermore, the turnover of start-ups is very high, with fewer than one in five companies lasting four years or more. This meant there was little resilience in the ecosystem, which would make it challenging to scale up defence AI capacity in the UK.21 The Sub-Committee also heard that the approach to AI across defence lacked coherence within the Services themselves. Retired Air Marshal Edward Stringer told us that “across defence AI, you have some very good people working very hard, but within a slightly fragmented system.”22 Although the UK generally is a dynamic environment for venture capital (VC) investment, Dr Mikolaj Firlej of the AI Institute at the University of Surrey wrote that the UK Defence AI sector was “underinvested”, with only a few smaller VC funds investing.23
13. Although the UK’s AI and Defence AI sectors are small in absolute terms, the country performs relatively well compared to its peers, as the sector is still in the early stages of its development globally. RAND Europe reported that the UK has the third largest number of AI firms worldwide.24 Global AI Indices compiled by Oxford Insights and Tortoise, which rank countries on a combination of factors, place the UK third and fourth respectively.25 Although these indices do not rank countries’ strength in Defence AI specifically, there are reasons to believe the UK performs relatively well in this field: Dr Simona Soare noted that the UK invests considerably more in AI than its European peers, and is estimated to invest twice as much in Defence AI as France and Germany.26 At the same time, while the UK is ahead of many of its peers, in many key indicators it lags far behind the global leaders in AI, the United States and China.27 Both countries’ total government spending on Artificial Intelligence is more than four times that of the UK Government, and the number and processing power of supercomputers in the US and China far outstrip those of the UK.28 Some submissions took the view that it was not possible for the UK to compete with the scale and investment power of the US sector, but that the UK could capitalise on its existing strengths to develop world-leading expertise in some areas of AI.29
14. Although our written evidence and discussions with witnesses enabled us to draw the above comparisons, a lack of publicly available data meant we were only able to gain a partial picture of how UK Defence AI compares to that of peer nations. Dr Simona Soare said there was little reported data on the size and structure of the UK’s Defence AI ecosystem.30 Dr Keith Dear observed that there is no formal benchmarking used by MOD to systematically compare the UK’s ecosystem with that of allies or adversaries.31
15. conclusion
The UK has many of the right conditions that would allow it to be a global leader in the development of Defence AI, but at present Defence is an under-developed aspect of the AI ecosystem in the UK, and the gap between the UK and the current global leaders in AI, the United States and China, is significant. The UK cannot and should not aim to match those countries’ sectors in terms of scale but should instead seek to specialise in areas of strength and achieve a first-class level of sophistication in these.
16. recommendation
AI is becoming increasingly critical to effective defence, and therefore the UK’s ambition must be to have a first-class Defence AI ecosystem. The MOD should establish measures by which it will compare the UK’s sector against others internationally, so that the sector’s strength relative to those of its peers can be tracked.
17. As Artificial Intelligence has become increasingly advanced, UK Government policy has paid it increasing attention and the Government has published policy papers, including on defence AI specifically. The UK’s first National AI Strategy was published in September 2021, under the previous Government, with the ambition of making Britain a ‘global AI superpower’ over the course of a ten-year plan.32 AI has also featured increasingly in Ministry of Defence policy documents: the 2018 Modernising Defence Programme identified AI and autonomy as two ways in which the character of warfare is changing, and said AI and machine learning was a ’family’ of technologies on which the department would adopt “a more coordinated approach.”33 The 2019 Defence Technology Framework and the 2023 Defence Command Paper both refer to AI as a critical technology with transformative potential for Defence.34 In June 2022 MOD published the Defence Artificial Intelligence Strategy,35 which is discussed in detail in the next sub-section. As Government policy on Defence AI has taken shape new institutions have been created, including the Defence Artificial Intelligence and Autonomy Unit (DAU) and the Defence AI Centre (DAIC). These are discussed in greater detail in the ‘Roles and responsibilities’ section later in this chapter. (Paragraph 40)
18. AI has also featured increasingly in public statements made by senior defence ministers and military figures.36 However, our evidence points to a gap between rhetoric and reality—a “say-do gap”—when it comes to AI. Dr Keith Dear commented that although speeches by senior leadership frequently refer to AI, these are rarely accompanied by specifics about what MOD wants to achieve, how, and by when. He said the consequence of this was:
What is incentivised is sales and marketing material, pitches that are ‘PowerPoint deep’, while suppliers await clearly funded procurement opportunities and can see the specifications they need to meet, rather than the development of real capabilities that are competing for MOD funding.37
The Adam Smith Institute’s evidence pointed out the mismatch between the frequency of AI appearing in policy speeches and the very small number of AI contracts awarded by MOD.38 Throughout the inquiry participants noted that there were few examples of Defence AI applications making it beyond small-scale experimentation and actually being adopted by the Armed Forces.39 This issue is explored in greater detail in the ‘Incentivising innovation’ section in the next chapter. (Paragraph 60)
19. In July 2024 the new Prime Minister launched a Strategic Defence Review (SDR), the purpose of which is to “determine the roles, capabilities and reforms required by UK Defence to meet the challenges, threats and opportunities of the twenty-first century.”40 The terms of reference for the SDR do not specifically mention AI, but the themes to be considered by the review include “opportunities for modernisation and transformation … including through the rapid and consistent application of Digital Age technologies,” and “the approach to be taken to acquisition … and how to secure the best possible value for money and rapidly changing technology.”41 The Government also published a Statement of Intent for its Defence Industrial Strategy in December 2024, which lists AI among the areas where “the UK can develop comparative advantage through the Industrial Strategy.”42
20. conclusion
There has been a gap between rhetoric and reality when it comes to Defence AI for some time. MOD statements and policy documents speak about Artificial Intelligence as a paradigm-shifting technology for Defence, and the use of AI in theatres such as Ukraine bears this out. Yet the Department has not behaved as though this is the case; AI is still treated as a novelty or a niche interest rather than something that will soon be a core component of defence systems across the Armed Forces. If UK Defence is to become AI-enabled the ‘say-do gap’ must be closed.
21. recommendation
The Strategic Defence Review represents an opportunity for the Government to modernise UK defence; one way it should do this is by recognising and embracing the ways defence needs to change to reflect the new reality of an AI-enabled world. The Government’s response to the SDR must set out specific actions that will be taken to normalise the use of AI as an enabler across the work of Defence.
22. The Defence Artificial Intelligence Strategy (‘the Strategy’), published in June 2022 by the previous Government, is the core document setting out MOD’s approach to Defence AI.43 At the time of writing the new Government has not indicated that it intends to revise or replace the Strategy. The Strategy says MOD’s vision is that “in terms of AI, we will be the world’s most effective, efficient, trusted and influential Defence organisation for our size.”44 The Strategy sets out four headline objectives for UK Defence, and actions the department will take to achieve each one (see Box 1, overleaf).
23. The Strategy sets direction for the whole of Defence. It states that within six months of publication all Functional Owners, Front Line Commands, Top Level Budgets and Enabling Organisations in Defence should provide a formal response, and that they should either “deliver a stand-alone AI strategy/plan or ensure that AI is prominently covered within other strategies and plans.”45 Two and a half years on from the Strategy’s publication, it is not clear which parts of Defence have responded, and only the Army has published a full Artificial Intelligence Strategy of its own;46 the Navy has produced an ‘Artificial Intelligence Adoption Roadmap’, but this is not in the public domain.47 The Ministry of Defence’s written evidence reiterated that individual parts of Defence are responsible for developing their own AI plans; while it did not provide details of individual plans, it stated that these were “maturing rapidly”.48
Box 1: Headline objectives and actions of the Defence AI Strategy
Objective: Transform Defence into an ‘AI ready’ organisation
Objective: Adopt and exploit AI at pace and scale for Defence advantage
Objective: Strengthen the UK’s defence and security AI ecosystem
Objective: Shape global AI developments to promote security, stability and democratic values
Source: Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022
24. There was broad agreement among those who contributed to the inquiry that the Strategy identified clear priorities, and that these were broadly the right ones.49 However, a number of submissions agreed that the Strategy lacked specific actions or targets for how these ambitions would be met and against which progress could be measured. Palantir Technologies told us “MOD’s challenge is not one of strategic direction or priority-setting–it is one of implementation at the necessary pace.”50 Where the Strategy does include objectives and actions, these lacked detail. Dr Keith Dear said the objectives, “while hard to disagree with … are vague and unclear. It would be hard to judge progress against them, or to say whether the department has succeeded or failed in achieving them.”51
25. A lack of specificity in MOD’s approach was seen to be inhibiting progress on the development of Defence AI.52 KBR and Frazer-Nash Consultancy argued that a “lack of clarity on specific use cases being sought by MOD—by implication, the ‘route to market’—risks disincentivising industry investment.”53 Dr Simona Soare, Professor of Innovation, Technology and Strategy at Lancaster University, noted that MOD had an implementation plan to accompany the Strategy, but that this was not publicly available, remarking that “there is not enough transparency on where that implementation stands today.”54
26. Aspects of the Strategy have been overtaken by advances in AI since its publication. Palantir Technologies wrote that several of the capabilities described in the Strategy as ‘AI next’ were already being deployed both by other militaries and in private sector enterprises, while the use of AI in operational-level planning, described by the Strategy as ‘AI future’, had already been proven in Ukraine a mere 18 months on from the Strategy’s publication.55 Faculty AI commented that rapid developments in Large Language Models since the Strategy’s release “have already rendered some of its key assumptions obsolete.”56
27. The policy environment has also evolved since the Strategy’s publication, with developments including the UK’s hosting of the AI Safety Summit at Bletchley Park and the development of NATO’s DIANA (Defence Innovation Accelerator for the North Atlantic) programme.57 The global picture is changing too, for example with the United States’ publication of a detailed National Security Memorandum on AI in October 2024.58 Some submissions suggested the Strategy should be refreshed to reflect these developments;59 however, Air Marshal Stringer argued that the speed of change in the field was such that “if you were to refresh it, it would always be behind the drag curve.”60
28. conclusion
The Defence AI Strategy sets appropriate ambitions for how the UK will develop and use AI in Defence in the future, but MOD has not set out clear action plans for how these will be achieved and how it will measure its progress towards them. This makes it difficult for industry and investors to gauge the seriousness of MOD’s commitment to AI. In turn, this has a suppressive effect on investor confidence and consequently on the growth of AI capacity in the UK.
29. recommendation
As part of its response to the Strategic Defence Review, MOD must clarify whether the new Government remains committed to the Defence AI Strategy, and provide an update on progress it is making against each of the objectives and actions set out within it. It should set out a clear action plan defining specific actions it will take to further advance each of those objectives and actions and how it will measure and demonstrate success against them.
31. recommendation
In its response to this Report, MOD should provide a detailed update on: a) what it considers to have been the key technological developments in Defence AI since the Defence AI Strategy was published; b) how the use of AI in defence has evolved since the Strategy was published; and c) how the implementation of the Strategy has evolved in response to these changes. By doing this, MOD can demonstrate that it is able to evolve its approach within the parameters of the existing Strategy.
32. One of the Strategy’s objectives is to “Transform Defence into an ‘AI ready’ organisation.”61 The document elaborates on this:
Defence must rapidly transition from an industrial age Joint Force into an agile, Information Age Integrated Force to stay ahead of adversaries amid an increasingly complex and dynamic threat environment. Rapid and systematic adoption of AI—where it is the right solution—will be an essential element in realising this ambition.62 (Original emphasis)
The Strategy sets out several ‘enablers’ that it says are needed “to prepare Defence for widespread AI adoption,” and five questions which organisations and Functions63 within Defence should use to assess their ‘readiness’. (See Box 2)
Box 2: Tests of AI ‘readiness’ for Defence organisations and functions
1. Have you assessed how AI will shape the future of your business or function? Have you identified those areas where AI is the right solution?
2. Do you have the right culture, leadership models, policies and skills to act rapidly on AI-driven outputs?
3. Do you have accessible, structured, exploitable data? Are you continually collecting data?
4. Do you have access to appropriate scalable computing power (with cloud and ‘edge’ computing as required)?
5. Do you have models? Are they fit for purpose and can you build, test, deploy and update them quickly enough?
Source: Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022
The Strategy gives examples of actions that will be taken to make the organisation more ‘AI ready’. Many of these relate to improving Defence AI skills within the organisation, such as: developing a Defence AI Skills Framework; establishing a Head of AI Profession; creating AI career development and progression pathways; using more specialist reserves; providing AI leadership training programmes; and bringing in AI leaders from the private sector. Others relate to ensuring the department has the digital infrastructure and data it needs, such as: delivering Cloud hosting at Secret and Above Secret classification; streamlining data-sharing arrangements with allies, industry and academia; and adopting new processes for structuring and classifying data.64 However, as was noted in the previous section, few of these actions come with clear targets, timelines or measures for success. In a rare case where a target date is specified—MOD’s ambition to deliver Cloud-based hosting at Secret by the end of 2022—this target has not been met.65 Later in the Strategy, the department says it will “explore the mandating of equipment programmes to be ‘AI ready’,” although it does not define what this would mean in practice.66 The Defence Command Paper 2023 said that MOD would “set ambitious targets for ‘AI readiness’ by 2025 across Defence.”67
33. Witnesses said that determining whether Defence as an organisation, or even a particular project, was ‘AI-ready’ was a complex question, but the general view was that there was still more progress to be made. Neil Morphett, Chief Engineer, Advanced Technology at Lockheed Martin Rotary and Mission Systems UK, said:
To your wider question of whether the authority is AI-ready … the answer is probably no. There are bits of it … that are AI literate, have aspirations to get smart on it and are on the journey to do so. There are probably quite a lot of areas where, if it all went away, they would be quite happy about it because then it would be a problem they did not have to deal with.68
When the Sub-Committee took evidence from the then Minister for Defence Procurement, James Cartlidge MP, he acknowledged that there was more progress to be made:
Are we yet at the stage where we can say that, in every aspect of MoD, we are at the highest level of AI preparedness? No. I would like to go a lot further.
He added that “the rate of progress is very good, but I would never be relaxed and say that we are now in a steady state that is entirely satisfactory.”69
34. The former Minister’s comment above illustrates one of the challenges with targeting ‘AI-readiness’ as an ambition: as AI evolves, what it means to be AI-ready will also evolve, and as such readiness is not a fixed state but a moving target. Numerous contributors to the inquiry argued that the most important adaptation MOD needs to make in an age of AI-enabled defence is one of ethos: they took the view that AI will accelerate the pace at which military capabilities are developed, deployed and updated, and this will require a corresponding change in approach from MOD. Adarga wrote that:
The MOD must recognise that the speed at which innovation and technological development take place in the AI sector is rapid and… existing practices for supporting industry are not valid in an era of software-defined warfare.70
Anduril and Palantir echoed the point that in the AI sector product timelines were measured in months, or even weeks, and contrasted this with the much longer timelines typical in defence.71 Air Marshal Stringer explained that this resulted in a very different battlefield environment in which capabilities are rapidly deployed, upgraded and countered:
Once drones, for example, are out there in a battlefield, they have a lifespan of only a few weeks before the opposition learns how to defeat them. Therefore, you have to keep iterating and keep ahead of the game. At the moment, we don’t move quickly enough to play that game, let alone win it.72
He argued that as a result the approach to building capability had to be very different:
The battlefield is the battle lab. The results come back and you iterate forward. That is a different model to writing a spec, then taking years down the CADMID cycle.73
We discuss how defence procurement might adapt to meet these challenges in the next chapter, on Developing Defence AI (Paragraph 48).
35. The Defence AI Strategy, to its credit, recognises the need for cultural change within Defence, and talks about the need to “tolerate increased risk, learn by doing and rapidly reorient to pursue successes and efficiencies.”74 The Strategy does not lay out a clear plan for how this will be achieved. However, the Integrated Procurement Model published in February 2024 sets out some of the ways in which MOD will try to accelerate development, including a new ‘spiral by default’ approach to development that focuses on delivering a Minimum Deployable Capability quickly and improving it iteratively over time.75 This is a welcome development that suggests MOD understands the challenge it must meet; at this stage, however, it is too early to tell whether this will translate into real change.
36. conclusion
Despite the existence of pockets of excellence within MOD, the department is by its own admission not yet ‘AI ready’. This is concerning because AI is no longer something on the horizon but is already here and being deployed on the frontline. The use of automated analysis and uncrewed systems in Ukraine indicates that AI-enabled systems are and will remain essential to defence; the UK therefore has no choice but to embrace AI as a core defence capability. MOD has started this process but has a long way to go.
37. conclusion
AI ‘readiness’ is hard to define because the technology and its use in defence is evolving so rapidly. It is extraordinarily difficult to predict how AI technology will develop during the lifetime of the Defence AI Strategy, and this makes AI readiness a fast-moving target and a process that will never be fully complete. For this reason, rather than aiming to be ‘AI-ready’ it would be more appropriate for MOD to aim to become an ‘AI-native’ organisation, for which AI is a core part of its toolkit and which is able to use AI instinctively to solve problems and meet its objectives.
38. conclusion
The aspects of readiness identified in the Strategy are all valid ambitions but will not result in the effective deployment of AI-enabled defence on their own. To accomplish this MOD also needs to achieve a broader cultural change that goes beyond AI. The organisation needs to become more comfortable with rapid change, more open to experimentation and risk, and more able to deploy and iterate cheaper, disposable, software-led solutions on faster timelines. There is evidence, including in the Defence AI Strategy and the new Integrated Procurement Model, that MOD understands this, but it is not clear that this has yet resulted in real change or that there is a coherent plan to bring about that change across the organisation.
39. recommendation
Becoming ‘AI ready’ must not be treated as a discrete task but rather as one aspect of a broader transformation of MOD’s culture to become more innovative and adaptable in response to a faster-paced and more complex defence environment. The department should consider this cultural change a necessary condition not just for using AI effectively but for achieving Defence’s aims in modern conflict. MOD must urgently accelerate this transformation.
40. AI has the potential to transform a wide range of functions across many parts of Defence and the governance arrangements for Defence AI reflect this. The Defence AI Strategy explains that “There is no single overall owner of AI in Defence. Every business unit and Function has an important role to play if we are to achieve our goals.”76 At the same time, the Government has created specific units and bodies with an AI focus: the Defence AI and Autonomy Unit (DAU) was set up in 2018 to help MOD adopt AI,77 and in 2020 the Defence AI Centre (DAIC) was announced by the then Prime Minister Boris Johnson.78 The Defence AI Strategy describes their respective responsibilities:
Overall strategic coherence is managed jointly by the Defence AI and Autonomy Unit (DAU) and the Defence AI Centre (DAIC). The DAU sets strategic policy frameworks governing development, adoption and use of AI. The DAIC is the focal point for AI R&D and technical issues.79
The Strategy also outlines roles for other parts of Defence. For example, Strategic Command is responsible for ensuring “strategic and operational integration across the warfighting domains.”80
41. Witnesses described a lack of coordination between different parts of Defence, and said the relationship between the Front Line Commands (FLCs) and the Centre was not as clear as it could be. techUK noted that the FLCs have AI centres of their own, and said it was unclear what their relationship with DAIC was and whether DAIC had authority to mandate particular approaches across the services.81 A recent Joint Service Publication (JSP) aims to direct consistent principles and practice for how AI is developed and used across Defence, but it remains to be seen whether this will help to achieve greater coherence.82 Several witnesses believed thinking on AI could be better joined up: James Black of RAND Europe commented that “sometimes the Navy, the Army and the Air Force will be looking at a similar problem, but approaching it separately.”83
42. Of the bodies with responsibility for Defence AI, the Defence AI Centre has the clearest remit for developing the UK AI sector. The Strategy says the three main functions of DAIC are to:
The DAIC is responsible for, among other things, engagement and interchange between Defence, academia and the tech sector,85 the development of technical handbooks for AI practitioners working in defence,86 and an AI Concept Playbook that the Strategy says will give direction to research and development.87 In its submission MOD gave examples of activities undertaken by DAIC, including hosting a DAIC Connect event aimed at helping SMEs learn about working with defence, as well as other engagements with industry and academia.88
43. Written evidence and witnesses said the DAIC was a welcome initiative.89 Phil Morris of Palantir Technologies believed it had “a significant part to play in the delivery of AI capabilities to the personnel on the frontlines,”90 adding that it had “made significant strides over the first couple of years of existence.”91 techUK told us the DAIC “is communicating clearly AI priorities and initiatives.” At the same time, they noted there was still some confusion about the exact role of the DAIC, and how it relates to the Front-Line Commands’ own AI centres.92
44. Witnesses discussed whether there was sufficiently senior leadership to drive adoption of AI—including the necessary cultural change referred to earlier in this chapter—across Defence. The DAIC’s ability to exert influence across Defence is uncertain: the head of the DAIC is a one-star officer,93 and witnesses were unsure, when asked, where the DAIC was based or whether it in fact had a physical presence at all.94 Dr Simon Harwood, Director of Capability and Chief Technology Officer at Leonardo UK, agreed with the characterisation of Defence AI leadership as ‘atomised’,95 while KBR and Frazer-Nash Consultancy recommended that MOD appoint a senior, named official to drive coherence across Defence.96 There were, however, differing views on whether this was the best approach. Air Marshal Edward Stringer said: “if you have an SRO just for AI per se, there would be a lot of conferences and talk about the abstract, but you would not actually be driving to get things to work.” He suggested instead:
On having an SRO for AI, that should really be the Chief of the Defence Staff. They should ensure that all three services are building to a common vision and will come together in a way that creates a warfighting ecosystem that merges with the physical.97
There was a recognised tension between the goals of providing a central impetus for adopting AI and empowering all parts of defence to use initiative in pursuit of innovation. Lt Gen. Tom Copinger-Symes described it as “a balance between moving quickly on your own and going a long way together.”98
45. conclusion
Responsibility for AI across Defence is diffuse, and as a result Defence’s approach is not as joined up as it should be. The creation of the Defence AI Centre as a focal point for industry engagement and development is welcome, but it has limited scope to influence how AI is adopted across Defence, as demonstrated by the fact that the head of the Defence AI Centre is a one-star officer. The kind of change required needs to be driven by senior leaders at the top of the Defence organisation. Driving adoption of AI is not an optional extra but a task for all parts of the organisation, and as such should be a responsibility for the established leadership of MOD.
46. recommendation
In view of the importance of AI to so many aspects of defence, the Chief of the Defence Staff should have lead responsibility for Armed Forces transformation, a major part of which will be embedding the use of Artificial Intelligence in the work of MOD and the Armed Forces and bringing about the cultural change necessary to make the use of AI ‘business as usual’ in defence. Complementing this, Strategic Command should be responsible for ensuring defence AI develops in a coherent way across defence, so that AI is interoperable across the services.
47. recommendation
The Defence AI Centre can be a tool for knowledge exchange, responsible for mainstreaming AI and sharing innovation and best practice across the Armed Forces and between industry, the research sector and MOD. To do this it needs to be visible and accessible. The DAIC should have its own premises, so that it can act as a physical focal point for collaboration on Defence AI. The DAIC should host attachments and coordinate placements with industry, with the aim of developing individuals’ AI fluency and expertise that they can then take back to their own branch of the Forces.
48. AI is developed in a very different way to traditional defence systems, and the increasing use of AI in Defence will therefore require systems to be designed, built and maintained in new ways. MOD will need to change its approach accordingly. This is part of the cultural shift referred to in the section on ‘AI readiness’ in the previous chapter (Paragraph 32). Two of the main challenges for MOD that witnesses identified relate to the way MOD procures equipment and how it incentivises innovation. We explore these challenges in the following sections.
49. Several witnesses highlighted that the way MOD procures defence equipment is not suited to procuring AI, because software development timelines are considerably shorter and the product is developed according to a very different cycle than is ordinarily the case in defence.99 Air Marshal Stringer described what he saw as the difference between Defence’s traditional approach to development and that of software companies:
We are still configured to go to big primes and beat out a contract, and put in thousands of pages all the cardinal points. Then our lawyers and their lawyers will argue over the 20 years of the programme, before a big bit of plant is delivered. Software companies do not work that way. They use the minimum viable product method. They want to work with you on a problem, get something that works, get it out there, test it, get the data back and keep evolving it.100
James Black of RAND Europe noted that the fact that AI could be developed and updated quickly changes not just how Defence procures equipment, but also what it seeks to buy in the first place:
You are no longer buying a big lump of metal—a ship, a tank, an aircraft or whatever—and doing it in a 10, 15-year timeframe and having most of the value wrapped up in hardware. You are instead now looking at a much more rapidly evolving force of unmanned systems, as well as crewed systems, much more emphasis on connectivity data, AI, et cetera, and then consequently on software.101
He added that as software becomes a growing part of defence platforms, it will change how the accompanying hardware is designed:
You are no longer buying an aircraft and calling it your capability once you have stuck a pilot in it, got a runway and all these other things. You are instead thinking about how you can provide certain effects over the life of a programme, and how the hardware and software can be designed, configured, upgraded and refreshed over time so that the aircraft’s capabilities in year one versus year 10 are radically different. You design the hardware to accommodate that, so you can stick modules in and out and change things over time relatively easily, and you design it much more around the software.102
50. A practical example of how AI might influence the design of a platform is the Global Combat Air Programme (GCAP). During our inquiry on Future Aviation Capabilities we asked Richard Berthon, Director Future Combat Air at MOD, how the programme would adapt to rapid and unpredictable improvements in technology during its lifespan. He responded that the programme had been designed with open system architecture to enable new capabilities to be ‘plugged in’, making it “significantly easier to integrate additional new technologies down the line.”103
51. Witnesses told us that as software comes to constitute a greater component of defence systems this would change MOD’s relationship with its suppliers. As James Black explained:
It is not that they are providing you with a vehicle, a missile or whatever, and then they hand over the keys and, although they provide rolling maintenance, you largely own and operate that capability, so it is more of a transactional relationship. When you start talking about software delivery over time, particularly as a service, you are talking about a much more intimate relationship between the user and the designers. You are providing constant feedback, data and so on, so it is much more enmeshed.104
The new relationship dynamics may require contracts to work differently. Mr Black gave the example of vendor lock-in, which he said might be addressed by promoting open systems architecture in system designs. Neil Morphett from Lockheed Martin Rotary and Mission Systems UK highlighted the challenge of managing intellectual property rights through the commercial relationship with a supplier.105
52. AI is a general-purpose technology, and this means many of Defence’s suppliers in the future may be companies who do not specialise in defence but who sell their dual-use products to military and civilian clients alike. These are likely to include start-ups who are newcomers to the Defence market. Witnesses told us that this could change the relationship MOD has with its suppliers. James Black of RAND Europe explained that MOD has a significant amount of influence and leverage in its relationship with primes because it is their primary customer, but in a relationship with a supplier for whom Defence is just one of many clients that influence would be reduced.106 Witnesses told us that MOD would therefore need to work to be a more appealing customer. At present the department was considered “very difficult” and slow to work with;107 Dr Simona Soare pointed out that start-ups need influxes of work or cash to stay afloat, and that MOD often moved too slowly to provide them with these opportunities.108
53. Some witnesses said the experience of smaller and non-traditional defence suppliers of working with defence could be improved through partnerships between those companies and established defence primes.109 Dr Simona Soare said there were already a number of interesting examples of partnerships between primes and market newcomers.110 Air Marshal Stringer said that primes could help newcomers to scale new technologies in a way that the smaller company might find challenging on their own.111 Faculty AI’s submission saw benefits in an approach where “Primes and SMEs collaborate in ‘Rainbow teams’ to bring their advantages to bear: mass production and knowledge of MOD processes from the former, and cutting-edge AI/ML approaches from the latter.”112 Dr Simon Harwood of Leonardo UK noted that there was no strategy for what the relationship between primes and SMEs should look like.113
54. In December 2024 the Government published a Statement of Intent for its Defence Industrial Strategy. The Statement identifies working with non-traditional SMEs, including technology companies, as a priority.114
55. In February 2024 MOD published a new Integrated Procurement Model,115 one of the objectives of which is to drive greater pace in procurement. In the document’s introduction the then Minister for Defence Procurement, James Cartlidge MP, said it is “imperative that we drive pace in delivery of military capability to our Armed Forces—making every day count—so that we stay ahead of our adversaries.”116 The new model seeks to achieve this by implementing a ‘spiral by default’ approach. The department says it will focus on delivering a “minimum deployable capability” which can be improved over time, and will develop “new commercial pathways to increase speed and value for money.”117 The new Government has not confirmed whether this remains its approach.
56. The document says the spiral approach will be accompanied by a “cultural shift to put greater value on pace”, which will include “encouraging our people to use their professional judgement to put greater value on pace.”118 MOD says the model will be implemented by the end of 2025, but acknowledges that cultural change will take longer to achieve.119 The recent Statement of Intent for the Defence Industrial Strategy reiterates the aim of “dramatically increas[ing] the pace in procurement.”120 A recent Joint Service Publication, ‘Dependable Artificial Intelligence in Defence’, attempts to set out consistent practice for through-life management of AI; while this gives detailed guidance on responsible risk management, there is little mention of pace to suggest the sort of cultural shift imagined in the Integrated Procurement Model.121
57. Alongside wider procurement reform, we heard that the department’s Commercial X team had enjoyed some success in applying different approaches to procurement. Phil Morris of Palantir Technologies said that his company’s experience of working with Commercial X was that they were “less risk-averse” and “more agile.”122 Faculty AI said that the team was a “welcome innovation.”123
58. conclusion
As AI becomes an increasingly important component of defence systems, this will change the way those systems are designed, built and maintained. MOD’s acquisition processes will need to evolve in response. The Integrated Procurement Model’s commitment to spiral development is an encouraging sign that MOD grasps the nature of this challenge. However, it should not underestimate the scale of the change this will require both in terms of its processes and its culture. If the IPM is properly implemented, one of the benefits we would expect to see is more regular development and deployment of AI-enabled defence systems. Work to deliver the spiral development aspects of the IPM must begin now and with urgency.
59. recommendation
In its response to this report, MOD should explain how the Integrated Procurement Model will bring about more regular development and deployment of AI-enabled defence systems. The Model should ensure that it becomes standard MOD procurement practice to consider how AI might deliver or enhance a given defence capability. The Model promises that decisions on major programmes will be informed by independent advice at the pre-concept phase; this advice should include an assessment of the project’s AI-readiness and how it could be AI-enabled. Engagement with industry on the new procurement model should extend beyond the traditional defence sector so that it reaches developers whose software could have dual-use potential but who may not have considered working with Defence before.
60. A number of contributions to the inquiry discussed the importance of having the right incentives to support innovation.124 Dr Keith Dear wrote that while the UK has the talent and opportunity to build successful Defence AI companies, “What is missing at the moment are clear incentives to mobilise this latent capability.”125 Witnesses suggested a number of ways in which MOD could better incentivise innovation.
61. The evidence suggests that MOD could be clearer about what it sees as the use cases for AI. Witnesses said that a clear ‘pipeline’ was needed to send a demand signal to industry, to attract capital investment and help industry understand the skills it would need to invest in for the future.126 The Defence AI Strategy says the department will publish an “AI Concept Playbook … sparking innovation by setting out the key application where we see most benefit from AI adoption.”127 The first Playbook was published in February 2024: it identifies six ‘problem spaces’ in which MOD sees opportunities for AI to be used in Defence, and provides case studies for how AI is being developed or used in each.128 However, in our view the Playbook does not contain the kind of detail that is likely to give investors confidence to make investment decisions.
62. MOD has a tendency to over-specify technical requirements at an early stage, and so discourage suppliers from finding creative solutions to problems; this can prevent companies from using AI to solve problems in innovative ways.129 Dr Keith Dear suggested that MOD should instead ‘contract for outcomes’, i.e. the effect it wants to achieve, rather than specifying the details of the platform itself; he argued this would encourage companies to use AI to achieve the outcome where it could deliver an advantage over rival approaches.130
63. Witnesses explained that one of the differences between Defence’s approach to product development and that of the software industry was their respective attitudes to risk. James Black of RAND Europe pointed out that there are good reasons—including safety and political concerns—for Defence to be comparatively risk-averse,131 but at the same time a reluctance to try new things could result in the UK’s military advantage slowly diminishing over time. He contrasted this with Ukraine’s situation, where the existential danger of Russia’s invasion has changed the risk calculation and heavily incentivised experimentation, with the result that more innovative ideas were being tried.132 Dr Simona Soare said that the present attitude in UK Defence was one which sought to avoid ‘regret’, i.e. procurements where things go wrong; she said that Defence ought to recognise that time wasted through excessive caution also resulted in lost military capability.133
64. Witnesses said that while MOD was good at small-scale experimentation, it often struggled to translate successful innovations into fielded and scaled capability.134 MOD’s Defence and Security Accelerator (DASA) is designed to catalyse innovation in defence, but we heard that ‘pull-through’ is lacking: very few of the small defence tech companies that have received DASA grants have gone on to achieve larger commercial scale, and in any case DASA’s budget is comparatively small.135 Dr Keith Dear wrote that Defence should act as a “first customer to help companies get over the valley of death in innovation.”136
65. conclusion
MOD has not provided a clear enough signal to industry of the kind of Defence AI it wishes to acquire, or a clear route to market for smaller companies who might want to sell to the department. This has made engaging with defence an uncertain proposition for smaller AI companies and has resulted in limited appetite from investors.
66. recommendation
MOD should aim to cultivate a more dynamic ecosystem of smaller AI companies working in Defence. It should do this by:
67. conclusion
One obstacle to the adoption of AI by Defence is the culture clash between Defence and the tech industry. Tech’s ‘fail fast’ mindset and model of rapid, iterative development is not something that comes naturally to Defence, but this is something MOD needs to adapt to as AI’s importance grows. The ‘cultural shift’ outlined in the Integrated Procurement Model suggests that MOD understands the need for cultural change, but the department acknowledges that achieving this will be difficult.
68. recommendation
In its response to this report, MOD should set out specific actions it will take to achieve the ‘cultural shift’ envisaged in the Integrated Procurement Model so that decision-makers feel more empowered to pursue innovative solutions, to take calculated risks, and to acknowledge and learn from failures.
69. conclusion
Smaller companies told us there were relatively few opportunities in the UK’s Defence AI ecosystem for smaller defence suppliers to win the kind of contracts that would allow them to expand and attract investment. While MOD is good at supporting experimentation and the production of prototypes, it struggles to translate these into fielded and scaled capability.
70. recommendation
MOD should establish a fund within its existing procurement budget for the specific purpose of scaling successful defence innovations, including those which use AI, to help the most successful prototypes translate into fielded and scaled capabilities.
71. recommendation
In recognition of the fact that many of the AI companies who might provide services to defence are smaller than traditional defence suppliers, MOD should design its competitions and contracting processes in such a way as to minimise administrative barriers to entry, to make competition for defence contracts as accessible as possible to outsiders and challengers. We heard that Commercial X has enjoyed significant success in simplifying commercial processes for smaller businesses, and these lessons should be drawn on when tendering for Defence AI.
72. Some of the fundamental challenges of developing Defence AI capacity relate to establishing the technical apparatus for AI development and deployment. AI requires significant computing power (compute), and training AI models depends on having large data sets. These challenges are amplified by the security requirements of Defence.
73. The lack of suitable secure digital infrastructure was seen as a comparative weakness for the UK in its efforts to develop and deploy Defence AI.137 Adarga described a “lack of modern infrastructure [that is] a challenge when working with MOD.”138 Tortoise’s AI Index ranks the UK’s infrastructure 17th of the 62 countries assessed, compared with its top ten rankings in talent, research and commercial environment.139 RAND Europe wrote in their evidence that the UK ranks tenth globally in terms of achieved computing performance.140 Secure cloud computing for Secret and above data was seen as a priority by a number of those who responded to our inquiry,141 while others highlighted the increased demand for compute, not just from Defence but from across government and the wider economy, that would result from the widespread deployment of AI.142
74. MOD recognises the importance of digital infrastructure, and since 2021 has had a Digital Strategy for Defence that pledges the creation of a Digital Backbone including “Hyperscale Cloud for Defence.”143 The Defence AI Strategy is clear that “Fast, scalable and secure compute power and seamless network infrastructures are critical to fully exploit AI across Defence,” and on cloud specifically adds that “Cloud hosting at multiple classifications is essential to provide the scalable compute needed for our AI models.” The Defence AI Strategy says that MOD will deliver clouds at Secret and Above Secret by 2022 and 2024, respectively.144 In addition, the UK is investing in new supercomputer capacity,145 although in August 2024 the Government took the decision to cancel £1.3 billion of funding for AI projects.146
75. MOD has found some of its digital aspirations challenging to meet in practice. A 2023 report by the Public Accounts Committee concluded that “The Department is struggling to deliver its largest, most transformational digital programmes” and noted that MOD does not have a delivery plan that allows it to measure progress towards implementing its Digital Strategy.147 The target for Secret cloud given in the Defence AI Strategy was ultimately not met; Palantir’s evidence said that the new target date was the end of 2024. The Second Permanent Secretary told us that “Secret on-premises cloud” had been operational since early 2023, but did not say what benefits this would offer to AI development specifically.148 Faculty AI wrote that “the in service date of the Digital Backbone remains crucial,” and that “an update on its completion date is vital if UK AI SMEs are to put in place realistic expansions, hiring and training plans.”149
76. Some of our witnesses said the UK needed to consider the security of the supply chains needed to develop digital infrastructure. Dr Simona Soare, Professor of Innovation, Technology and Strategy at Lancaster University, told the Sub-Committee the UK “depends significantly on other, potentially adversarial countries” for critical mineral-heavy elements.150 James Black of RAND Europe noted that because compute is often shared across borders, questions of sovereignty and security are more complicated and the UK should be careful not to become dependent on other countries “which may or may not be reliable providers in the future.”151
77. conclusion
A mature defence AI ecosystem will require improved digital infrastructure, including cloud networks that enable developers to work securely with classified data and an increase in overall compute. MOD has set out the right ambitions in the Defence Digital Strategy, but realising these has proved difficult and the department’s plans for a ‘Digital Backbone’ are behind schedule.
78. recommendation
MOD should undertake a mapping exercise to assess the adequacy and the resilience of its digital ecosystem. This should consider, but not be limited to: computing power; secure cloud computing; data centres; availability of semiconductors; quantum computing capacity; and frontier AI models. On the basis of this exercise, the department should identify weaknesses that might constrain the development of Defence AI.
79. recommendation
In its response to this Report, MOD should provide an update on delivery of its ‘Digital Backbone’, explaining how it is measuring progress and providing an estimated completion date.
80. Witnesses emphasised that developing Defence AI required effective collection, management and sharing of data with which AI models can be trained.152 Scale AI’s evidence noted that while Defence generates and collects large amounts of data as a matter of routine, a lot of this data is not ‘AI-ready’ and so much of it cannot be effectively used.153 Complicating matters further is the fact that a lot of defence data which could be useful for training AI models is held not by MOD itself but by contractors or by allies, and is therefore not always easily identifiable or shareable.
81. The sensitive nature of defence data is an obstacle to its use for training AI models, as it demands systems that can store and share that data securely and requires individuals who are given access to that data to be appropriately security cleared. This is a challenge when MOD works with smaller or non-traditional defence suppliers, who may find it more difficult than established defence companies to meet defence’s stringent security requirements and for whom getting security clearance for staff might be time-consuming. Andrew van der Lem of Faculty AI said “On security clearance for individuals, if that cannot be done quickly, you cannot have a non-traditional supplier provide a solution. It is as simple as that.”154 Faculty told us in their written evidence that they had recently seen improvements in this regard, and that “this has removed a significant source of friction for us when building engineering capacity.”155 However, they said difficulties remained in certifying the security of SMEs’ IT systems, something which they said was “a key blocker when trying to access even moderately sensitive data.”156
82. Creating useful datasets is a further challenge. Many types of machine learning require large datasets which have been properly curated and labelled, but this can be a costly and time-consuming process which requires SC-cleared157 individuals to do the labelling.158 Faculty suggested that Reservists could perform this task for Official Sensitive data.159 An alternative approach is to create synthetic datasets—bodies of artificial but representative data that simulate a real dataset—for when the risk of using real data is considered too great.160 Dr Simon Harwood observed that such datasets will only be useful if they are of sufficient quality.161
83. MOD recognises the importance of data and the need for Defence to improve the way it manages data. The department published a Data Strategy for Defence in 2021 which identified four Strategic Outcomes to be achieved by 2025.162 (See Box 3.) The Defence AI Strategy describes data as a “critical strategic asset,” but acknowledges that “our vast data resources are too often stove-piped, badly curated, undervalued and even discarded.” The Strategy says that each part of Defence will be required to maintain a data strategy, adopt new processes to help standardise and structure data, and explore ways of streamlining data-sharing bureaucracy to make data more accessible to and from allies, industry and academia.163 The Strategy also says that MOD will explore the creation of synthetic datasets, managed by the Defence AI Centre, in cases where widespread labelling and sharing of data is impossible or inappropriate.164
Box 3: Data Strategy for Defence: four Strategic Outcomes
1. Data is curated, integrated and human and machine ready for exploitation
2. Data is treated as the second most important asset only behind our people
3. Our people are skilled and exploiting data to drive advantage
4. Defence are data leaders with partners, allies and industry
Source: Ministry of Defence, Data Strategy for Defence, 27 September 2021
84. conclusion
Defence collects masses of data that could potentially be used to train defence AI models, but this data is not always easily accessible to those developing defence software. To some extent this is an inevitable challenge when working with sensitive data of the kind Defence holds, but it is a challenge that must be overcome if the UK is to apply AI to defence problems. Even at lower or unclassified levels of data, there is enormous unrealised potential in the data held by Defence that could be unlocked by improving the way data is collected, labelled and shared. The 2021 Data Strategy for Defence shows that MOD has diagnosed the problem and has the right ambitions, but our evidence shows that many of the issues identified in the Data Strategy persist.
85. recommendation
MOD should confirm whether it still intends to fully deliver the Data Strategy for Defence. By July 2025, MOD should provide us with a progress report on each of the Strategic Outcomes identified in the Data Strategy for Defence. It should confirm whether each of these outcomes was fulfilled by 2025, as was set out in the Strategy, and if this is not the case explain the reasons for the delay and confirm new dates for their fulfilment.
86. recommendation
MOD should address the barriers which currently prevent smaller AI companies from using defence data and so inhibit them from bringing value to defence. It should do this by overcoming the delays companies face in obtaining security clearance for their staff to access such data, designing tenders that enable use of open-source data where appropriate, and making available synthetic datasets for tenders when security demands.
87. Defence faces challenges in recruiting Suitably Qualified and Experienced Persons (SQEPs) to work on Defence AI. The Sub-Committee heard that a lack of suitably skilled staff was an obstacle to scaling up production in the digital domain.165 Many Defence industry roles require STEM qualifications and expertise which are in short supply, and so this is not a problem unique to AI; however, some aspects of the labour market in AI mean the challenges are different.
88. The Sub-Committee heard that the Ministry of Defence was not clear about what its skills needs were in this area.166 Understanding the skills needs of the sector is made more complicated by the rapid changes occurring in the field of AI, which mean the skills MOD needs in the medium or long term might not be the same as those required today.
89. The UK has a strong academic pedigree in computer science, and is home to some of the world’s leading research universities in the field of AI.167 However, Professor Toby Breckon, Professor of Computer Vision at Durham University, observed that “only a relatively small number of UK computer science (software) graduates enter the UK defence sector.” He explained:
the reward, flexibility and freedoms afforded to AI researchers in academia and (non-defence) industry research is often in sharp contrast to that on offer from the defence sector itself. My postgraduate qualified students often land well-paid industry jobs, based in city-centre business hubs with flexible working and the freedom to continue to publish their work, often focussed on fundamental AI research topics, in the open scientific literature.168
The AI workforce is also highly mobile and international, presenting multiple challenges for Defence. Not only does UK Defence face more competition for the most talented minds,169 but many of the people working on or studying AI in the UK at present are not UK nationals, which can create issues when working on sensitive projects.170
90. One of the challenges MOD faces in recruiting talented AI professionals is that AI skills are in high demand in many other sectors of the economy, some of which can offer substantially higher pay than Defence.171 Rules on public sector pay mean that MOD is restricted in the salaries it can offer, which witnesses agreed were well below the market rate.172 Giving evidence to the House of Lords Committee on AI in Weapon Systems last year, Professor Sir Lawrence Freedman, Emeritus Professor of War Studies at King’s College London, said that “some of the very important technology jobs in the Civil Service are going at ludicrous salaries at the moment that would require people to almost halve their pay if they were coming from the private sector.”173 Even outside of Government, we heard that salaries in defence companies were not competitive because price competition for defence contracts tends to drive down pay.174 Witnesses said that Defence offered “non-financial drivers” that partly compensate for the salary gap, such as a sense of mission and the opportunity to work on exciting projects.175 At the same time, Dr Simon Harwood of Leonardo UK explained that the classified nature of some of Defence’s work made it harder to sell as a career.176
91. MOD acknowledges that pay is a challenge, and the Defence AI Strategy says the department will develop options for an “AI pay premium to incentivise recruitment, upskilling and retention.”177 The Second Permanent Secretary told us that since the Strategy was published MOD had implemented a pay framework on data, digital and technology that he said offers “the median private sector rate” and could enable Grade 6, Grade 7 and SEO staff working in MOD to earn up to an extra £18,000 per year.178
92. Several witnesses advocated the use of flexible, non-linear career structures that would help MOD bring on board individuals with a talent for AI and create more appropriate career paths for AI practitioners. Air Marshal Edward Stringer said the model of “recruiting someone as a 16-year-old and get[ting] them to stay for 20 years doing nothing but sitting on military pay while becoming a global expert in AI … is not going to work.”179 Dr Simona Soare agreed, and suggested that instead Defence should seek to bring in specialists from the commercial sector for periods of 4–6 years.180 Neil Morphett and Dr Simon Harwood saw merit in the use of secondments or so-called ‘zig-zag’ careers, although Dr Harwood acknowledged that implementing this had been tried in the past and had proved difficult to achieve.181 Other ideas that were shared during the course of the inquiry included making greater use of Reservists with AI skills.182 MOD said that it was already recruiting individuals with digital skills into the Reserves and making use of existing Reservists with specialist digital skills.183
93. The Haythornthwaite Review, which examined career incentives in the Armed Forces, acknowledged the difficulties the Forces face in recruiting digital skills.184 It also recognised the merits of greater career flexibility in helping people develop skills over their career while also filling skills gaps in Defence. The Review discusses how to enable the ‘zig-zag’ career, and how to move away from linear rank-based progression to skills-based career pathways; digital skills are specifically identified as an area where such new approaches could be trialled.185 Perhaps most relevant to AI, it recommends that individuals should be given opportunities for temporary or permanent career moves outside the Service.186 Lt Gen. Tom Copinger-Symes described how MOD’s approach had been informed by the Haythornthwaite Review, but acknowledged that these changes could be moving more quickly.187
94. In addition to the actions mentioned above, the Defence AI Strategy says MOD will appoint a Head of AI Profession based within the Defence AI Centre, and will produce a Defence AI Skills Framework identifying skills needs across Defence.188 MOD said a Capability Lead for AI Talent and Skills was appointed in June 2024, and that the development of a Defence AI Skills Framework is underway.189 The Strategy also commits to developing AI career development and progression pathways and using more specialist reserves.190
95. The increasing use of AI across Defence will not only increase demand for technical specialists within industry, but will require people across the defence organisation to get used to working with AI and using it as a tool. The Sub-Committee heard that these individuals did not necessarily need to be technical experts in AI but should “have at least the basic AI literacy to understand enough about the technology to understand how it may be disruptive or beneficial to their day-to-day lives and the work that they do.”191
96. The evidence pointed to knowledge gaps within the UK’s AI ecosystem: Professor Toby Breckon wrote that “AI expertise across traditional public and private sector organisations within the UK defence sector lags significantly behind academia and the broader UK AI industry.”192 James Black and Air Marshal Stringer said that while there were knowledgeable individuals in MOD at all levels of the organisation, they tended to be “pocket experts” working in siloed areas.193 RAND Europe said that AI technical knowledge is most concentrated in the Defence Science and Technology Laboratory (Dstl) and the Defence AI Centre, whereas decision-making power for acquisition sits elsewhere, with Senior Responsible Owners within the Front Line Commands.194
97. The Defence AI Strategy, as part of its objective of transforming Defence into an AI-ready organisation, says the department will mandate that senior leaders across Defence have a “foundational and strategic understanding of AI and the implications for their organisation” while providing training opportunities to develop AI leadership skills throughout the organisation. It also says MOD will upskill its workforce to “generate an informed community of AI users across our workforce.”195 The Second Permanent Secretary said around forty of the MOD’s senior leaders had taken part in a digital leaders learning programme which included an AI element, although at the time this had amounted to only a single session.196
98. conclusion
Recruiting and retaining the brightest minds in AI to work on defence problems is a challenge. Recruitment difficulties exist across STEM disciplines, but the market for AI skills differs in such a way that meeting the challenge requires a targeted approach.
99. conclusion
There is very limited information available on what MOD believes its AI skills needs are; it is therefore difficult to know how MOD understands the skills challenge it faces.
100. recommendation
MOD should carry out a mapping exercise to establish what skills it believes the UK Defence AI sector needs and where shortfalls currently exist. This exercise should also consider how the sector’s skills needs might change in the future as AI evolves.
101. recommendation
As was promised in the Defence AI Strategy, the Ministry of Defence should devise a strategy for supporting individuals with an AI specialism to develop a career in defence. The plan should create viable career paths for AI specialists within MOD and the Armed Forces, and should also establish schemes to enable AI experts to move between the defence and civilian sectors flexibly, including industry placements with defence of the kind used by the National Cyber Security Centre. MOD should also develop recruitment campaigns targeting AI professionals, both for full-time and specialist reservist roles.
102. conclusion
As AI becomes a core part of Defence’s functions, MOD will require not just AI specialists but greater AI-fluency throughout the organisation, and particularly among its leadership. MOD recognises this and has begun to offer AI training, but at the moment this is very limited and does not reflect the importance of AI to Defence.
103. recommendation
MOD should increase the proportion of the Professional Military Education curriculum dedicated to AI, so that current and future defence leaders improve their understanding of AI’s impact on defence and learn how to better exploit it.
104. The UK will not develop Defence Artificial Intelligence in isolation; AI is a global sector in which research and development happens across international borders. Equally, many of the UK’s defence and security objectives are not met alone but through cooperation with allies, including NATO, Five Eyes, and the AUKUS partnership (discussed later in this chapter). As AI becomes more and more important to defence, the UK’s ability to work with its allies will depend on the interoperability of their respective AI; for example, interoperable systems could enable more effective operational collaboration and intelligence sharing. Interoperability could also bring benefits for AI development: a recent RAND report noted that common approaches to AI between allies could expand the pool of data available on which AI models can be trained.197 James Black of RAND Europe explained that the impact of AI was maximised by joining up systems to achieve scale.198
105. Witnesses said there were merits to thinking about interoperability of AI systems at an early stage in the global development of Defence AI. Neil Morphett, Chief Engineer, Advanced Technology at Lockheed Martin Rotary and Mission Systems UK, said there was a history of the UK developing its own defence platforms, only to end up adopting allies’ standards because of the need to be compatible with US, NATO or European platforms. He argued it would be better to develop common architectures with allies at an early stage.199 Mr Morphett said he did not think there were sufficient multilateral discussions about developing common AI architectures, although added there were “lots of bilateral discussions going on.” He said that the AUKUS framework had “vast potential to get those countries on a common framework.”200
106. Other countries are independently developing their own AI policies at the same time as the UK. The extent to which the UK and its allies can develop interoperable AI will be affected by their respective policy approaches. If the UK and its allies pursue different policy aims or adopt very different practices from each other then it will be more challenging to make their systems interoperable. As a key ally of the UK, the United States’ approach is of particular interest when considering interoperability and the compatibility of our policy approaches. The Biden administration recently published a detailed National Security Memorandum on Artificial Intelligence. US policy will no doubt continue to develop under the incoming administration.201
107. conclusion
The UK’s AI-enabled defence systems will be most effective if they are interoperable with those of its allies. This will require dialogue between allies to ensure that systems are designed with compatibility in mind. Interoperability will be more easily achieved if the UK and its allies have a mutual understanding of their objectives in developing and deploying defence AI.
108. recommendation
The Government should convene a programme of discussions with the UK’s AUKUS and NATO allies aimed at reaching a mutual understanding of each other’s AI objectives and strategies. These discussions should also explore reaching, where possible, common positions and standards on how AUKUS and NATO allies will develop and deploy AI in defence. These could include common approaches to data collection and labelling, a shared approach to ethical use of autonomous technologies, and joint working on skills and capacity building.
109. The AUKUS partnership between the UK, Australia and the United States is a potential vehicle for AI collaboration between the three countries. Announced in 2021, AUKUS consists of two ‘pillars’, the second of which is a commitment to collaborate on a range of ‘advanced capabilities’ including autonomy and AI.202 Giving evidence to the Sub-Committee, the former Minister for Defence Procurement described AUKUS as “an incredible opportunity for UK AI.”203 The new Government has declared its continued commitment to AUKUS,204 and has appointed an AUKUS Adviser, Sir Stephen Lovegrove, to “reinforce the progress and benefits of the AUKUS programme.”205 Sir Stephen was due to report his findings to the Government in November 2024, but at the time of writing his report has not been published.206
110. Collaboration on AI is already taking place within the AUKUS framework. MOD said work was underway in areas including force protection, precision targeting, and intelligence, surveillance and reconnaissance; they added that the partnership’s AI and Autonomy Working Group had already delivered multiple AUKUS trials with support from UK SMEs.207 In May 2023 the MOD publicised a “world first” trial which deployed UK, Australian and US assets in a “collaborative swarm” using AI.208 In August 2024, the AUKUS partners ran an exercise in which autonomous drones from the three countries worked in tandem to identify targets; this was reportedly the first use by the UK and its allies of autonomy and AI sensing systems in a real-time military environment.209
111. Evidence submissions were generally optimistic about the potential for AUKUS to be a catalyst for the development of the UK’s AI sector. ADS Group described it as a “generational opportunity for the UK’s defence AI industry to become a global leader, and to collaborate with our partners to develop the next generation of technologies.”210 RAND Europe wrote that AUKUS was “a natural vehicle to advance the UK’s AI ambitions through international collaboration” and could help to promote interoperability and joint capabilities.211 In a paper on AUKUS collaboration, RAND recommended the creation of multinational research and development centres to catalyse this process by enabling the three nations to develop their Research and Development capabilities in partnership.212
112. Another potential advantage of AUKUS is that it could enable the development of common approaches and standards which could simplify collaboration between the partner countries’ AI sectors. Adarga proposed that common AUKUS standards should be devised and shared with developers to encourage companies to develop products which can be used by all three militaries.213 This could not only broaden the customer base for products, but make interoperability between partner nations easier. Conversely, the risk of failing to develop consistent standards is that collaboration is hampered by lock-in effects,214 and that allies’ systems are not compatible.215 Faculty said the UK Government could support the UK AI sector by influencing the development of AUKUS standards so that UK companies were not unfairly disadvantaged when compared with their US or Australian counterparts.216 KBR and Frazer-Nash recommended that the trilateral AUKUS Advanced Capabilities Industry Forum would be a good vehicle for this work.217
113. MOD’s evidence recognised the value of collaboration and interoperability. Lieutenant General Tom Copinger-Symes, Deputy Commander at Strategic Command, gave an example of an AI framework being developed for the P-8 Maritime Patrol Aircraft used by all three nations on the basis of common data architectures, data standards and cloud architecture. He said that these standards could potentially spread to other allies in NATO and Five Eyes.218
114. Some contributors to the inquiry felt there could be clearer ownership and direction of AUKUS work within government. techUK’s submission reported:
techUK members do not feel there is a centre of gravity for AI work within the UK’s contribution to the AUKUS partnership, which makes it hard for companies to proactively engage in associated opportunities relating to AI. Some members suggested that responsibility for this should sit with a qualified centralised body such as the Defence Science and Technology Laboratory (Dstl) or the DAIC.219
KBR and Frazer-Nash welcomed the work being done to shape the AI and autonomy priorities for Pillar 2, but said “there is more to do to define and formalise the Government’s priorities.”220 MOD told us that although AUKUS work was coordinated by the Director General AUKUS, within Pillar 2 AI work was led by the head of the Defence AI Centre.221
115. Since the Sub-Committee took its evidence, a new administration has been elected in the United States. Politicians and defence commentators have speculated on what the approach of the Trump administration to AUKUS might be, although there has been no definitive statement of its policy at the time of writing.222
116. Several contributors to the inquiry expressed concern that export controls could be an obstacle to effective AUKUS collaboration on Defence AI. The United States’ stringent controls on military exports, in particular the International Traffic in Arms Regulations (ITAR), were seen as a barrier to effective cooperation.223 MOD recognised this issue in oral evidence, and progress has since been made in this area.224 In August 2024 the AUKUS partners announced mutual export control reforms to reduce licence requirements for export, transfer and re-export of certain defence products between the AUKUS partner nations, including reductions in ITAR restrictions.225
117. It is not only through the United States and Australia that the UK is working with allies on the development of Defence AI. The AUKUS partnership consults with other nations on elements of the Pillar 2 programme, and cooperation with Japan on advanced capabilities including AI is already being explored.226 Other cooperation exists outside of AUKUS as well: for example, NATO has its own efforts to spur innovation such as the Defence Innovation Accelerator for the North Atlantic (DIANA), which cites AI as a technological area of interest.227 In September the UK, US and Canada reached a trilateral agreement to collaborate on research, development and testing of advanced defence and security technologies including AI.228 In the future, as defence programmes become increasingly AI-enabled, collaboration with international partners on those programmes is likely to require collaboration on AI as a matter of routine. The Global Combat Air Programme, which is a collaboration between the UK, Italy, and Japan and which will incorporate AI, is one such example.
118. conclusion
Pillar 2 of the AUKUS partnership is a generational opportunity for UK defence to work alongside allies at the frontier of new technologies, including AI, that can transform the future of defence. The ability for UK AI companies to develop defence AI alongside allies should be a catalyst for growing the UK defence AI sector. We welcome recent steps taken by the AUKUS partners to reduce the licensing requirements for exports and technology sharing.
119. conclusion
It is unknown what the new US administration’s approach to AI, to trade policy and to defence more widely will be. The Government will need to be alive to new developments as the AUKUS partnership evolves in the coming months and years. We will monitor developments closely.
120. recommendation
The Government should work with the US and Australian Governments to integrate research and development in AI between the UK and its AUKUS allies. This could include establishing multinational research centres and fora to normalise collaboration between the countries’ AI sectors.
121. conclusion
Artificial Intelligence has the potential to transform defence in fundamental ways, from back office functions to the front line, and to provide a decisive advantage in military competition and conflict. We are already seeing AI being deployed effectively in Ukraine, showing that AI is no longer something on the horizon but a reality with which defence must engage. Given the breadth of potential applications of AI there are few areas of defence which would not benefit from being AI-enabled or AI-enhanced, and so MOD needs to start thinking about AI as an integral part of how it solves problems and meets its objectives. This entails both practical and cultural change: MOD requires digital infrastructure to enable the development and deployment of AI but also a change in mindset to adapt to a world where military advantage may increasingly be delivered by digital capabilities and cheaper platforms that can be rapidly developed, deployed and iterated.
122. conclusion
The UK has the potential to be home to a first-class defence AI sector, but at present the sector is under-developed and requires cultivation by MOD. Improving the digital infrastructure, skills base and security clearance situation will help, but the department also needs to adapt its ways of working to make itself a more appealing and effective partner for the sector. It needs to become more comfortable with risk-taking, rapid development cycles and working with non-traditional defence suppliers.
123. conclusion
MOD policy documents recognise the above challenges, but throughout this inquiry witnesses said that the organisation has been slow to change, and that while ‘pockets of excellence’ exist Defence still has a long way to go to achieve the necessary transformation. At a time of rapid progress in the field of AI, MOD must itself adapt rapidly to harness that progress. The ongoing Strategic Defence Review is an ideal moment for Defence to modernise both its capabilities and its mindset for the new AI-enabled era.
1. The UK has many of the right conditions that would allow it to be a global leader in the development of Defence AI, but at present Defence is an under-developed aspect of the AI ecosystem in the UK, and the gap between the UK and the current global leaders in AI, the United States and China, is significant. The UK cannot and should not aim to match those countries’ sectors in terms of scale but should instead seek to specialise in areas of strength and achieve a first-class level of sophistication in these. (Conclusion, Paragraph 15)
2. AI is becoming increasingly critical to effective defence, and therefore the UK’s ambition must be to have a first-class Defence AI ecosystem. The MOD should establish measures by which it will compare the UK’s sector against others internationally, so that the sector’s strength relative to those of its peers can be tracked. (Recommendation, Paragraph 16)
3. There has been a gap between rhetoric and reality when it comes to Defence AI for some time. MOD statements and policy documents speak about Artificial Intelligence as a paradigm-shifting technology for Defence, and the use of AI in theatres such as Ukraine bears this out. Yet the department has not behaved as though this is the case; AI is still treated as a novelty or a niche interest rather than something that will soon be a core component of defence systems across the Armed Forces. If UK Defence is to become AI-enabled the ‘say-do gap’ must be closed. (Conclusion, Paragraph 20)
4. The Strategic Defence Review represents an opportunity for the Government to modernise UK defence; one way it should do this is by recognising and embracing the ways defence needs to change to reflect the new reality of an AI-enabled world. The Government’s response to the SDR must set out specific actions that will be taken to normalise the use of AI as an enabler across the work of Defence. (Recommendation, Paragraph 21)
5. The Defence AI Strategy sets appropriate ambitions for how the UK will develop and use AI in Defence in the future, but MOD has not set out clear action plans for how these will be achieved and how it will measure its progress towards them. This makes it difficult for industry and investors to gauge the seriousness of MOD’s commitment to AI. In turn, this has a suppressive effect on investor confidence and consequently on the growth of AI capacity in the UK. (Conclusion, Paragraph 28)
6. As part of its response to the Strategic Defence Review, MOD must clarify whether the new Government remains committed to the Defence AI Strategy, and provide an update on progress it is making against each of the objectives and actions set out within it. It should set out a clear action plan defining specific actions it will take to further advance each of those objectives and actions and how it will measure and demonstrate success against them. (Recommendation, Paragraph 29)
7. Defence AI is evolving rapidly and some aspects of the Defence AI Strategy are already outdated. In such a fast-moving field it is important that the Ministry of Defence is alert to new innovations in AI and is willing and able to adapt its approach. The Defence AI Strategy needs to be a dynamic strategy, capable of adapting to changes in the Defence AI landscape over time. Just as defence procurement is moving towards a ‘spiral’ model, MOD needs to be capable of ‘spiralling’ its strategic approach by quickly learning lessons and updating its approach accordingly. (Conclusion, Paragraph 30)
8. In its response to this Report, MOD should provide a detailed update on: a) what it considers to have been the key technological developments in Defence AI since the Defence AI Strategy was published; b) how the use of AI in defence has evolved since the Strategy was published; and c) how the implementation of the Strategy has evolved in response to these changes. By doing this, MOD can demonstrate that it is able to evolve its approach within the parameters of the existing Strategy. (Recommendation, Paragraph 31)
9. Despite the existence of pockets of excellence within MOD, the department is by its own admission not yet ‘AI ready’. This is concerning because AI is no longer something on the horizon but is already here and being deployed on the frontline. The use of automated analysis and uncrewed systems in Ukraine indicates that AI-enabled systems are and will remain essential to defence; the UK therefore has no choice but to embrace AI as a core defence capability. MOD has started this process but has a long way to go. (Conclusion, Paragraph 36)
10. AI ‘readiness’ is hard to define because the technology and its use in defence is evolving so rapidly. It is extraordinarily difficult to predict how AI technology will develop during the lifetime of the Defence AI Strategy, and this makes AI readiness a fast-moving target and a process that will never be fully complete. For this reason, rather than aiming to be ‘AI-ready’ it would be more appropriate for MOD to aim to become an ‘AI-native’ organisation, for which AI is a core part of its toolkit and which is able to use AI instinctively to solve problems and meet its objectives. (Conclusion, Paragraph 37)
11. The aspects of readiness identified in the Strategy are all valid ambitions but will not result in the effective deployment of AI-enabled defence on their own. To accomplish this MOD also needs to achieve a broader cultural change that goes beyond AI. The organisation needs to become more comfortable with rapid change, more open to experimentation and risk and more able to deploy and iterate cheaper, disposable, software-led solutions on faster timelines. There is evidence–including in the Defence AI Strategy and the new Integrated Procurement Model–that MOD understands this, but it is not clear that this has yet resulted in real change or that there is a coherent plan to bring about that change across the organisation. (Conclusion, Paragraph 38)
12. Becoming ‘AI ready’ must not be treated as a discrete task but rather as one aspect of a broader transformation of MOD’s culture to become more innovative and adaptable in response to a faster-paced and more complex defence environment. The department should consider this cultural change a necessary condition not just for using AI effectively but for achieving Defence’s aims in modern conflict. MOD must urgently accelerate this transformation. (Recommendation, Paragraph 39)
13. Responsibility for AI across Defence is diffuse, and as a result Defence’s approach is not as joined up as it should be. The creation of the Defence AI Centre as a focal point for industry engagement and development is welcome, but it has limited scope to influence how AI is adopted across Defence, as demonstrated by the fact that the head of the Defence AI Centre is a one star officer. The kind of change required needs to be driven by senior leaders at the top of the Defence organisation. Driving adoption of AI is not an optional extra but a task for all parts of the organisation, and as such should be a responsibility for the established leadership of MOD. (Conclusion, Paragraph 45)
14. In view of the importance of AI to so many aspects of defence, the Chief of the Defence Staff should have lead responsibility for Armed Forces transformation, a major part of which will be embedding the use of Artificial Intelligence in the work of MOD and the Armed Forces and bringing about the cultural change necessary to make the use of AI ‘business as usual’ in defence. Complementing this, Strategic Command should be responsible for ensuring defence AI develops in a coherent way across defence, so that AI is interoperable across the services. (Recommendation, Paragraph 46)
15. The Defence AI Centre can be a tool for knowledge exchange, responsible for mainstreaming AI and sharing innovation and best practice across the Armed Forces and between industry, the research sector and MOD. To do this it needs to be visible and accessible. The DAIC should have its own premises, so that it can act as a physical focal point for collaboration on Defence AI. The DAIC should host attachments and coordinate placements with industry, with the aim of developing individuals’ AI fluency and expertise that they can then take back to their own branch of the Forces. (Recommendation, Paragraph 47)
16. As AI becomes an increasingly important component of defence systems, this will change the way those systems are designed, built and maintained. MOD’s acquisition processes will need to evolve in response. The Integrated Procurement Model’s commitment to spiral development is an encouraging sign that MOD grasps the nature of this challenge. However, it should not underestimate the scale of the change this will require both in terms of its processes and its culture. If the IPM is properly implemented, one of the benefits we would expect to see is more regular development and deployment of AI-enabled defence systems. Work to deliver the spiral development aspects of the IPM must begin now and with urgency. (Conclusion, Paragraph 58)
17. In its response to this report, MOD should explain how the Integrated Procurement Model will bring about more regular development and deployment of AI-enabled defence systems. The Model should ensure that it becomes standard MOD procurement practice to consider how AI might deliver or enhance a given defence capability. The Model promises that decisions on major programmes will be informed by independent advice at the pre-concept phase; this advice should include an assessment of the project’s AI-readiness and how it could be AI-enabled. Engagement with industry on the new procurement model should extend beyond the traditional defence sector so that it reaches developers whose software could have dual-use potential but who may not have considered working with Defence before. (Recommendation, Paragraph 59)
18. MOD has not provided a clear enough signal to industry of the kind of Defence AI it wishes to acquire, or a clear route to market for smaller companies who might want to sell to the department. This has made engaging with defence an uncertain proposition for smaller AI companies and has resulted in limited appetite from investors. (Conclusion, Paragraph 65)
19. MOD should aim to cultivate a more dynamic ecosystem of smaller AI companies working in Defence. It should do this by:
20. One obstacle to the adoption of AI by Defence is the culture clash between Defence and the tech industry. Tech’s ‘fail fast’ mindset and model of rapid, iterative development is not something that comes naturally to Defence, but this is something MOD needs to adapt to as AI’s importance grows. The ‘cultural shift’ outlined in the Integrated Procurement Model suggests that MOD understands the need for cultural change, but the department acknowledges that achieving this will be difficult. (Conclusion, Paragraph 67)
21. In its response to this report, MOD should set out specific actions it will take to achieve the ‘cultural shift’ envisaged in the Integrated Procurement Model so that decision-makers feel more empowered to pursue innovative solutions, to take calculated risks, and to acknowledge and learn from failures. (Recommendation, Paragraph 68)
22. Smaller companies told us there were relatively few opportunities in the UK’s Defence AI ecosystem for smaller defence suppliers to win the kind of contracts that would allow them to expand and attract investment. While MOD is good at supporting experimentation and the production of prototypes, it struggles to translate these into fielded and scaled capability. (Conclusion, Paragraph 69)
23. MOD should establish a fund within its existing procurement budget for the specific purpose of scaling successful defence innovations, including those which use AI, to help the most successful prototypes translate into fielded and scaled capabilities. (Recommendation, Paragraph 70)
24. In recognition of the fact that many of the AI companies who might provide services to defence are smaller than traditional defence suppliers, MOD should design its competitions and contracting processes in such a way as to minimise administrative barriers to entry, to make competition for defence contracts as accessible as possible to outsiders and challengers. We heard that Commercial X has enjoyed significant success in simplifying commercial processes for smaller businesses, and these lessons should be drawn on when tendering for Defence AI. (Recommendation, Paragraph 71)
25. A mature defence AI ecosystem will require improved digital infrastructure, including cloud networks that enable developers to work securely with classified data and an increase in overall compute. MOD has set out the right ambitions in the Defence Digital Strategy, but realising these has proved difficult and the department’s plans for a ‘Digital Backbone’ are behind schedule. (Conclusion, Paragraph 77)
26. MOD should undertake a mapping exercise to assess the adequacy and the resilience of its digital ecosystem. This should consider, but not be limited to: computing power; secure cloud computing; data centres; availability of semiconductors; quantum computing capacity; and frontier AI models. On the basis of this exercise, the department should identify weaknesses that might constrain the development of Defence AI. (Recommendation, Paragraph 78)
27. In its response to this Report, MOD should provide an update on delivery of its ‘Digital Backbone’, explaining how it is measuring progress and providing an estimated completion date. (Recommendation, Paragraph 79)
28. Defence collects masses of data that could potentially be used to train defence AI models, but this data is not always easily accessible to those developing defence software. To some extent this is an inevitable challenge when working with sensitive data of the kind Defence holds, but it is a challenge that must be overcome if the UK is to apply AI to defence problems. Even at lower or unclassified levels of data, there is enormous unrealised potential in the data held by Defence that could be unlocked by improving the way data is collected, labelled and shared. The 2021 Data Strategy for Defence shows that MOD has diagnosed the problem and has the right ambitions, but our evidence shows that many of the issues identified in the Data Strategy persist. (Conclusion, Paragraph 84)
29. MOD should confirm whether it still intends to fully deliver the Data Strategy for Defence. By July 2025, MOD should provide us with a progress report on each of the Strategic Outcomes identified in the Data Strategy for Defence. It should confirm whether each of these outcomes was fulfilled by 2025, as was set out in the Strategy, and if this is not the case explain the reasons for the delay and confirm new dates for their fulfilment. (Recommendation, Paragraph 85)
30. MOD should address the barriers which currently prevent smaller AI companies from using defence data and so inhibit them from bringing value to defence. It should do this by overcoming the delays companies face in obtaining security clearance for their staff to access such data, designing tenders that enable use of open-source data where appropriate, and making available synthetic datasets for tenders when security demands. (Recommendation, Paragraph 86)
31. Recruiting and retaining the brightest minds in AI to work on defence problems is a challenge. Recruitment difficulties exist across STEM disciplines, but the market for AI skills differs in such a way that meeting the challenge requires a targeted approach. (Conclusion, Paragraph 98)
32. There is very limited information available on what MOD believes its AI skills needs are; it is therefore difficult to know how MOD understands the skills challenge it faces. (Conclusion, Paragraph 99)
33. MOD should carry out a mapping exercise to establish what skills it believes the UK Defence AI sector needs and where shortfalls currently exist. This exercise should also consider how the sector’s skills needs might change in the future as AI evolves. (Recommendation, Paragraph 100)
34. As was promised in the Defence AI Strategy, the Ministry of Defence should devise a strategy for supporting individuals with an AI specialism to develop a career in defence. The plan should create viable career paths for AI specialists within MOD and the Armed Forces, and should also establish schemes to enable AI experts to move between the defence and civilian sectors flexibly, including industry placements with defence of the kind used by the National Cyber Security Centre. MOD should also develop recruitment campaigns targeting AI professionals, both for full-time and specialist reservist roles. (Recommendation, Paragraph 101)
35. As AI becomes a core part of Defence’s functions, MOD will require not just AI specialists but greater AI-fluency throughout the organisation, and particularly among its leadership. MOD recognises this and has begun to offer AI training, but at the moment this is very limited and does not reflect the importance of AI to Defence. (Conclusion, Paragraph 102)
36. MOD should increase the proportion of the Professional Military Education curriculum dedicated to AI, so that current and future defence leaders improve their understanding of AI’s impact on defence and learn how to better exploit it. (Recommendation, Paragraph 103)
37. The UK’s AI-enabled defence systems will be most effective if they are interoperable with those of its allies. This will require dialogue between allies to ensure that systems are designed with compatibility in mind. Interoperability will be more easily achieved if the UK and its allies have a mutual understanding of their objectives in developing and deploying defence AI. (Conclusion, Paragraph 107)
38. The Government should convene a programme of discussions with the UK’s AUKUS and NATO allies aimed at reaching a mutual understanding of each other’s AI objectives and strategies. These discussions should also explore reaching, where possible, common positions and standards on how AUKUS and NATO allies will develop and deploy AI in defence. These could include common approaches to data collection and labelling, a shared approach to ethical use of autonomous technologies, and joint working on skills and capacity building. (Recommendation, Paragraph 108)
39. Pillar 2 of the AUKUS partnership is a generational opportunity for UK defence to work alongside allies at the frontier of new technologies, including AI, that can transform the future of defence. The ability for UK AI companies to develop defence AI alongside allies should be a catalyst for growing the UK defence AI sector. We welcome recent steps taken by the AUKUS partners to reduce the licensing requirements for exports and technology sharing. (Conclusion, Paragraph 118)
40. It is unknown what the new US administration’s approach to AI, to trade policy and to defence more widely will be. The Government will need to be alive to new developments as the AUKUS partnership evolves in the coming months and years. We will monitor developments closely. (Conclusion, Paragraph 119)
41. The Government should work with the US and Australian Governments to integrate research and development in AI between the UK and its AUKUS allies. This could include establishing multinational research centres and fora to normalise collaboration between the countries’ AI sectors. (Recommendation, Paragraph 120)
42. Artificial Intelligence has the potential to transform defence in fundamental ways, from back office functions to the front line, and to provide a decisive advantage in military competition and conflict. We are already seeing AI being deployed effectively in Ukraine, showing that AI is no longer something on the horizon but a reality with which defence must engage. Given the breadth of potential applications of AI there are few areas of defence which would not benefit from being AI-enabled or AI-enhanced, and so MOD needs to start thinking about AI as an integral part of how it solves problems and meets its objectives. This entails both practical and cultural change: MOD requires digital infrastructure to enable the development and deployment of AI but also a change in mindset to adapt to a world where military advantage may increasingly be delivered by digital capabilities and cheaper platforms that can be rapidly developed, deployed and iterated. (Conclusion, Paragraph 121)
43. The UK has the potential to be home to a first-class defence AI sector, but at present the sector is under-developed and requires cultivation by MOD. Improving the digital infrastructure, skills base and security clearance situation will help, but the department also needs to adapt its ways of working to make itself a more appealing and effective partner for the sector. It needs to become more comfortable with risk-taking, rapid development cycles and working with non-traditional defence suppliers. (Conclusion, Paragraph 122)
44. MOD policy documents recognise the above challenges, but throughout this inquiry witnesses said that the organisation has been slow to change, and that while ‘pockets of excellence’ exist Defence still has a long way to go to achieve the necessary transformation. At a time of rapid progress in the field of AI, MOD must itself adapt rapidly to harness that progress. The ongoing Strategic Defence Review is an ideal moment for Defence to modernise both its capabilities and its mindset for the new AI-enabled era. (Conclusion, Paragraph 123)
Mr Tanmanjeet Singh Dhesi, in the Chair
Mr Calvin Bailey
Alex Baker
Lincoln Jopp
Mrs Emma Lewell-Buck
Jesse Norman
Ian Roome
Michelle Scrogham
Fred Thomas
Derek Twigg
Draft Report (Developing AI capacity and expertise in UK defence), proposed by The Chair, brought up and read.
Ordered, That the draft Report be read a second time, paragraph by paragraph.
Paragraphs 1 to 123 read and agreed to.
Summary agreed to.
Resolved, That the Report be the Second Report of the Committee to the House.
Ordered, That The Chair make the Report to the House.
Ordered, That embargoed copies of the Report be made available (Standing Order No. 134).
Adjourned till Tuesday 7 January 2025 at 10.00am.
The following witnesses gave evidence. Transcripts can be viewed on the inquiry publications page of the Committee’s website.
James Black, Assistant Director of the Defence and Security Research Group, RAND Europe; Dr Simona Soare, Senior Lecturer, Lancaster University; Air Marshal (Retd) Edward Stringer, Senior Fellow, Policy Exchange (Q1–36)
Neil Morphett, Chief Engineer and Advanced Technology Director, Lockheed Martin UK; Dr Simon Harwood, Director of Capability and Chief Technology Officer, Leonardo UK Ltd (Q37–59)
Richard Drake, General Manager, Anduril Industries; Andrew van der Lem, Head of Defence, Faculty Science Ltd (Faculty AI); Phil Morris, Head of Defence AI (United Kingdom), Palantir Technologies Ltd (Q60–89)
James Cartlidge MP, Minister for Defence Procurement, Ministry of Defence; Paul Lincoln CB OBE VR, Second Permanent Secretary, Ministry of Defence; Lt Gen. Tom Copinger-Symes CBE, Deputy Commander of UK Strategic Command, Ministry of Defence (Q90–149)
The following written evidence was received and can be viewed on the inquiry publications page of the Committee’s website.
DAIC numbers are generated by the evidence processing system and so may not be complete.
1 ADS Group (DAIC0020)
2 Adam Smith Institute (DAIC0003)
3 Adarga (DAIC0005)
4 Air Street Capital (DAIC0002)
5 BAE Systems (DAIC0018)
6 BT Business (DAIC0006)
7 Breckon, Professor Toby (Professor of Computer Vision, Durham University) (DAIC0016)
8 Capita Plc (DAIC0023)
9 Dear, Dr Keith (DAIC0014)
10 Faculty Science Ltd (Faculty AI) (DAIC0012)
11 Firlej, Dr Mikolaj (Assistant Professor, University of Surrey, Surrey Institute for People-Centred Artificial Intelligence) (DAIC0019)
12 Gegov, Alexander (Associate Professor, University of Portsmouth) (DAIC0001)
13 Jacobs (DAIC0004)
14 KBR and Frazer-Nash Consultancy (DAIC0013)
15 Ministry of Defence (DAIC0022)
16 Palantir Technologies UK, Ltd (DAIC0010)
17 RAND Europe (DAIC0007)
18 Scale AI (DAIC0021)
19 Taylor, Professor Trevor (Director, Defence, Industries & Society Programme, Royal United Services Institute) (DAIC0017)
20 Thales (DAIC0011)
21 Wang, Dr Jia (Associate Professor, School of Law, Durham University) (DAIC0008)
22 techUK (DAIC0009)
1 Center for Strategic and International Studies, Understanding the Military AI Ecosystem of Ukraine, 12 November 2024
2 Reuters, Ukraine rolls out dozens of ai systems to help its drones hit targets, 31 October 2024
3 Centre for a New American Security, The Role of AI in Russia’s Confrontation with the West, 3 May 2024, p.10–13
4 Center for Strategic and International Studies, Understanding the Military AI Ecosystem of Ukraine, 12 November 2024
5 The evidence received by the previous Sub-Committee can be found on its website, Developing AI capacity and expertise in UK Defence - Committees - UK Parliament. Evidence cited in the footnotes to this report is, unless otherwise stated, evidence reported by the previous Sub-Committee.
6 Professor Payne has made a declaration of interests, which can be found in the Committee’s formal minutes.
7 HM Government, National AI Strategy, 22 September 2021, p. 15
8 Ministry of Defence, Defence AI Strategy, 15 June 2022, p. 4
9 Ministry of Defence, Defence AI Playbook, 1 February 2024, p. 5
10 Department for Science, Innovation and Technology, Artificial Intelligence sector study 2022, 29 March 2023
12 KBR and Frazer-Nash Consultancy (DAIC0013)
16 ADS Group, Helsing Ltd, accessed 18 November 2024; ADS Group, Anduril Industries UK Ltd, accessed 18 November 2024
17 RAND Europe (DAIC0007); Dr Mikolaj Firlej (DAIC0019); Dr Keith Dear (DAIC0014)
23 Dr Mikolaj Firlej (DAIC0019)
25 Oxford Insights, Government AI Readiness Index 2023, 6 December 2023; Tortoise Media, The Global AI Index, accessed 18 November 2024
27 Australian Strategic Policy Institute, AUKUS Relevant Technologies: Top 10 country snapshot, 6 June 2023
28 Tortoise Media, The Global AI Index 2023, accessed 14 May 2024
29 ADS Group (DAIC0020); Adarga (DAIC0005)
32 HM Government, National AI Strategy, 22 September 2021
33 Ministry of Defence, Mobilising, Modernising & Transforming Defence, A report on the Modernising Defence Programme, 7 March 2018
34 Ministry of Defence, Defence Technology Framework, 9 September 2019, p.10; Ministry of Defence, Defence Command Paper: Defence’s responses to a more contested and volatile world, CP 901, 18 July 2023, p. 26–27
35 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022
36 Dr Keith Dear (DAIC0014); Adam Smith Institute (DAIC0003)
38 Adam Smith Institute (DAIC0003)
39 Adarga (DAIC0005); Dr Mikolaj Firlej (DAIC0019); techUK (DAIC0009)
40 HM Government, Strategic Defence Review 2024–2025: Terms of reference, 17 July 2024
41 HM Government, Strategic Defence Review 2024–2025: Terms of reference, 17 July 2024
42 GOV.UK, Defence Industrial Strategy – Statement of Intent, 2 December 2024
43 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022
44 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 2
45 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 61
46 Army, British Army’s Approach to Artificial Intelligence, October 2023
47 UK Defence Journal, Royal Navy looking to ‘hasten’ adoption of AI, 18 September 2023
48 Ministry of Defence (DAIC0022)
49 Q11; Faculty Science Ltd (Faculty AI) (DAIC0012); KBR and Frazer-Nash Consultancy (DAIC0013); techUK (DAIC0009)
50 Palantir Technologies UK, Ltd. (DAIC0010)
53 KBR and Frazer-Nash Consultancy (DAIC0013)
55 Palantir Technologies UK, Ltd. (DAIC0010)
56 Faculty Science Ltd (Faculty AI) (DAIC0012)
58 White House, Memorandum on Advancing the United States’ Leadership in Artificial Intelligence; Harnessing Artificial Intelligence to Fulfil National Security Objectives; and Fostering the Safety, Security and Trustworthiness of Artificial Intelligence, 24 October 2024
59 Adarga (DAIC0005); Faculty Science Ltd (Faculty AI) (DAIC0012)
61 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022
62 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 17
63 In Defence a Function refers to a cross-cutting activity that should be carried out in a coherent way across all Defence organisations. Functions include Digital, Finance, Intelligence, Military Capability Management and People. Each Function is the responsibility of an individual ‘Functional Owner’, a senior post-holder who reports to the Permanent Secretary. (See: Ministry of Defence, How Defence Works, September 2020, p. 6. A list of MOD Functions can be found on page 26 of the same document.)
64 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 17–28
65 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 25; Palantir Technologies UK, Ltd. (DAIC0010)
66 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 32
67 Ministry of Defence, Defence Command Paper: Defence’s responses to a more contested and volatile world, CP 901, 18 July 2023, p. 26–27
73 Q3 (CADMID refers to the MOD’s defence equipment life cycle: Concept, Assessment, Demonstration, Manufacture, In Service, Disposal)
74 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 17
75 Ministry of Defence, Integrated Procurement Model: Driving pace in the delivery of military capability, 28 February 2024
76 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 2
77 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 22
78 HC Deb 19 Nov 2020, Col 489
79 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 2
80 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 2
82 Ministry of Defence, Dependable Artificial Intelligence (AI in Defence), JSP 936, 13 November 2024
84 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 24
85 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 20
86 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 26
87 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 31 (The first Defence AI Playbook was published in February 2024, see Defence AI Centre, The Defence AI Playbook, 1 February 2024)
88 Ministry of Defence (DAIC0022)
89 Q39; Q69; KBR and Frazer-Nash Consultancy (DAIC0013); techUK (DAIC0009)
96 KBR and Frazer-Nash Consultancy (DAIC0013)
99 Adarga (DAIC0005); ADS Group (DAIC0020); techUK (DAIC0009)
103 Evidence to the inquiry on Future Aviation Capabilities, Q152
112 Faculty Science Ltd (Faculty AI) (DAIC0012)
114 GOV.UK, Defence Industrial Strategy – Statement of Intent, 2 December 2024
115 Ministry of Defence, Integrated Procurement Model: Driving pace in the delivery of military capability, 28 February 2024
116 Ministry of Defence, Integrated Procurement Model: Driving pace in the delivery of military capability, 28 February 2024
117 Ministry of Defence, Integrated Procurement Model: Driving pace in the delivery of military capability, 28 February 2024
118 Ministry of Defence, Integrated Procurement Model: Driving pace in the delivery of military capability, 28 February 2024
119 Ministry of Defence, Integrated Procurement Model: Driving pace in the delivery of military capability, 28 February 2024
120 GOV.UK, Defence Industrial Strategy – Statement of Intent, 2 December 2024
121 Ministry of Defence, Dependable Artificial Intelligence (AI in Defence), JSP 936, 13 November 2024
123 Faculty Science Ltd (Faculty AI) (DAIC0012)
126 KBR and Frazer-Nash Consultancy (DAIC0013); ADS Group (DAIC0020); BAE Systems (DAIC0018)
127 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 31
128 Ministry of Defence, Defence AI Playbook, 1 February 2024
135 Dr Mikolaj Firlej (DAIC0019); Air Street Capital (DAIC0002)
139 Tortoise Media, Global AI Index, accessed 20 May 2024 (N.B. The Index ranks nations’ wider AI sectors, and not their Defence AI sectors specifically)
141 BAE Systems (DAIC0018); Faculty Science Ltd (Faculty AI) (DAIC0012); techUK (DAIC0009)
143 Ministry of Defence, Digital Strategy for Defence, 27 May 2021
144 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p. 25
145 GOV.UK, Press release: Bristol set to host UK’s most powerful supercomputer to turbocharge AI innovation, 13 September 2023
146 BBC News, Government shelves £1.3bn UK tech and AI plans, 2 August 2024
147 Committee of Public Accounts, The Defence Digital Strategy, Thirty-Sixth Report of Session 2022–23, 3 February 2023
149 Faculty Science Ltd (Faculty AI) (DAIC0012)
152 Faculty Science Ltd (Faculty AI) (DAIC0012); techUK (DAIC0009); Thales (DAIC0011)
155 Faculty Science Ltd (Faculty AI) (DAIC0012)
156 Faculty Science Ltd (Faculty AI) (DAIC0012)
157 SC stands for ‘Security Check’. Further information about National Security Vetting clearance levels can be found on GOV.UK.
158 Faculty Science Ltd (Faculty AI) (DAIC0012); Jacobs (DAIC0004)
159 Faculty Science Ltd (Faculty AI) (DAIC0012)
162 Ministry of Defence, Data Strategy for Defence, 27 September 2021
163 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p.24–25
164 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p.48
167 Faculty Science Ltd (Faculty AI) (DAIC0012); RAND Europe (DAIC0007)
168 Professor Toby Breckon (DAIC0016)
170 Professor Toby Breckon (Professor of Computer Vision at Durham University) (DAIC0016); Q33
173 Evidence to the House of Lords AI in Weapon Systems Committee, 11 May 2023, Q77
177 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p.18
182 BT Business (DAIC0006); Q31
184 Ministry of Defence, Agency and Agility: Incentivising people in a new era, 19 June 2023, p. 52
185 Ministry of Defence, Agency and Agility: Incentivising people in a new era, 19 June 2023, p. 87
186 Ministry of Defence, Agency and Agility: Incentivising people in a new era, 19 June 2023, p. 90
188 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p.18
189 Defence: Artificial Intelligence, PQ 16919, 9 December 2024; Defence: Artificial Intelligence, PQ 21057, 12 April 2024
190 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p.18–19
192 Professor Toby Breckon (Professor of Computer Vision at Durham University) (DAIC0016)
195 Ministry of Defence, Defence Artificial Intelligence Strategy, 15 June 2022, p.18–19
197 RAND Corporation, Towards AUKUS Collaboration on Responsible Military Artificial Intelligence, 6 February 2024, p. v
201 White House, Memorandum on Advancing the United States’ Leadership in Artificial Intelligence; Harnessing Artificial Intelligence to Fulfil National Security Objectives; and Fostering the Safety, Security and Trustworthiness of Artificial Intelligence, 24 October 2024
202 GOV.UK, News story: UK, US and Australia launch new security partnership, 15 September 2021
204 GOV.UK, New Defence Secretary sets out commitment to AUKUS to drive regional British growth, 16 July 2024
205 GOV.UK, New government adviser to maximise benefits of AUKUS partnership, 22 August 2024
206 AUKUS, PQ 12460, 12 November 2024; AUKUS, PQ 17120, 9 December 2024
207 Ministry of Defence (DAIC0022)
208 GOV.UK, Press release: World first as UK hosts inaugural AUKUS AI and autonomy trial, 26 May 2023
209 The Times, UK and allies use AI drones in battlefield exercise for first time, 9 August 2024
212 RAND Corporation, Towards AUKUS Collaboration on Responsible Military Artificial Intelligence, 6 February 2024, p. 46
214 Adarga (DAIC0005); Dr Jia Wang (DAIC0008)
216 Faculty Science Ltd (Faculty AI) (DAIC0012)
217 KBR and Frazer-Nash Consultancy (DAIC0013)
220 KBR and Frazer-Nash Consultancy (DAIC0013)
222 See, for example: Reuters, Donald Trump is supportive of AUKUS defence pact, former Australian PM says, 16 May 2024; Australian Naval Institute, Trump would axe AUKUS: Bolton, 31 October 2024; Council on Foreign Relations, How Australia is Responding to a Second Trump Term, 12 November 2024; Politico, Will Donald Trump kill US-UK-Aussie sub defense deal?, 9 December 2024
223 RAND Corporation, Towards AUKUS Collaboration on Responsible Military Artificial Intelligence, 6 February 2024, p. 21–23; Adarga (DAIC0005)
225 GOV.UK, Historic Breakthrough in defence trade between AUKUS partners, 15 August 2024
226 GOV.UK, AUKUS partnership to consult with other nations including Japan on military capability collaboration, 8 April 2024
227 NATO, Defence Innovation Accelerator for the North Atlantic (DIANA), accessed 21 November 2024
228 GOV.UK, UK, US and Canada to collaborate on cybersecurity and AI research, 20 September 2024
All publications from the Committee are available on the publications page of the Committee’s website.
Number | Title | Reference
1st | Service Accommodation | HC 406