AI and creative technology scaleups: less talk, more action

Chapter 3: Scaling up: AI

The UK AI landscape

87.Evidence to our inquiry was clear about the strategic importance and transformational potential of AI in driving innovation and growth across multiple sectors.192 Michael Holmes, CEO of spinout incubator Scale Space, warned against talking about AI as a “completely separate sector”. Instead, “the reality is that across any industry sector AI is becoming increasingly prevalent”.193 James Wise, General Partner at Balderton Capital, said that “we are at the foundations of an industrial revolution underpinned by these new technologies”.194 Mr Dancer and Prof Lord Tarassenko described the current period as “a transformational moment in technology, of the sort which only comes about once or twice in a century.”195

88.In our report on large language models, we found that the previous Government had strayed from the ambitions set out in its 2021 National AI Strategy196 towards a “narrow view of high-stakes AI safety”. We called for a rebalancing of priorities towards a greater focus on opportunity.197 During this inquiry, witnesses emphasised the need to act swiftly to take advantage of such opportunities. Erin Platts, CEO of HSBC Innovation Banking, spoke of a “two to three-year window” for action.198 Mr Wise warned that “Britain is at real risk of being an also-ran if we do not find ways to keep up”.199

89.Witnesses also raised concerns about the withdrawal by the current Government of £1.3 billion of funding for AI infrastructure, including £800 million for an exascale supercomputer in Edinburgh.200 We heard that this decision left the entrepreneurial community “deeply unimpressed” and sent the wrong signal about the UK’s commitment to AI and other emerging technologies.201

90.Contrasts were drawn between the UK Government’s approach and that of some of its European counterparts.202 Policies delivered in France, in particular, were credited with giving the country an advantage in AI development. These include: the Tibi investment scheme;203 the Station F incubator;204 the La French Tech initiative to promote French startups abroad;205 public-private partnerships allowing PhD students to work in a private company while completing their research at a public institution;206 and the IA-Booster programme, which focuses on integrating AI technologies into SMEs.207

Box 2: France’s pro-AI policies

France’s five-year National AI strategy launched in 2018, with an initial budget of €800 million over three years, and a significant focus on advancing research. One-third of the funding supported initiatives such as establishing interdisciplinary institutes, funding an additional 180 PhD positions, and opening a petascale supercomputing facility. The second phase of the strategy in 2021–22 prioritised expanding education and training, advancing embedded AI, and promoting trustworthy AI in critical systems.208 The French National AI Commission recently drew up a further action plan representing an annual public investment of €5 billion over five years.209

Tortoise Media reported that the French Government had invested a total of €7.2 billion in AI since 2018—60 per cent more than the UK. Its largest supercomputer is three times more powerful than the UK’s equivalent.210 France is also home to Mistral, a developer of open-source large language models that compete with ChatGPT, Claude and Gemini, valued at $6 billion.211 However, reporting by the Financial Times suggests that promising French AI startups are also attracted to the bigger market and greater resources offered by moving to the US.212

91.The UK is ranked higher than France in the number of AI scientists, the number of academic papers published on AI, and the total amount of private investment,213 but we heard that this advantage is not guaranteed. As Mr Dancer and Prof Lord Tarassenko put it:

“If we fail to act, UK companies will not be able to play a full role in the development of applied AI, in which we have so many advantages … There are very few other such beneficial productivity or growth levers available to us; if we do not use them soon, we are likely to remain on the current trajectory of falling relative wealth.”214

The AI Opportunities Action Plan

92.Matt Clifford’s AI Opportunities Action Plan similarly characterised AI as “the Government’s single biggest lever to deliver its five missions, especially the goal of kickstarting broad-based economic growth.” The plan’s recommendations centred on three areas of focus: laying the foundations to enable AI; changing lives by embracing AI; and securing our future with homegrown AI.215

93.The Prime Minister stated his ambition for the UK to become “the best place to start and scale an AI business”.216 Matt Clifford recognised that this would take “bold, concerted and coherent action, using all the levers of the state”. He argued that the “lead the current frontier firms enjoy” meant the Government would have to take a “more activist approach” to supporting new challenger AI companies to grow in the UK.217 Elsewhere, he urged the Government to “be on the side of innovators” and consider how the plan can benefit those “trying to do new and ambitious things in the UK”.218

94.The plan, and the Government’s response, were viewed as “a clear statement of intent to put AI at the heart” of the Government’s programme,219 as well as a demonstration of its “belief in the UK’s tech sector”.220 Commentators generally welcomed the ambitious objectives they set out,221 though some claimed these were overly geared towards benefitting the largest tech firms.222

95.Industry body techUK stressed that “the success of this plan depends on swift and coordinated action”.223 Sustained political will was also seen as vital, to ensure the plan is backed by the necessary funding, and that conflicting government priorities do not hinder implementation.224 Matt Clifford himself acknowledged that delivery of the plan “will require a whole of government commitment, with senior and visible leadership and a relentless focus on driving progress.”225

96.AI is not a sector but a technology, with the potential to drive innovation across all eight of the Government’s key growth sectors. Yet the window of opportunity for capitalising on the UK’s strengths is limited and diminishing.

97.The AI Opportunities Action Plan is a positive step towards seizing opportunities in this transformational technology, and the Government’s response to it is encouraging. However, achieving these goals will demand a mindset shift across the public sector, accompanied by bold policy reforms and robust political commitment. The Government should not underestimate the scale of the challenge.

98.The Government must take immediate action to deliver the AI Opportunities Action Plan. Delivery of the plan must be supported by sustained political commitment and a laser focus on delivering growth. Implementation must be joined-up and pragmatic, and focus on solving immediate challenges.

Delivering homegrown AI

General purpose vs fine-tuned models

99.Our evidence suggested that the UK’s existing strengths in university research and early-stage venture, alongside its limitations in market size and available capital, make some areas of focus for AI development more attractive than others.226 In his review, Matt Clifford set out that “our goal should be a thriving domestic AI ecosystem, with serious players at multiple layers of the ‘AI stack’”.227

100.We heard, however, that the development of a UK-based general purpose foundation model to compete with OpenAI’s GPT or Google’s Gemini is unlikely.228 Some witnesses cited DeepMind, which was acquired by Google in 2014, as emblematic of the UK’s capacity to build foundation model capabilities, but not to nurture their growth domestically.229 Barney Hussey-Yeo, founder and CEO of AI-fintech Cleo, said the UK had “missed the boat” on foundation models, adding “there is no chance that we will … be able to build these companies now”.230 Media commentary on the AI action plan agreed that “trying to beat superpowers like China and the US at their own game is a recipe for misallocating capital—both financial and political”, and warned against the “far-fetched” idea of building a homegrown competitor to the biggest AI models.231

101.Witnesses made the case for the UK to focus instead on the development of fine-tuned models designed for specific uses in sectors where the UK demonstrates existing strengths. Antony Berg, CFO of AI speech scaleup Speechmatics, argued that “going specialised will be the way forward”, identifying an opportunity for the UK to “fine tune” large language models and “make them specialist for different sectors”.232 AI scaleup Luminance, for example, has built “a specialist AI that can help businesses with any interactions they have with their legal contracts”, using legal data.233 Other witnesses echoed Mr Berg’s assessment, suggesting that the UK’s success in financial services, gaming, media and advertising creates potential for fine-tuned models designed to solve specific issues in these industries.234

102.Our report on LLMs also highlighted the importance of open source development for ensuring fair competition and warned against regulatory interventions that might “stifle low-risk open access model providers”.235 Written evidence submitted to this inquiry by Andreessen Horowitz emphasised the benefits of open source technology as the “great equaliser for startups”, enabling them to innovate using tools freely available online.236

Talent

103.The Government told us that 62 per cent of UK AI firms identified skills shortages as a growth barrier, with SMEs struggling to match big tech salaries and facing global competition, particularly from the US.237 Witnesses agreed that competition for talent for AI-related roles is particularly fierce.238 Wayve cited estimates suggesting that there are only 300,000 AI engineers globally, with 80 per cent of them working for Google or Meta. As a consequence, “all companies, including scaleups, are competing for the other 20 per cent”.239

104.Witnesses drew attention to the success of the UK’s universities in attracting and developing technical talent, suggesting that this provided a competitive advantage for AI companies based here.240 However, not all of these skilled workers are British citizens.241 Stakeholders advocated for improvements to visa schemes and better incentives for working in scaleups, such as an expansion of the Enterprise Management Incentives (EMI) scheme, to ease recruitment pressures.242 Matt Clifford’s review similarly asserted that the “cost and complexity” of the existing immigration system creates “obstacles for startups”.243 The Government said improvements would be explored as part of the industrial strategy.244

Access to compute

105.Training and running AI models demands significant amounts of computing power. According to Mr Berg, “Google, Meta, Amazon and Microsoft are spending £250 billion a year” on compute, creating a “huge” disparity between the tech giants and smaller companies.245 The Government told us that some AI SMEs reported “spending up to 80 per cent of their budgets on compute capabilities”.246 Mr Wise suggested that this creates a tension for those conducting AI-related research, citing the case of a graduate researcher based at Oxford University:

“He is in a queue to access CPU compute with the Brunel infrastructure around Bristol. Were he at Google DeepMind now, he would have access to almost unlimited compute power as it builds that model … As an individual, he has to make that trade-off. Does he want to stay with the wonderful peers he has and the wonderful tutor he has at Oxford, or does he want to get paid significant sums of money and have access to all the resources he needs to build that model?”.247

106.Stakeholders from the University of Edinburgh noted that supercomputing will be the key driver of economic impact from modern large-scale AI over the next decade, but acknowledged that the UK “should consider how we meet the need of the long tail of users who have more modest computational needs”.248 Others also highlighted the breadth of the UK’s compute demands.249

107.Matt Clifford’s review echoed our evidence in outlining the importance of building up ‘sovereign compute’, which is owned and/or allocated by the public sector, while recognising that it will “almost certainly be the smallest component of the UK’s overall compute portfolio”.250 The Government accepted his recommendation to expand the capacity of the UK’s AI Research Resource (AIRR) by at least 20 times by 2030. In addition, it announced the delivery of a new “state of the art supercomputing facility that will at least double” current AIRR capacity. Details of the location and funding for this project are outstanding, awaiting the publication of a long-term compute strategy and 10-year delivery roadmap this spring.251

108.The Government’s long-term compute strategy should set out as soon as possible, and certainly by the proposed “spring 2025” deadline, how it will deliver the broad range of computing resources required by AI scaleups, including high-end computing facilities. AI scaleups should be granted access to these facilities to catalyse commercial opportunities for UK companies. Startups and universities should also be provided access to ensure a healthy pipeline of innovation.

Finance

109.Some witnesses argued that issues with access to compute and talent were in fact a question of insufficient access to capital.252 Mr Dancer and Prof Lord Tarassenko told us:

“A company with a credible proposition should be able to attract investment to fund the right talent and compute, if that investment is available”.253

110.The capital demands of AI scaleups vary depending on the level and intensity of AI development.254 We heard that deeptech255 AI companies need larger amounts of money over longer time periods before they can commercialise their research.256 For example, autonomous vehicle software firm Wayve has not yet taken a product to market, despite being founded in 2017.257 Oxford Science Enterprises said that the issue was “a lack of growth capital that is willing to take the risk and be patient”.258 Stakeholders suggested that efforts to increase the availability of domestic, patient capital would be important for AI growth.259

111.We heard that public grants and programmes were also of insufficient scale and duration to be meaningful for AI scaleups.260 While Victor Riparbelli, CEO of AI scaleup Synthesia, told us he was “very grateful” for the two Innovate UK grants the company received “when no one else believed in us”, he noted that accessing the grants required “a lot of paperwork … almost a full-time employee for a year to do all the reporting”.261 The British Business Bank told us it had invested in a number of leading UK AI companies, including Wayve and Quantexa.262 Challenges relating to the complexity, bureaucracy and efficacy of public grants and programmes were discussed in the previous chapter.263

Data

112.Some estimates suggest that by 2028, developers of large language models will require data sets as big as the total amount of available text on the internet.264 Three-quarters of respondents to the Government’s 2023 AI sector study reported needing more training data.265

113.According to AI firm Faculty, however, data is a “precision game—more is not always better”. It explained:

“A specific AI model, with a specific objective, will need specific data to achieve that. You need to understand what problem you’re solving and work back from that to the exact data you need.”266

Gerard Grech CBE, Managing Director at Founders, Cambridge Enterprise, noted that the Government has access to proprietary datasets that could be used to train small language models. He argued that “if we are serious about transforming public services”, these should be made available to developers.267

114.Ms Clark agreed that the UK has a “wealth of really high-quality data”, but acknowledged that access was a challenge for scaleups. She said that the Government was exploring ways to improve availability, including through the creation of a “national data library”, which will provide “simple, secure and ethical access to our key public data assets for researchers, policy makers and businesses”.268

115.The Government previously indicated that this would take “beyond one Parliament” to deliver.269 Leo Ringer, Founding Partner at Form Ventures, told us that “a five-year window to gather data and curate it properly is unacceptably long” for AI founders.270 Susan Bowen advocated for the Government to take a staggered approach to forming the library, by focusing on:

“curated datasets and [identifying] the social and economic challenges that we are trying to address. If we start from the problem that we are trying to solve and identify the datasets that we need to solve it, we will go a lot faster”.271

116.This approach was also championed by Matt Clifford, who called on Government to “rapidly identify at least five high-impact public data sets” to be made available to AI researchers and innovators.272 However, the Government committed only to “explore” how to “take forward this recommendation” as part of DSIT’s work to develop the national data library.273

117.The Government should quickly make available high-quality, curated data sources linked to specific objectives. A mission-led, incremental approach that builds public confidence should be adopted in work to deliver the complete national data library.

Regulation

118.During our recent visit to San Francisco, we heard that countries looking to be so-called “AI hyper-centres” require forward-looking regulation to attract researchers and give them a “safe harbour” to experiment without fear of personal legal liability.274 Oxford Science Enterprises emphasised the importance of “certainty on the Government’s plans for AI safety regulation and approach to regulatory alignment with other key markets” in enabling businesses to “scale confidently”.275

119.Many stakeholders voiced support for the UK’s sector-led approach to AI regulation,276 introduced by the previous administration, as opposed to a ‘one-size-fits-all’ approach.277 Some were cautious about the strict regulations introduced by the EU AI Act.278 Mr Grech said that “now that we have left the European Union … the UK has a great opportunity to be between the US, which is known to be very pro-innovation, and the EU, which is known to be very pro-regulation”. He added that equipping regulators to “really understand the needs and wants of the innovators … could be an edge for the UK”.279 Wayve told us that regulators could help high-growth UK tech companies expand in overseas markets by facilitating connections with foreign regulators.280

120.The risk aversion of regulators was raised as a significant challenge for AI scaleups. Mr Ringer suggested this is true across sectors:

“We see all the time in startups—whether in drone technology, medical devices or financial services … An innovation is brought to market using all the funding and talent that we talked about, and then it hits a regulator that does not have either the incentives or the resources to understand and enable it and instead has the incentives to be risk-averse.”281

121.Our report on large language models noted the importance of regulators being “properly resourced and empowered” in order to ensure good outcomes from AI, and highlighted an internal lack of technical expertise as a “recurring theme” across regulators.282 The AI Opportunities Action Plan noted that some regulators’ AI capabilities need “urgent addressing”, and called for more funding and a focus on safe AI innovation from sponsor departments.283 Matt Clifford’s recommendation to implement pro-innovation initiatives, for example through the use of regulatory sandboxes, echoed evidence from our current inquiry. He also indicated that more “radical changes” might be necessary, including through the establishment of a central body with powers to override sector regulators.284

122.The previous Government said in February 2024 that it had started developing a “central function to support effective risk monitoring, regulator coordination, and knowledge exchange”.285 Current Government Ministers appeared to have little awareness of this function. They later referenced the Central AI Risk Function (CAIRF), which coordinates a cross-government approach to AI risk, and suggested that regulatory coordination was the responsibility of the newly established Regulatory Innovation Office (RIO).286

123.Ms Clark acknowledged that at present “it can be a minefield for businesses trying to navigate the regulatory world” and stated that the RIO would “remove barriers to opportunity and to increase the speed at which transformative technologies and innovation can get safely to market”.287 The Government’s response to the AI Opportunities Action Plan said that the RIO will be empowered “to drive regulatory innovation for technologies and innovation through behavioural changes within regulators” by issuing strategic guidance and identifying priority sectors.288

124.Witnesses to our inquiry were cautiously optimistic about the RIO, but noted its infancy and cautioned against the introduction of further complexity.289 Speaking in a separate hearing, Sarah Cardell, CEO of the Competition and Markets Authority, said the regulator had “no detailed insights into” how its work would interact with that of the RIO, and was “waiting to hear from DSIT more detail”.290

125.The King’s Speech announced the Government’s intention to “establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models”.291 The Government stated that this future legislation will:

“support growth and innovation by ending current regulatory uncertainty for AI developers, strengthening public trust and boosting business confidence”.292

126.Mr Ringer warned that preoccupation with the contents of an AI Bill without sufficient focus on regulators’ capacity for implementation risks “missing the wood for the trees”. He explained:

“The companies that we see building and succeeding are not doing so while wondering about an overarching AI Bill. They are thinking about … how the food regulator will determine whether the AI they used in devising how to grow meat at scale in a lab is acceptable. That is not a question of an AI Bill but of a set of people at the FSA who are empowered and well-resourced to do the work to understand it.”293

127.We heard strong support for the UK’s sector-led, outcomes-based approach to AI regulation. However, this relies heavily on existing regulators’ ability to navigate complex and evolving technologies. The Government has frequently referenced the potential benefits of the Regulatory Innovation Office (RIO), but its remit remains unclear.

128.The Government must continue to ensure that regulators are properly resourced to deliver a sector-led approach to AI. In its response to this report, the Government should clarify the RIO’s priorities and set out in detail how it will engage with existing regulators to harmonise approaches, share best practice and drive behavioural change. Future AI legislation must not create further regulatory uncertainty or barriers to entry.

129.The Government should also consider setting up dedicated teams specialising in key markets to help fast-growing UK AI companies expand internationally by facilitating connections overseas, including with foreign regulators.

Adoption

130.The Government’s written submission suggests that public adoption of AI in the UK is “lower than many international counterparts”.294 Faculty highlighted comparatively low levels of AI adoption as a challenge for AI scaleups developing models and tools for business use. It explained: “some companies are not yet convinced by the technology and how to get value from it”.295

131.High operational costs, a lack of technical skills, poor data governance and regulatory uncertainty were cited as reasons for low enterprise adoption of AI.296 Our report on digital exclusion highlighted significant gaps in digital skills across the UK population.297 In his review, Matt Clifford called on the Government to “push hard on cross-economy AI adoption”, arguing that the public sector should lead by example to “rapidly pilot and scale AI products and services and encourage the private sector to do the same”.298

132.In its response to the AI Opportunities Action Plan, the Government accepted a number of recommendations on improving procurement to boost public sector AI uptake, and said that measures to boost private sector AI adoption will be outlined in the upcoming industrial strategy.299 Commentators have questioned whether these ambitions can be realised without first addressing structural issues in public services, including outdated IT systems, limited digital skills and rigid delivery models.300

133.We note the Government’s support for a domestic AI assurance301 market, which it claims will unlock “widespread adoption in both the private and public sectors”.302 Evidence to our large language models inquiry supported the view that progress on AI assurance could boost business confidence and public trust, and thereby adoption of AI technologies.303

134.The UK has comparatively low levels of AI adoption and public trust in AI technologies. Supporting innovation in AI will not lead to economic growth unless adoption is promoted in parallel.

135.The Government is right to identify adoption as a key factor in enabling its AI growth ambitions, but should not play down the level of change this represents for both the public and private sectors. In its industrial strategy, the Government should outline the specific steps it will take to drive AI adoption across its key high-growth sectors, including how it will overcome barriers such as low trust, outdated infrastructure, and lagging digital skills.

University spinouts

136.The UK’s top universities were cited by many as a core strength in the UK’s AI arsenal.304 Successful AI spinouts originating from UK universities include Wayve and Synthesia, both of which provided evidence to this inquiry and have achieved unicorn status. However, stakeholders argued that the high equity stakes demanded by universities have deterred other investors and therefore impeded further growth.305

137.In November 2023, the Government published the final conclusions of the Independent Review of University Spin-out Companies.306 Mr Grech told us that “over 50 universities” were in the process of implementing its recommendations.307 The review encouraged universities to adopt a set of best practice guidelines, the University Spin-out Investment Terms (USIT), developed by TenU, an international collaboration of universities.308 These propose a market norm for 10–25 per cent equity taken by universities for life sciences spinouts, and 5–10 per cent for software companies.309 Several witnesses advocated for the review’s recommendations on equity stakes to be taken up more widely.310

138.Strong links between industry and academia were cited as contributing factors to AI scaleup success in California and France.311 Matt Clifford similarly championed co-designed courses in Canada, Germany and France in his AI action plan.312 Ms Platts said that giving PhD students “access to not just working in established companies but seeing how it feels to work in a startup” would be “very welcome”.313 Mr Kendall described how conducting part of his PhD research at an early-stage robotics company allowed him to benefit from commercial resources while also learning to build a venture-backed business.314

139.Oxford Science Enterprises argued that universities could better support AI scaleups by “embracing part-time roles, encouraging internships in spinout companies and entrepreneurship sabbaticals”.315 In San Francisco, we heard that California’s lack of non-compete clauses was a key driver of local spinout success.316 The 2023 spinout review recommended that the Government take steps to improve “porosity” between industry and academia, including “buying out” academic time to focus on commercial partnerships, or introducing “academic returner” fellowships for those who have spent time in the private sector.317 The previous Government accepted all the recommendations of the review, and committed to working with UKRI and the National Academies to improve fellowship options for commercialisation.318

140.Witnesses were clear that the UK’s universities are one of its strengths, and that universities will continue to play an important role in UK AI development and leadership. Steps taken to implement the recommendations of the 2023 Independent Review of University Spin-out Companies should help innovative AI spinout companies be better positioned for future growth. However, more can be done to improve links between academia and industry and to ensure we remain competitive in the provision of supercomputing capacity.

141.We recognise the progress made in the adoption of TenU’s University Spin-out Investment Terms best practice guidance. The Government should continue to implement the recommendations of the independent spinout review, including options to improve collaboration between academia and industry.


192 Q 43 (Leo Ringer), Q 95 (Professor Christopher Smith); Written evidence from the Institute of Chartered Accountants in England and Wales (ACT0046)

193 Q 8 (Michael Holmes)

194 Q 2 (James Wise)

195 Written evidence from James Dancer and Prof Lord Tarassenko (ACT0055)

196 HM Government, National AI Strategy (September 2021): https://www.gov.uk/government/publications/national-ai-strategy [accessed 28 January 2025]

198 Q 6 (Erin Platts)

199 Q 2 (James Wise)

200 ‘Government shelves £1.3bn UK tech and AI plans’, BBC News (2 August 2024): https://www.bbc.co.uk/news/articles/cyx5x44vnyeo [accessed 28 January 2025]

201 Q 44 (Gerard Grech CBE, Leo Ringer); Written evidence from Oxford Science Enterprises (ACT0056)

202 Written evidence from Oxford Science Enterprises (ACT0056)

203 Written evidence from BVCA (ACT0050)

204 Written evidence from Creative UK (ACT0019)

205 Written evidence from Surrey Institute for People-Centred AI (ACT0018)

206 The future of news, Appendix 4. For more information, see Agence Française de Développement, ‘Cifre PhD: A Specific Mechanism for Doctoral Students’: https://www.afd.fr/en/cifre-phd-specific-mechanism-doctoral-students [accessed 20 January 2025]

207 Written evidence from HM Government (ACT0025)

208 European Commission, ‘France AI Strategy Report’: https://ai-watch.ec.europa.eu/countries/france/france-ai-strategy-report_en [accessed 20 January 2025]

209 Comité de l’intelligence artificielle générative, ‘AI: our ambition for France’ (March 2024), p 5: https://www.info.gouv.fr/actualite/25-recommandations-pour-lia-en-france [accessed 28 January 2025]

210 ‘AI: the French connection’, Tortoise Media (19 September 2024): https://www.tortoisemedia.com/2024/09/19/ai-the-french-connection [accessed 20 January 2025]

211 ‘How French start-up Mistral AI is planning to take on Silicon Valley’, Euronews (26 November 2024): https://www.euronews.com/business/2024/11/26/how-french-start-up-mistral-ai-is-planning-to-take-on-silicon-valley [accessed 20 January 2025]

212 ‘Can France become a global AI powerhouse?’, Financial Times (2 January 2025): https://www.ft.com/content/11cb5217-9c2a-4128-b257-7cb6a63b2ba1 [accessed 20 January 2025]

213 ‘AI: the French connection’, Tortoise Media (19 September 2024): https://www.tortoisemedia.com/2024/09/19/ai-the-french-connection [accessed 20 January 2025]

214 Written evidence from James Dancer and Prof Lord Tarassenko (ACT0055)

215 Department for Science, Innovation and Technology, AI Opportunities Action Plan, CP 1241 (January 2025), p 22: https://assets.publishing.service.gov.uk/media/67851771f0528401055d2329/ai_opportunities_action_plan.pdf [accessed 28 January 2025]

217 Ibid.

218 Department for Science, Innovation and Technology, AI Opportunities Action Plan, CP 1241 (January 2025), p 6: https://assets.publishing.service.gov.uk/media/67851771f0528401055d2329/ai_opportunities_action_plan.pdf [accessed 28 January 2025]

219 techUK, ‘The UK’s AI moment: An ambitious new plan for innovation and growth’ (January 2025): https://www.techuk.org/resource/the-uk-s-ai-moment-an-ambitious-new-plan-for-innovation-and-growth.html [accessed 20 January 2025]

220 The Chartered Institute for IT, ‘AI Opportunities Action Plan—a summary’ (January 2025): https://www.bcs.org/articles-opinion-and-research/ai-opportunities-action-plan-a-summary/ [accessed 20 January 2025]

221 See for example, ‘Tony Blair and William Hague: Our focus on AI must be relentless’, The Times (12 January 2025), available at: https://www.thetimes.com/uk/politics/article/tony-blair-and-william-hague-our-focus-on-ai-must-be-relentless-37fjgbzhv?srsltid=AfmBOorQ37M0F_9mqL00roGuQrTSPiV_iKR5agyFssnwdzOzzYO-2BPj; Barney Hussey-Yeo (@Barney_H_Y), tweet on 13 January 2025: https://x.com/Barney_H_Y/status/1878660795317555690. See also Department for Science, Innovation and Technology, ‘Prime Minister sets out blueprint to turbocharge AI’ (January 2025): https://www.gov.uk/government/news/prime-minister-sets-out-blueprint-to-turbocharge-ai [accessed 16 January 2025]

222 ‘Startups Welcome UK AI Action Plan, But With Caveats’, Forbes (14 January 2025): https://www.forbes.com/sites/trevorclawson/2025/01/14/startups-welcome-uk-ai-action-plan-but-with-caveats/ [accessed 28 January 2025]; Minderoo Centre for Technology and Democracy, ‘AI Opportunities Action Plan falls short of challenges of tech and the UK economy’ (January 2025): https://www.mctd.ac.uk/ai-opportunities-action-plan-falls-short-of-challenges-of-tech-and-the-uk-economy/ [accessed 15 January 2025]

223 techUK, ‘The UK’s AI moment: An ambitious new plan for innovation and growth’ (January 2025): https://www.techuk.org/resource/the-uk-s-ai-moment-an-ambitious-new-plan-for-innovation-and-growth.html [accessed 20 January 2025]

224 ‘At last, a growth plan—let’s hope they seize it’, The Times (13 January 2025), available at: https://www.thetimes.com/article/3b8b0b2a-34c0-4a56-893b-c81c0c4fc618?shareToken=08e63e38b4424b30c65decfae6f47cd6

225 Department for Science, Innovation and Technology, AI Opportunities Action Plan, CP 1241 (January 2025), p 24: https://assets.publishing.service.gov.uk/media/67851771f0528401055d2329/ai_opportunities_action_plan.pdf [accessed 28 January 2025]

226 Q 22 (Antony Berg), QQ 23–27 (Barney Hussey-Yeo), Q 27 (Eleanor Lightbody), Q 61 (Victor Riparbelli, Peadar Coyle); Written evidence from Dr Mercedes Bunz (ACT0009) and Mati Staniszewski (ACT0057)

227 Department for Science, Innovation and Technology, AI Opportunities Action Plan, CP 1241 (January 2025), p 6: https://assets.publishing.service.gov.uk/media/67851771f0528401055d2329/ai_opportunities_action_plan.pdf [accessed 28 January 2025]

228 Q 45 (Gerard Grech). See also written evidence from the Royal Academy of Engineering to the Communications and Digital Committee’s inquiry ‘Large language models’ (LLM0036); The future of news, Appendix 4

229 Q 24 (Barney Hussey-Yeo); Written evidence from James Dancer and Prof Lord Tarassenko (ACT0055)

230 Q 23 (Barney Hussey-Yeo)

231 ‘UK has half of what it needs to be an AI hub’, Financial Times (14 January 2025): https://www.ft.com/content/657a490c-a044-49ad-851d-e69a4294046f [accessed 20 January 2025]

232 Q 22 (Antony Berg)

233 Q 21 (Eleanor Lightbody)

234 Q 22 (Eleanor Lightbody); Written evidence from Faculty (ACT0054); The future of news, Appendix 4

235 Large language models and generative AI, para 41

236 Written evidence from Andreessen Horowitz (ACT0051)

237 Written evidence from HM Government (ACT0025)

238 Written evidence from Oxford Science Enterprises (ACT0056)

239 Written evidence from Wayve (ACT0047)

240 Q 27 (Eleanor Lightbody), Q 38 (Barney Hussey-Yeo), Q 52 (Victor Riparbelli)

241 Q 37 (Eleanor Lightbody, Antony Berg)

242 Q 2 (Michael Holmes), Q 3 (Alex Kendall), Q 37 (Eleanor Lightbody, Antony Berg); Written evidence from techUK (ACT0017), CBI (ACT0029), Dean Williams (ACT0030), Wayve (ACT0047), Boardwave (ACT0052) and Oxford Science Enterprises (ACT0056)

243 Department for Science, Innovation and Technology, AI Opportunities Action Plan, CP 1241 (January 2025), p 13: https://assets.publishing.service.gov.uk/media/67851771f0528401055d2329/ai_opportunities_action_plan.pdf [accessed 28 January 2025]

244 Department for Science, Innovation and Technology, AI Opportunities Action Plan: government response, CP 1242 (January 2025), pp 11–13: https://assets.publishing.service.gov.uk/media/6785178cc6428e01318816f0/ai_opportunities_action_plan_government_repsonse.pdf [accessed 28 January 2025]

245 Q 24 (Antony Berg)

246 Written evidence from HM Government (ACT0025)

247 Q 12 (James Wise); Written evidence from the Council on Geostrategy (ACT0031)

248 Letter from Professor Sir Peter Mathieson, Principal and Vice-Chancellor, The University of Edinburgh and Professor Mark Parsons, EPCC Director and Dean of Research Computing, The University of Edinburgh to the Chair (December 2024): https://committees.parliament.uk/publications/46107/documents/229577/default/

249 Q 44 (Leo Ringer); Written evidence from Gerard Grech CBE (ACT0053); See also Department for Science, Innovation and Technology, ‘Independent Review of The Future of Compute: Final report and recommendations’ (March 2023): https://www.gov.uk/government/publications/future-of-compute-review/the-future-of-compute-report-of-the-review-of-independent-panel-of-experts#chap3 [accessed 8 January 2025]

250 Department for Science, Innovation and Technology, AI Opportunities Action Plan, CP 1241 (January 2025), p 7: https://assets.publishing.service.gov.uk/media/67851771f0528401055d2329/ai_opportunities_action_plan.pdf [accessed 28 January 2025]

251 Department for Science, Innovation and Technology, AI Opportunities Action Plan: government response, CP 1242 (January 2025), p 8: https://assets.publishing.service.gov.uk/media/6785178cc6428e01318816f0/ai_opportunities_action_plan_government_repsonse.pdf [accessed 28 January 2025]

252 Q 24 (James Smith)

253 Written evidence from James Dancer and Prof Lord Tarassenko (ACT0055)

254 Q 2 (James Wise)

255 Deeptech companies are technology firms whose core businesses rely on significant scientific or engineering research. AI firms are frequently categorised as deeptech companies, including by the BBB in its evidence to our inquiry.

256 Written evidence from the British Business Bank (ACT0027)

257 Written evidence from Wayve (ACT0047)

258 Written evidence from Oxford Science Enterprises (ACT0056)

259 Q 6 (Erin Platts), Q 29 (Antony Berg); Written evidence from Surrey Institute for People-Centred AI (ACT0018)

260 Q 35 (James Smith), Q 43 (Susan Bowen)

261 Q 55 (Victor Riparbelli)

262 Q 82 (Louis Taylor)

263 See paras 61–74.

264 Nature, ‘The AI revolution is running out of data. What can researchers do?’ (December 2024), available at: https://www.nature.com/articles/d41586-024-03990-2 [accessed 20 January 2025]

265 Department for Science, Innovation and Technology, Artificial Intelligence sector study 2023 (October 2024): https://www.gov.uk/government/publications/artificial-intelligence-sector-study-2023/artificial-intelligence-sector-study-2023#investment-in-uk-ai-companies [accessed 9 January 2025]

266 Written evidence from Faculty (ACT0054)

267 Q 45 (Gerard Grech)

268 Q 127 (Feryal Clark MP); Written evidence from HM Government (ACT0063)

269 Oral evidence taken before the Science and Technology Committee on 5 November 2024 (Session 2024–25), Q 8

270 Q 45 (Leo Ringer)

271 Q 45 (Susan Bowen)

272 Department for Science, Innovation and Technology, AI Opportunities Action Plan, CP 1241 (January 2025), p 6: https://assets.publishing.service.gov.uk/media/67851771f0528401055d2329/ai_opportunities_action_plan.pdf [accessed 28 January 2025]

273 Department for Science, Innovation and Technology, AI Opportunities Action Plan: government response, CP 1242 (January 2025), p 9: https://assets.publishing.service.gov.uk/media/6785178cc6428e01318816f0/ai_opportunities_action_plan_government_repsonse.pdf [accessed 28 January 2025]

274 The future of news, Appendix 4

275 Written evidence from Oxford Science Enterprises (ACT0056)

276 The UK currently has no general statutory regulation of AI. In March 2023 the previous Government published its “pro-innovation approach to AI regulation” White Paper, which envisioned existing regulators overseeing AI in their respective sectors. The current Government has committed to maintaining this approach. In contrast, the EU’s AI Act, which came into force in August 2024, has assigned legal obligations to different risk-based tiers of AI use. See Parliamentary Office of Science and Technology, ‘Artificial intelligence: ethics, governance and regulation’ (October 2024): https://post.parliament.uk/artificial-intelligence-ethics-governance-and-regulation/ [accessed 16 January 2025]

277 Q 8 (James Wise, Alex Kendall); Written evidence from CBI (ACT0029) and Wayve (ACT0047)

278 Written evidence from Surrey Institute for People-Centred AI (ACT0018) and James Dancer and Prof Lord Tarassenko (ACT0055)

279 QQ 50–51 (Gerard Grech)

280 Written evidence from Wayve (ACT0047)

281 Q 50 (Leo Ringer)

283 Department for Science, Innovation and Technology, AI Opportunities Action Plan, CP 1241 (January 2025), p 6: https://assets.publishing.service.gov.uk/media/67851771f0528401055d2329/ai_opportunities_action_plan.pdf [accessed 28 January 2025]

284 Q 45 (Gerard Grech), QQ 50–51 (Susan Bowen); Written evidence from James Dancer and Prof Lord Tarassenko (ACT0055). Regulatory sandboxes allow innovative businesses to trial new products and services in a real-world environment without the need for full regulatory compliance. See for example Medicines and Healthcare products Regulatory Agency, ‘AI Airlock: the regulatory sandbox for AIaMD’ (May 2024): https://www.gov.uk/government/collections/ai-airlock-the-regulatory-sandbox-for-aiamd [accessed 20 January 2025]

285 Department for Science, Innovation and Technology, A pro-innovation approach to AI regulation: government response, CP 1019 (February 2024), p 7: https://assets.publishing.service.gov.uk/media/65c1e399c43191000d1a45f4/a-pro-innovation-approach-to-ai-regulation-amended-governement-response-web-ready.pdf [accessed 28 January 2025]

286 Q 122 (Feryal Clark MP); Written evidence from HM Government (ACT0063)

287 Q 121 (Feryal Clark MP)

288 Department for Science, Innovation and Technology, AI Opportunities Action Plan: government response, CP 1242 (January 2025), p 14: https://assets.publishing.service.gov.uk/media/6785178cc6428e01318816f0/ai_opportunities_action_plan_government_repsonse.pdf [accessed 28 January 2025]

289 Q 18 (Alex Kendall), Q 19 (Erin Platts), Q 50 (Susan Bowen); Written evidence from Yoti (ACT0014) and CBI (ACT0029)

290 Oral evidence taken before the Communications and Digital Committee, session on the work of the CMA, 7 January 2025 (Session 2024–25), Q 7

291 HL Deb, 17 July 2024, col 7 [Lords Chamber]

292 Written evidence from HM Government (ACT0025)

293 Q 50 (Leo Ringer)

294 Written evidence from HM Government (ACT0025)

295 Written evidence from Effa Ettah, Prof John McAuliffe and Liz Scott (ACT0026) and Faculty (ACT0054)

297 Communications and Digital Committee, Digital exclusion (3rd Report of Session 2022–23, HL Paper 219), paras 137–48

298 Department for Science, Innovation and Technology, AI Opportunities Action Plan, CP 1241 (January 2025), p 5: https://assets.publishing.service.gov.uk/media/67851771f0528401055d2329/ai_opportunities_action_plan.pdf [accessed 28 January 2025]

299 Department for Science, Innovation and Technology, AI Opportunities Action Plan: government response, CP 1242 (January 2025), p 19: https://assets.publishing.service.gov.uk/media/6785178cc6428e01318816f0/ai_opportunities_action_plan_government_repsonse.pdf [accessed 28 January 2025]

300 Public Digital, ‘New Tech, Old State: Are we ready for an AI Revolution?’ (January 2025): https://public.digital/pd-insights/blog/2025/01/new-tech-old-state-are-we-ready-for-an-ai-revolution [accessed 15 January 2025]

301 AI assurance provides the tools and techniques required to measure, evaluate, and communicate the trustworthiness of AI systems.

302 Department for Science, Innovation and Technology, Assuring a responsible future for AI (November 2024), p 3: https://assets.publishing.service.gov.uk/media/672a2ca440f7da695c921b7c/Assuring_a_Responsible_Future_for_AI.pdf [accessed 28 January 2025]

304 Q 2 (Erin Platts), Q 22 (Antony Berg), Q 27 (Eleanor Lightbody); Written evidence from James Dancer and Prof Lord Tarassenko (ACT0055) and Mati Staniszewski (ACT0057)

305 Q 11 (Alex Kendall), Q 47 (Leo Ringer); Written evidence from James Dancer and Prof Lord Tarassenko (ACT0055), The future of news, Appendix 4

306 HM Government, Independent Review of University Spin-out Companies: Final report and recommendations (November 2023): https://assets.publishing.service.gov.uk/media/6549fcb23ff5770013a88131/independent_review_of_university_spin-out_companies.pdf [accessed 15 January 2025]

307 Q 47 (Gerard Grech)

308 TenU, ‘The USIT Guide: Leading Universities and Investors Launch Set of Recommendations for the Innovation Sector’: https://ten-u.org/news/the-usit-guide [accessed 15 January 2025]

309 Available at TenU, ‘Essential Resources for Innovation: Download the USIT and USIT for Software Guides’: https://ten-u.org/news/essential-resources-for-innovation-download-the-usit-and-usit-for-software-guides [accessed 15 January 2025]

310 Written evidence from BVCA (ACT0050), James Dancer and Prof Lord Tarassenko (ACT0055) and Oxford Science Enterprises (ACT0056)

311 Q 12 (Erin Platts); Written evidence from Gerard Grech CBE (ACT0053) and Professor Oli Buckley (ACT0008), The future of news, Appendix 4

312 Department for Science, Innovation and Technology, AI Opportunities Action Plan, CP 1241 (January 2025), p 11: https://assets.publishing.service.gov.uk/media/67851771f0528401055d2329/ai_opportunities_action_plan.pdf [accessed 28 January 2025]

313 Q 14 (Erin Platts)

314 Q 14 (Alex Kendall)

315 Written evidence from Oxford Science Enterprises (ACT0056)

316 The future of news, Appendix 4

317 HM Government, Independent Review of University Spin-out Companies: Final report and recommendations (November 2023): https://assets.publishing.service.gov.uk/media/6549fcb23ff5770013a88131/independent_review_of_university_spin-out_companies.pdf [accessed 28 January 2025]

318 HM Government, Government Response: Independent Review of University Spin-outs (November 2023), p 22: https://assets.publishing.service.gov.uk/media/655e0bf7046ed400148b9e34/independent_review_of_university_spin-out_companies_government_response.pdf [accessed 15 January 2025]

© Parliamentary copyright 2025