DAIC0012

Written evidence submitted by Faculty Science Ltd (Faculty AI).
 

Faculty AI is a British company and one of Europe’s largest independent Applied AI companies. Founded in 2014, we use AI and Machine Learning (ML) to solve our clients’ hardest problems in sectors including Healthcare, Energy Transition, National Security and Defence. We have delivered a number of projects across MOD Top Level Budgets (TLBs), covering a broad range of operational and support use cases.

Sub-Committee on Developing AI Capacity and Expertise in UK Defence - Call for Evidence

How clearly has the Ministry of Defence set out its priorities for the kind of AI capacity and expertise it believes the UK defence sector should have, what priorities has it identified, and are these deliverable?

The 2022 UK Defence AI Strategy set admirably clear priorities and timelines. However, rapid technological developments, particularly in the field of Large Language Models (LLMs), have already rendered some of its key assumptions obsolete. As well as being powerful AI tools in their own right, LLMs are accelerating the improvement of wider AI capabilities, for example by writing code for Reinforcement Learning agents or developing supervised AI models. The step-change, exponentially-improving capabilities of LLMs have triggered significant AI safety concerns, and since the Defence AI Strategy was published the UK has held the Bletchley Park AI Safety Summit. As evidence of how quickly the landscape has changed, one of the leading companies in the field, OpenAI, has seen its market value soar from $200m to approximately $100bn today; a reflection of AI capabilities not fully appreciated even as recently as Summer 2022. Some of these capabilities were considered ‘AI Next’ in the Defence AI Strategy, i.e. believed unfeasible in the short term.[1] Many of these use cases, e.g. ‘Financial Management Tools’, are now ready to deploy. An update of the Defence AI Strategy to reflect LLM development, and to share the MOD’s intended approach to safe LLM deployment, would therefore provide a clearer signpost to industry.

This notwithstanding, the MOD’s AI priorities are not deliverable without access to data. The Digital Strategy for Defence noted that by 2025 it would ‘...enable seamless access to our data by delivering a… Digital Backbone… connecting sensors, effectors and deciders across military and business domains and with partners, driving integration and interoperability across domains and platforms.’[2] The In Service Date of the Digital Backbone remains uncertain, posing a challenge to the delivery of these priorities.[3] Assuming the Digital Backbone will enable a significant quantity of AI development, an update on its completion date is vital if UK AI SMEs are to put in place realistic expansion, hiring and training plans. Additionally, the individual Single Services appear to be developing their own in-house data sharing networks (ZODIAC, STORMCLOUD/EVE and NEXUS for the British Army, RN/RM and RAF respectively). For the same reason, understanding the likely completion dates, protocols, standards and points of difference between these systems would be hugely beneficial to industry.

Timelines for AI projects and their associated budgets are sometimes unclear, making it difficult for Small and Medium-sized Enterprises (SMEs) to plan. This is especially challenging when there are limited numbers of security-cleared data scientists, many of whom are also in demand to staff AI projects for other Government agencies. Single Services have announced AI strategies that have taken 18 months to go to tender, if they have done so at all, while some modest-sized tender processes for AI projects remain underway a year after they began. By contrast, many specialist AI-procurement frameworks run by the MOD share their budgets openly and move rapidly through the procurement process, making it relatively straightforward to plan and deliver work. The Defence and Security Accelerator (DASA) and Dstl’s SERAPIS Framework are relatively transparent about upcoming work and budgets, with fast decision timelines, although work is divided into suboptimally small lots.

 

What strengths and expertise does UK industry currently have in the field of Artificial Intelligence with defence applications?

The UK is home to some of the world's premier universities, which are at the forefront of AI research and development. Their excellence in AI and related fields ensures a steady stream of highly skilled graduates and post-doctorates, adept in the theoretical aspects of AI and trained in practical, real-world applications, making them valuable assets in Defence-related AI projects. Many of these individuals are unlikely to live near, or be used to working on, military sites. Although Defence offers the opportunity to solve uniquely interesting and cutting-edge problems, other industries offer benefits of their own. Consequently, the ability to offer working conditions similar to those enjoyed by non-Defence AI/ML engineers, especially office, remote or home working, is of outsized importance. This is a challenge for UK AI SMEs when working with SECRET data.

The UK's visa policies play a crucial role in sustaining and enhancing this talent pool. The post-doctorate visa structure allows international AI researchers to remain in the UK after completing their studies. This is particularly beneficial for Defence applications of AI, as it enables these researchers to fulfil residency requirements, making them eligible for UK Security Clearance. While these individuals may never fulfil Citizenship or Developed Vetting prerequisites, their ability to access a large proportion of UK Defence projects under SC-level clearance adds significant expertise to the industrial base.

While it is too early to tell, the UK’s intended ‘pro-innovation approach’ to AI regulation, combined with a historically strong rule of law, may deliver further benefits compared with other countries. A relatively coherent regulatory environment may encourage more dual-use or Defence-focused AI SMEs to choose the UK as their home: a powerful advantage, especially when combined with the visa policies noted above.

 

How can the UK Government best develop capacity and expertise within domestic industry in sectors such as engineering and software to support the development and delivery of Artificial Intelligence applications in defence?

Supporting AI SMEs through access to shared, otherwise expensive, resources, and providing the building blocks required for successful AI programmes, would deliver huge benefits. It is important to note that ‘support’ does not mean funding: there are sufficient private and public funds and accelerators for Defence and dual-use technology, and adding more capital at this stage will bring diminishing returns. Practical, targeted support and the provision of shared resources is the best way to develop capacity and expertise, specifically:

Continued improvement and professionalisation of Defence procurement. There have been significant improvements in how Defence procures AI and digital technology, but there is more to be done. Hubs such as Commercial X in Defence Digital are welcome innovations. Procurement decisions should be outcomes-based wherever possible, and technical requirements should not be over-specified or gold-plated. Procurement professionals who understand the market, working closely with, and on behalf of, clients in the MOD, will quickly achieve better outcomes.

Clear and timely security certification. We recently noted an improvement in the time taken to provide Security Clearance for SC applicants, which has removed a significant source of friction when building engineering capacity. However, a similar approach is now needed for certifying SME IT systems. The Defence Assurance Risk Tool (DART) process was recently replaced by ‘Secure by Design’ (SbD), and we are still waiting for clear definitions and explanations of how SbD will allow private systems to be accredited to OFFICIAL-SENSITIVE. Until this occurs, SMEs will be unable to go above OFFICIAL within their internal systems, a key blocker when trying to access even moderately sensitive data.

Access to Sensitive Compartmented Information Facilities (SCIFs). If SMEs are to access SECRET and above information on their premises, they need a SCIF: a certified, specially constructed, bug-swept, secure room or building. For smaller companies this can be prohibitive, whether due to cost, uncertain return on investment, or the fact that they occupy a leased building and cannot undertake construction. For relatively low cost, MOD provision of licensable, shared SCIFs at, for example, jHub in London, the Dstl Newcastle AI Lab, or Dorset BattleLab, would allow SMEs to share the cost of a SCIF on a charged-per-seat basis, giving them sight of SECRET data, tenders or use cases they would otherwise not have access to.

SECRET Compute as a Service. SMEs can easily access compute resources via the public cloud, but this only allows access to OFFICIAL data. Akin to the shared SCIF, an MOD-run SECRET cloud system, offered as a service on a per-seat licensable cost, would allow SMEs to train models on SECRET data. This could be accessed via the shared SCIF, as above. It would enable engineers to work on the MOD’s most sensitive and valuable projects without needing to work away from home, a serious barrier that makes other AI/ML industries look attractive in comparison to Defence. The MOD’s recent tendering of SECRET Cloud capabilities may make this possible.

MOD Data as a Service. Make MOD data repositories available to contractors remotely: specifically, access to electronic, acoustic, visual, or back-office data that has been cleaned and labelled. Access controls and the use of SCIF facilities would enable security to be maintained, while SECRET cloud compute would ensure that SECRET data was only processed on MOD systems. Even remote access to OFFICIAL data would make the creation of AI models for back-office functions far more straightforward than it currently is.

Presumably the above would be delivered by the Digital Backbone. However, data will still need to be cleaned and labelled by SC-cleared contractors, which may be prohibitively expensive, or too slow when use cases are time-sensitive. Reservists could remotely staff a Data Labelling/Cleaning service for OFFICIAL-SENSITIVE data, providing a ready cohort of SC-cleared individuals who could react more quickly and cheaply than civilian contractors.

Performance benchmarking scores. It is difficult to know whether an ML approach has succeeded unless performance metrics for current ways of working are measured and understood. Many MOD AI/ML use cases involve automating or augmenting elements of the Kill Chain, i.e. the Find, Fix, Track, Target, Engage and Assess (F2T2EA) process undertaken during an engagement. Without understanding how a human performs these F2T2EA tasks, in terms of both accuracy and timeliness, it is difficult to demonstrate how valuable or safe an ML approach will be by comparison. This includes understanding human performance at different levels of stress and fatigue, since these are presumably the circumstances under which AI/ML tools will most likely be deployed. Metrics for different F2T2EA tasks can help prioritise valuable projects, as well as provide AI/ML SMEs with targets to meet when creating models.

Transitioning Veterans. Veterans provide a vital talent pool for UK AI SMEs: they know use cases intimately and can help AI companies navigate the intricacies of the MOD. Yet support for Veterans transitioning into the UK AI industry is limited in comparison with manual trades or cyber security. The Career Transition Partnership, for example, currently has no coding or AI/ML courses among its resettlement training options.

The Enhanced Learning Credits (ELC) scheme, administered by ELCAS, which allows Veterans to spend up to £6,000 of public funds on resettlement courses, has no providers offering coding or product management qualifications. This is possibly because no coding courses are Level 3 Qualifications, the level required to offer an ELC course. The scheme does offer some BSc/MSc qualifications in data or computer science, but these are likely to be overly complex and unsuitable for Veterans transitioning into non-technical AI roles.

 

What can the Government do to help embed UK AI companies in defence supply chains, both domestically and internationally?

AI should no longer be treated as ‘innovation’, with its own standalone budget and teams. Rather, it is the key enabler for Affordable Mass: an approach that pairs inexpensive platforms with AI/ML decision-making, resulting in linked, cheap, attritable platforms that deliver effects at far lower cost than current approaches. Recent events in Ukraine and the Red Sea, where expensive platforms and weapons are being expended or destroyed horrifyingly quickly and cheaply, have demonstrated the clear need for this step-change.

AI-enabled Affordable Mass capabilities can only be delivered if there are fair and meritocratic competition models, preferably ones that prioritise rapid physical demonstrations over long, written responses to over-onerous requirements. Competitions should accept a ‘good enough’ solution now, coupled with the potential for future spiral development. The Defence AI Centre’s use of the ‘3-2-1’ competition style, in which three suppliers are invited to demonstrate a capability physically and are whittled down through knockout rounds until one is chosen, is a good example: it encourages practical, immediate solutions over complex, long-term projects. Innovative and agile UK AI SMEs are likely to have these cutting-edge solutions available for fast demonstration. By contrast, a requirement for a small number of complex platforms will take longer to reach a practical demonstration, risking delivery over budget and so late as to be obsolete on arrival. The timing and funding details of these competitions should be communicated clearly and early, so that SMEs have time to prepare their responses and hire staff if required.

Such an approach will encourage industry to adopt a Best of Breed model, whereby Primes and SMEs collaborate in ‘Rainbow Teams’ to bring their respective advantages to bear: mass production and knowledge of MOD processes from the former, and cutting-edge AI/ML approaches from the latter. There is already much support for SMEs in forming relationships with Primes; ADS, MakeUK, Team Defence Information, techUK and other trade bodies provide substantial support and introduction opportunities. Primes themselves do not tend to make onerous demands over rights to the IP developed by SMEs, and the MOD’s IP conditions also tend to be balanced, with a reasonable use of both DEFCON 703 and 705.

Likewise, access to military end-users for AI companies, especially in the run-up to competitions, is improving. TLBs have increasingly run industry days to share their problem sets, and encouraged continued conversations before tenders are launched. This allows for valuable feedback and iterative development, levelling the playing field for companies without pre-existing MOD networks or Veterans on their staff.

 

How can the UK Government ensure that it champions the UK AI sector in the context of Pillar 2 of the AUKUS Partnership?

Definitions of ‘success’ for AUKUS Pillar 2 should be developed and shared by the MOD. In our view, Pillar 2 should allow a UK AI company, an Australian data provider and a US hardware manufacturer to bid as a consortium for a UK, US or Australian Defence contract, facing no security or ITAR barriers in doing so, and no restrictions on subsequent data sharing.

The best way for the UK Government to champion the UK AI sector in this regard is to help define the regulations and standards that companies wishing to do business under Pillar 2 will need to meet, ensuring that US or Australian companies do not receive an unfair advantage.

Examples include allowing both UK and Australian companies to access US finance under the US Defense Production Act (DPA), and granting ITAR exemptions for ‘strategically important’ UK and Australian companies. As it stands, Australia is on track to receive ‘domestic source’ designation under the DPA, while the UK is not. A definition of ‘strategically important’ would presumably have to be agreed by the AUKUS parties, and ensuring that UK AI SMEs fall within this definition will be vital.

 

17th January 2024


[1] https://assets.publishing.service.gov.uk/media/62a7543ee90e070396c9f7d2/Defence_Artificial_Intelligence_Strategy.pdf, p. 35

[2] https://www.gov.uk/government/publications/data-strategy-for-defence/data-strategy-for-defence

[3] https://publications.parliament.uk/pa/cm5803/cmselect/cmpubacc/727/report.html