DAIC0018
Written evidence submitted by BAE Systems.
Introduction
At BAE Systems, our purpose is to serve, supply and protect those who serve and protect us.
We help our customers to stay one step ahead when protecting people and national security, critical infrastructure and vital information. We provide some of the world’s most advanced, technology-led defence, aerospace and security solutions and employ a skilled workforce of 93,100 people in around 40 countries.
From state-of-the-art cyber threat detection to flight control systems that enable pilots to make better decisions, we never stop innovating to ensure that our customers maintain their advantage. This is a long-term commitment involving significant investments in skills. We also work closely with local partners to support economic development through the transfer of knowledge, skills and technology. In 2022, we contributed £11.1 billion to UK GDP and invested £142 million in UK technologies and an additional £180 million in skills and education. We employ 39,600 people in the UK across 50 sites. A significant number of our UK workforce (14,900) come from Britain’s most deprived local authorities and, in 2022, we spent nearly £730 million on supply chain purchases in these areas.
BAE Systems has been at the forefront of the development of the UK’s Artificial Intelligence (AI) capacity and expertise over the past decade. In particular, we have worked closely with the UK Government’s Office for AI on its AI Masters places initiative, and with the Ministry of Defence. BAE Systems has provided support for the development of AI-enabled systems, offered advice and knowledge, seconded members of staff, and helped to promote the importance of AI in defence in support of the Government’s ambitions.
BAE Systems’ role in promoting and building a healthy ecosystem for AI in defence includes significant effort to create pipelines of talent. For example, in each of 2022 and 2023 the company recruited around 100 degree apprentices onto the Aerospace Software Engineering and Digital Technology Solutions Professional apprenticeships, which provide apprentices with a foundational understanding of AI as part of wider digital skills. As part of an upskilling pilot, we have co-created a new Masters course in Applied AI with Cranfield University, addressing many of the specific needs of defence in the application of AI.
This submission is in response to the House of Commons Defence Select Committee’s Inquiry into Developing AI Capacity and Expertise in UK Defence.
The MoD has clearly articulated its intent through its publication of the Defence Artificial Intelligence Strategy in 2022, its engagement with industry and academia, and the establishment of the Defence AI Centre (DAIC), which we welcome. This has, to date, rightly focussed on the MoD’s organisational transformation so that it is ‘AI ready’ and capable of adopting and exploiting AI at ‘pace and scale’.
Industry requires greater consultation and clearer and more specific identification of priorities by the MoD.
Whilst the ambition of the MoD in the AI sector has been clear, the specific priorities it intends to pursue remain vague and undefined. Detailed priorities, and their deliverability, are not explicitly defined in the 2022 Defence AI Strategy, save for examples of existing projects.
In order to deliver against the MoD’s overarching objective to strengthen the UK’s defence and security AI ecosystem, for example, industry requires a more explicit idea of: the MoD’s view of the most impactful AI operational applications; which future programmes these AI capacity and expertise needs are tied to, and how; and the estimated skills requirement over time. Without such two-way consultation, and a clear and consistent explanation of priorities, it is difficult to form realistic predictions to inform our work. It is also important to note that this two-way consultation must include input from those working with AI day-to-day.
The MoD needs to provide clarity on its plans, including how they relate to the AI SQEP (Suitably Qualified and Experienced Person) capacity demands on industry, as outlined in the Defence AI Strategy.
BAE Systems and the wider UK defence industry have already committed to, and are investing in, AI skills within the workforce in order to provide an adequate pipeline of people with the necessary skills and experience (hereafter referred to as SQEP) to help deliver a healthy UK AI ecosystem. However, additional clarity from the MoD would allow industry to invest further.
Better understanding of the department’s future plans and priorities will allow industry to generate roadmaps for building robust AI SQEP capacity by highlighting the real size of the challenge, the gaps and the best courses of action to address them. There have been some successes with similar initiatives, which could be applied in the context of AI-related initiatives. One such example is the work being carried out between the Government and industry on the “Defence Digital Twin Implementation Road Map and Development Framework”, which has identified some solutions, such as embedding industry people into MoD teams.
The UK defence industry has extensive relevant experience of the design, development, test and qualification of highly complex systems and software, including safety-critical and autonomous systems. The UK defence industry has also demonstrated the successful implementation of several branches of AI within its products and services.
The UK defence sector is a leading identifier, developer and adopter of new technologies in a defence context when they emerge. AI is no exception. BAE Systems, for example, has already been working with various forms of AI for more than 20 years, allowing us to increase the manoeuvrability of Typhoon jets, support naval engineering teams to identify wear and tear before it affects platform performance, and process vast amounts of data to identify patterns to better inform operational law enforcement decisions. These are just some of the examples of the broad application of AI across our businesses.
That said, it is more difficult and takes longer to adapt new and novel AI technologies, such as ‘deep’ artificial neural networks (ANNs) and large language models (LLMs), for use in defence than in the commercial sector. One of the reasons for this is that defence customers need to be able to trust that the equipment they use works as expected every time.
Although BAE Systems already has a long heritage of leveraging its defence expertise to integrate new and fast-developing technologies such as AI wherever there is a possibility of benefiting our customers, we need clearer guidance from the MoD on its assurance and certification requirements. Only then will the company and the wider industry be clear on the standards to be achieved.
Address the STEM (Science, Technology, Engineering and Mathematics) pipeline in schools through greater focus on engineering and the promotion of engineering careers.
The UK should focus on significantly increasing the uptake of STEM subjects both within schools and among school-leavers. Of particular importance is boosting the ‘E’ within STEM through greater primary and secondary school curriculum emphasis on engineering, with systems thinking as well as digital skills, similar to the approach the IET has been recommending to the UK Government. Schools are the start of the pipeline, which should be complemented with more digital skills course options at the further education stage, such as T Levels.
BAE Systems invested around £180 million in education, skills and early careers activities in the UK in 2022 and currently has more than 5,500 apprentices and graduates in training across its UK businesses, equivalent to more than 10% of its almost 40,000-strong UK workforce. In 2024, an additional 1,400 apprentices are expected to join the Company, as well as almost 1,300 graduates.
BAE Systems is committed to playing its part within schools and runs a schools roadshow in partnership with the Royal Navy and RAF every year. Having engaged more than a million pupils over 18 annual seasons so far, it is the longest-running STEM initiative of its kind in the UK. Designed to inspire excitement in STEM, it provides a highly interactive experience for students aged 10 to 13 in primary and secondary schools across the country.
In addition to the activities outlined above, BAE Systems also supports the talent pipeline at the further education stage wherever possible, for example through supporting the British Computer Society’s ‘Get ahead in digital week’ last year by providing a talk on AI to 23 different schools and their T Level students.
Incentivise school leavers to pursue careers in AI through more accessible apprenticeships, undergraduate and postgraduate courses.
There remains a challenge, however, in making AI careers within the public and defence sectors as attractive as those within the commercial AI sector. More financial and other incentives for school and college leavers could help to get them onto the undergraduate and postgraduate courses necessary to meet the defence sector’s needs. This could include MoD and industry co-sponsorship.
BAE Systems is currently sponsoring and co-sponsoring nearly 100 PhDs in defence topics, including working closely with the UK’s EPSRC (Engineering and Physical Sciences Research Council) and its ICASE (Industrial Cooperative Awards in Science & Technology) scheme. Several of these are in AI. Greater UK Government support for these industry-Government co-funded schemes would be welcomed.
There are several potential barriers that non-established defence companies face which, if overcome, would significantly increase the AI capacity and expertise in UK Defence. These are:
Provide access to secure computing infrastructure and defence data.
Powerful and affordable computing infrastructure has become commonplace, but suitably secure infrastructure for use in defence has not. One way to alleviate this problem would be the provision of government-approved computing infrastructure, including collaborative cloud working environments, which would allow defence sector data to be securely managed in compliance with applicable requirements and regulations and made accessible to partner companies where appropriate.
Encourage greater development of partnerships between ‘disruptor’ AI companies and existing defence companies to transfer the knowledge necessary to create effective and appropriate defence AI solutions.
Many UK AI companies have arisen from the UK’s STEM talent and the general democratisation of AI. However, the systems and safety engineering necessary to implement the capable, responsible and safe solutions required within the defence environment remains a barrier to them. In order for them to best serve the MoD’s requirements, it will be necessary to facilitate partnerships and the sharing of experience between newer ‘disruptor’ AI companies and the established primes and wider defence supply chain.
Industry leadership like this, in conjunction with the work of government agencies like the MoD’s Defence AI Centre and the Defence and Security Accelerator (DASA), can provide the necessary coherently managed ‘bridge’ and pull-through of state-of-the-art AI technologies from the UK supply base. One such example of successfully working in partnership in this way is the match-funded project that BAE Systems is working on in conjunction with DASA to develop a next generation portable simulator to train pilots. Building on DASA’s existing Project iDAS, it also involves working closely with VRAI, a UK-based SME.
In addition, proactive clarification of the ownership and use of intellectual property will be necessary, ensuring a fair and equitable solution is available to all parties.
Provide greater support to SMEs and companies outside the defence sector to enable them to meet the enhanced security and regulatory requirements that apply to defence programmes, particularly those at higher security classification levels, or provide alternative approaches and solutions to allow for their involvement without the need to meet these requirements.
Stringent security and other regulatory requirements can make it challenging for SMEs and companies from outside the sector to implement and maintain the controls and processes required for compliance – including security clearances for staff, and appropriately secure and compliant physical, data and IT infrastructure and platforms. Striking the right balance between security, access to talent and the availability of appropriate IT environments, so as to achieve the necessary pace and scale of AI development, is a challenge that will require further attention, as will clarification of security processes and policy.
One further option could be a strategy which permits SMEs and companies outside of defence to contribute to defence in a manner which avoids triggering the more stringent security requirements. For instance, AI algorithms could be developed using sufficiently representative, rather than real (i.e. classified), data to prove a principle. The follow-on secure development with real defence data could then be conducted through suitably equipped partner organisations.
18th January 2024