DAIC0001
Written evidence submitted by Alexander Gegov
I am Associate Professor in Computational Intelligence at the University of Portsmouth. My area of expertise is Artificial Intelligence for Defence and Security.
I have presented invited lectures in this area at NATO Allied Command Transformation, led collaborative projects with the NATO Strategic Communications Centre of Excellence, and I am a member of a Research Task Group within the NATO Science and Technology Organisation.
My most recent research is on the use of Explainable AI (XAI) for Decision Support in Defence and Security.
You can find details about some of my recent and current research activities on my professional website here:
https://www.port.ac.uk/about-us/structure-and-governance/our-people/our-staff/alexander-gegov
You can find details about some of my recently published relevant research papers here:
https://ieeexplore.ieee.org/document/10195148
https://ieeexplore.ieee.org/document/10195143
I would like to submit evidence on the following point: How clearly has the Ministry of Defence (MoD) set out its priorities for the kind of AI capacity and expertise it believes the UK defence sector should have, what priorities has it identified, and are these priorities deliverable?
My evidence is based mainly on the MoD AI Strategy published in June 2022. As a whole, the strategy sets out these priorities clearly. The priorities are right and they appear deliverable. They are based on the vision that, in terms of AI, the MoD will be the most effective, efficient, trusted and influential organisation for its size.
The MoD AI Strategy is well aligned with the NATO AI Strategy, whose main focus is the responsible use and development of AI. However, the MoD AI Strategy does not seem to provide sufficient clarity on how this focus can be achieved, and in particular on how it could be facilitated by XAI in the context of AI Regulation.
To improve clarity on the responsible use and development of AI, the MoD could consider the documents linked below. The first is a Programme Report on XAI by the Defense Advanced Research Projects Agency (DARPA). The second is the National Artificial Intelligence Research and Development Strategic Plan of the White House. The third is a Policy Briefing on XAI by the Royal Society. The fourth is an overview of the Political Agreement on the EU AI Act.
https://nsarchive.gwu.edu/sites/default/files/documents/5794867/National-Security-Archive-David-Gunning-DARPA.pdf
https://www.whitehouse.gov/wp-content/uploads/2023/05/National-Artificial-Intelligence-Research-and-Development-Strategic-Plan-2023-Update.pdf
https://ec.europa.eu/futurium/en/system/files/ged/ai-and-interpretability-policy-briefing_creative_commons.pdf
https://assets.ey.com/content/dam/ey-sites/ey-com/en_gl/topics/ai/ey-eu-ai-act-political-agreement-overview-10-december-2023.pdf
The above documents could be useful if more detail on XAI in the context of AI Regulation is required for a future revision of the MoD AI Strategy. An additional useful resource in this respect could be the XAI Panel at the next IEEE World Congress on Computational Intelligence. More details about the topics to be discussed by this panel can be found here:
https://2024.ieeewcci.org/program/panels
Finally, please feel free to contact me if you think that my research expertise could be of any further help to you, or if you have any questions about this evidence.
Kind regards,
Alexander Gegov, BSc, MSc, PhD, DSc
Associate Professor in Computational Intelligence
University of Portsmouth
School of Computing
14th January 2024