PACTS welcomes the Transport Committee’s Self-driving vehicles inquiry. Our remit is transport safety and our evidence relates to the Committee’s question about safety.
PACTS supports research into connected, autonomous and self-driving vehicles. We take a positive approach to their development and to whatever can be learned from it to improve road safety.
Many questions remain to be answered and many issues resolved before automated vehicles (AVs) can be deployed safely for public use on our roads. We would oppose taking shortcuts which compromised safety in order to accelerate the deployment of this technology. Testing of AVs should be controlled and risk to the public should be as low as reasonably possible. The public should not be used as guinea pigs to test “beta” systems on public roads.
Safety is advertised as one of the major benefits of AVs. However, this has yet to be proven in practice. There is a risk that commercial and other pressures to progress to AVs will force implementation before safety is established. At present, it is hard for anyone to make sensible judgements on AVs, as it is a broad concept with varied use cases and uncertain timescales. Greater clarity and a clearer roadmap are required to properly assess the challenges and benefits of their application.
The safety case for AVs is often based on the premise that 90% of crashes are due to human error and the assumption that AVs will not be prone to making errors. Both of these statements are misleading if not false.
Many advanced driver assistance system (ADAS) features, which provide the vehicle with situational awareness and are the foundations for AV development, are currently available for conventionally driven vehicles. These include advanced emergency braking (AEB), intelligent speed assistance (ISA), automated lane-keeping systems (ALKS), adaptive cruise control, proximity sensors, etc. Many, including ISA and AEB, are now mandatory in vehicles sold in the EU under the revised EU General and Pedestrian Safety Directives (and in Northern Ireland, under the Protocol). They do not apply in GB, however, as the UK government has not updated its standards.
The government should raise UK type-approval standards, implementing these driver support features available today in the market for some conventional vehicles, thereby making all vehicles safer, including commercial vehicles. This will allow the industry and regulators to assess the validity and performance of each of these technologies individually before allowing cars, vans and other vehicle types to become dependent on them. These sensors and technologies available today are beneficial in aiding drivers, but decision-making based on the information from these systems should be left to humans rather than machines – at least until AVs are fully developed and mature. AVs may reach the point where they are as good at driving as a careful and competent human driver, but they are not there yet.
New vehicles may be fitted with self-driving systems for certain tasks and domains, where the driver is required to, or chooses to, take back control at other times. This mix of automated and conventional driving poses a range of risks.
While the individual is still recognised as being in the ‘driving’ role, and until the automated driving system in use can achieve a minimal risk condition (e.g. move to a safe speed or stopping location), they should not be permitted, when an automated driving system is engaged, to undertake any secondary activities beyond the minor tasks currently permitted.
Where the individual is recognised as the user-in-charge and the vehicle is classed as driving itself, secondary activities, or ‘non-driving related activities’ as the Law Commissions term them, may be permitted. These permitted secondary activities will need to be carefully defined by the regulator, based on safety criteria. Further research is needed into what these activities could be. We recommend that, at this stage, non-driving related activities should be limited to the use of in-car infotainment/navigation systems. The scope may change with technology, experience and learning over time.
Until vehicles provide full automation, the design and function of the Human Machine Interface (HMI) will be crucial for ensuring that the “human driver” and the vehicle-based automated systems collaborate in a safe manner. The role of the driver cannot be eliminated; rather, there has to be a high degree of collaboration between the machine and the driver. Greater clarity is required on the role and responsibility of the driver present in the car. It is also important to ensure that adequate training on the use of AVs is provided to the driver before AVs are allowed for public use on roads.
There are many human factors related to the safe operation of AVs.
Many semi-automated or fully automated technologies will rely on road infrastructure being readable for their application. The importance of infrastructure performance (visibility, state of repair, etc.) of traffic signs, signals and road markings in supporting higher levels of safe and reliable automated driving has to be recognised. This will involve new standards, harmonisation and investment. The current road infrastructure has been progressively built or modified to avoid crashes and to reduce injury – self-explaining and forgiving roads. This should continue. Furthermore, digital and technological infrastructure, and regulation related to network connectivity and data storage and access, need to be built in order to ensure the safe and effective operation of connected and automated vehicles.
In its recent document Connected & Automated Mobility 2025: Realising the benefits of self-driving vehicles in the UK, the Government suggests the level of safety that should be required of AVs:
“This government believes that self-driving vehicles should be held to the equivalent standard of behaviour as that expected of human drivers: competent and careful. This ambition will capitalise on the huge safety potential of self-driving vehicles, ensuring improved safety on our roads and thereby supporting public trust and acceptance.”
In our view:
The level of road casualties in the UK and other countries is unacceptably high. Much more should be done to reduce this enormous human and economic cost. Despite the high level of collisions and casualties, humans are very good at complex tasks, which are not easy for a machine to match.
Broadly speaking, PACTS’ view is that AVs should be held to a standard at least equivalent to that of a competent and careful driver, and probably higher. This standard should be ratcheted up with time, experience and technical development.
In our judgement, AVs will need to meet higher safety standards than a careful and competent driver to gain and retain public trust. Some of the public mistrust AVs, with some justification. This may be overcome as the development of AVs progresses, but safety failures, even if isolated and atypical, may set back public acceptance. Once lost, trust can be hard to regain, regardless of technical assurances and statistical evidence. The fuss over smart motorways is an example.
If the “competent and careful driver” standard is accepted, how should it be defined?
For every billion vehicle miles travelled on GB roads in 2021, there were 425 reported casualties. One might assume that careful and competent drivers were responsible for, at best, none and, at worst, 10% of these casualties. This equates to a maximum of 43 casualties per billion vehicle miles that could be attributed to careful and competent drivers. We suggest this as a point for debate.
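The arithmetic behind this suggested benchmark can be sketched as follows (a minimal illustration only, using the 425 casualties per billion vehicle miles figure and the 10% upper-bound assumption stated above):

```python
import math

# Figures from the text above (GB roads, 2021).
reported_casualties_per_bn_vehicle_miles = 425

# Assumed at-worst share of casualties attributable to
# careful and competent drivers.
upper_bound_share = 0.10

# Maximum casualties per billion vehicle miles attributable to
# careful and competent drivers, rounded up to a whole casualty.
benchmark = math.ceil(
    reported_casualties_per_bn_vehicle_miles * upper_bound_share
)
print(benchmark)  # 43
```

The lower bound of the suggested range is zero (careful and competent drivers responsible for no casualties); the 10% figure is offered as a debating point rather than an established value.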
It is also very important that particular road user groups, such as pedestrians, cyclists, motorcyclists and equestrians, are not disadvantaged by self-driving vehicles. It would be inequitable and unacceptable for self-driving systems to be authorised which failed to also improve safety for vulnerable road users.
AVs may offer substantial safety benefits in the future. However, deployment should not be hurried by compromising safety, and the industry should be required to ensure that its products are safe and reliable before they enter the market. It is more likely that self-driving systems will progressively be fitted to new vehicles – rather than some “big bang” of automation. The government should address the risks and exploit the safety opportunities of this mixed fleet.
 There are differences between self-driving, automated and autonomous vehicles. The abbreviation “AV” is widely used. We use it here for simplicity.
 David Zipper, “The Deadly Myth That Human Error Causes Most Car Crashes”, The Atlantic, 2021.
 US Department of Transportation, NHTSA, Summary Report: Standing General Order on Crash Reporting for Automated Driving Systems, June 2022.
 PACTS, Vaccine for Vehicles: Preventing deaths and injuries on UK roads, briefing, January 2022.
 HM Government, Connected & Automated Mobility 2025: Realising the benefits of self-driving vehicles in the UK, CP 719 August 2022