As autonomous vehicles (AVs) edge closer to becoming the norm in global markets, public trust remains an obstacle. Projections suggest that by 2035, 40% of new cars in Europe will be equipped with self-driving features. Nonetheless, many surveys show that a sizable portion of the population remains skeptical. A 2019 survey indicated that 64% of Americans wouldn’t buy an AV, and 67% believe AVs should meet stricter safety standards than traditional vehicles. For these vehicles to gain wider acceptance, technological advancements must be paired with proven safety measures.
Autonomous Vehicle Safety vs. Security: Understanding the Difference
Safety and security are too often used interchangeably.
In the AV world, however, there is an important distinction that everyone should be aware of in order to follow the discussion:
- Safety concerns the harm a vehicle can cause to a human. It refers to the vehicle’s ability to operate without causing harm or accidents, ensuring it responds appropriately to its environment and unforeseen events.
- Security concerns the harm a human can cause to a vehicle. It focuses on protecting the vehicle from external threats and malicious attacks, such as hacking or unauthorized access.
Both are essential for the successful integration and acceptance of AVs. While a vehicle might be safe in terms of avoiding collisions, without robust security measures it could be vulnerable to external interference that compromises its safe operation.
In this article, we will focus on safety. Upcoming articles will focus solely on security.
Understanding How Self-Driving Cars Function in 2023
Autonomous vehicles rely on hardware and software for their self-driving capabilities:
Hardware with Sensors, Lidar, and Cameras
- Sensors: These devices continually measure various parameters like distance and speed of surrounding objects. They’re akin to the vehicle’s “sense of touch,” providing real-time feedback on its environment.
- Lidar (Light Detection and Ranging): Lidar systems emit laser beams to map out the surroundings. By calculating how long it takes for each beam to return after bouncing off an object, it creates a detailed 3D map of the environment. This “vision” allows the vehicle to understand its position relative to other objects.
- Cameras: Positioned at strategic locations around the vehicle, cameras capture visual data, helping in tasks such as lane detection, reading traffic signs, and recognizing pedestrians.
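To make the Lidar bullet concrete, the time-of-flight calculation reduces to distance = speed of light × round-trip time ÷ 2 (halved because the pulse travels out and back). Here is a minimal sketch; the function name and example timing are illustrative, not taken from any real lidar stack:

```python
# Time-of-flight distance estimate for a single lidar return.
# Illustrative sketch only; real lidar pipelines must also handle noise,
# multiple returns per pulse, and beam geometry.

SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def lidar_distance_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting object: the laser pulse travels
    out and back, so we halve the total path length."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A return arriving after 200 nanoseconds corresponds to roughly 30 m.
print(round(lidar_distance_m(200e-9), 1))  # 30.0
```

Repeating this for millions of pulses per second across a rotating emitter is what builds the 3D point cloud described above.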
Software with Algorithms and Machine Learning
Autonomous vehicles don’t just rely on hardware; they’re powered by sophisticated software:
- Algorithms: These are a set of rules or procedures the vehicle follows when making decisions. For instance, an algorithm might dictate how the vehicle should behave when it encounters a pedestrian crossing the road or how to maneuver through a roundabout.
- Machine Learning: At the heart of these vehicles lies machine learning, a subset of artificial intelligence. Instead of being explicitly programmed for every scenario, these vehicles “learn” from vast datasets. Every time an autonomous vehicle navigates a situation, it collects data. Over time, by processing and analyzing this data, the vehicle refines its decision-making, ensuring safer and more efficient driving.
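To illustrate what an explicit, hand-written rule looks like (as opposed to learned behavior), here is a deliberately toy decision function of the kind the “Algorithms” bullet describes. Everything in it, from the function name to the thresholds and the simplified two-second-gap rule, is hypothetical and vastly simpler than any production system:

```python
# Toy decision rule, purely illustrative: a hypothetical function
# showing how an explicit algorithm might map a perceived obstacle
# to a driving action.

def decide_action(obstacle_type: str, distance_m: float, speed_kmh: float) -> str:
    """Return a driving action for a single detected obstacle."""
    # Hard rule: always yield to pedestrians in the vehicle's path.
    if obstacle_type == "pedestrian" and distance_m < 30:
        return "brake"
    # Keep a larger gap at higher speeds (simplified two-second rule).
    safe_gap_m = (speed_kmh / 3.6) * 2
    if distance_m < safe_gap_m:
        return "slow_down"
    return "maintain_speed"

print(decide_action("pedestrian", 20, 50))  # brake
print(decide_action("vehicle", 25, 50))     # slow_down (safe gap ~27.8 m)
print(decide_action("vehicle", 60, 50))     # maintain_speed
```

Machine learning takes over where such hand-written rules run out: instead of enumerating every threshold, the system tunes its behavior from data.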
Self-Driving Car Capabilities in 2023: A Current Snapshot
In 2023, we must distinguish between markets (EU, China, US…) and between use cases (private cars versus commercial fleets).
European Self-Driving in the Private Car Scene
A clear distinction must be made between two levels of automation: assisted driving and self-driving.
Assisted driving features, now prevalent in many vehicles, primarily support the driver. They offer functionalities like braking assistance, parking aid, speed maintenance, lane-centering, and collision alerts. These features are designed to augment the driver’s abilities rather than replace them.
In contrast, 2023 marks a pivotal year for Europe, witnessing the public debut of truly self-driving features in which the car takes full control (and, with it, responsibility). The Automated Lane Keeping System (ALKS) is a prominent example. ALKS allows vehicles to drive autonomously in very specific scenarios, such as highways at speeds up to 60 km/h and under favorable weather conditions.
It’s still crucial for the driver to remain vigilant, stay seated, and refrain from using handheld devices. The driver must be prepared to assume control when alerted by the vehicle.
As we venture further into the future, we can anticipate a broadening of these scenarios where self-driving will be allowed.
European Commercial AV Projects
Europe’s autonomous vehicle ambitions aren’t limited to personal cars. Several commercial initiatives are underway across various sectors. Looking at examples from the SAAM community, we have LOXO in logistics (last-mile delivery), Swiss Transit Lab with Linie 13 “Rhyder” and ULTIMO in public transportation.
It’s crucial to recognize that these projects are in their early stages, operating within strictly predefined environments. Widespread deployment, according to experts, might materialize around 2025 and will still be contained within predefined environments.
International Perspective with a Focus on the US
The US is undeniably a step ahead when it comes to autonomous vehicles. Factors like prolonged testing durations, available funding, and more permissive regulatory frameworks have given it a head start. Two giant players dominate the US autonomous scene: Cruise (General Motors) and Waymo (Google). Both are making significant strides in the robot-taxi or ride-hailing market.
For the remainder of this article, our safety analysis will lean heavily on Cruise and Waymo for two primary reasons:
- Both held public operation licenses as of August 2023, accumulating a vast mileage of autonomous driving that translates into invaluable data.
- Their commitment to transparency ensures that every incident is reported to the Department of Motor Vehicles (DMV), providing a rich dataset for analysis.
Safety Analysis 2023: Insights from Industry Leaders, Cruise & Waymo
An extensive analysis of Waymo and Cruise crash reports in California up to summer 2023 provides fascinating insights. We thank Timothy B. Lee for his excellent work and encourage you to read his full article.
Key Milestones & Reported Incidents for Autonomous Driving in 2023
During this period, both companies reported 102 (39 for Waymo and 63 for Cruise) crashes for nearly 10 million km driven, which translates to one crash every 100,000 km. To put this in context, a typical human driver might take about five years to cover such a distance. What’s noteworthy is that most of these were low-speed collisions with minimal safety risks.
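The arithmetic behind these figures is easy to check. The ~20,000 km per year used for a typical driver is an assumed round figure, included only to reproduce the “about five years” estimate:

```python
# Back-of-the-envelope check of the crash-rate figures quoted above.
waymo_crashes, cruise_crashes = 39, 63
total_crashes = waymo_crashes + cruise_crashes  # 102 reported crashes
total_km = 10_000_000                           # nearly 10 million km driven

km_per_crash = total_km / total_crashes
print(f"{km_per_crash:,.0f} km per crash")      # 98,039 km, i.e. ~one per 100,000 km

# Assuming a typical driver covers ~20,000 km per year (a round figure,
# not from the article), that distance takes roughly five years.
years_for_human = km_per_crash / 20_000
print(f"~{years_for_human:.1f} years of typical driving")
```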
In many cases, the fault lay not with the AV operator but with other road users. Notable driving errors involved minor mishaps like grazing an abandoned shopping cart or slightly nicking a parked car’s bumper.
The most serious incident happened in February, when a Waymo vehicle struck and killed a dog. In an emailed statement, Waymo said that it “reviewed the event from many different perspectives” and concluded there was no way either Waymo’s software or a human driver could have avoided hitting the dog.
Autonomous Vehicles vs. Human Drivers: A Safety Comparison
Humans, on average, clock almost 100 million kilometers between fatal accidents. To affirm the safety of autonomous vehicles with absolute certainty, hundreds of millions of autonomous kilometers would be required. However, Waymo’s performance is increasingly indicating that their technology might already surpass human driving abilities. Their relatively clean slate over millions of miles suggests a promising potential to enhance road safety.
Detailed Crash Report: Waymo’s Performance Analysis
- 17 instances where a stationary Waymo was hit by another vehicle
- 9 cases where another vehicle rear-ended a moving Waymo
- 2 events where Waymo vehicles were sideswiped
- 2 incidents where a Waymo couldn’t brake swiftly after being cut off
- 2 minor collisions with static vehicles
- 7 trivial collisions with non-mobile objects, including shopping carts and potholes
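As a quick sanity check, the categories above do sum to Waymo’s 39 reported crashes, and the first three categories account for the incidents caused by other road users:

```python
# Tally of the Waymo incident categories listed above.
incidents = {
    "stationary Waymo hit by another vehicle": 17,
    "moving Waymo rear-ended": 9,
    "Waymo sideswiped": 2,
    "couldn't brake swiftly after being cut off": 2,
    "minor collisions with static vehicles": 2,
    "trivial collisions with non-mobile objects": 7,
}
total = sum(incidents.values())
print(total)  # 39, matching Waymo's reported crash count

# Incidents clearly caused by other road users (first three categories):
other_party = 17 + 9 + 2
print(other_party)  # 28
```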
Main Takeaways from 2023 Autonomous Vehicle Safety Reports
Other vehicles collided with Waymo on 28 occasions, whereas Waymo vehicles were involved in incidents with other vehicles only four times. Notably, in two of these instances, Waymo reported that its vehicle was unexpectedly cut off.
Out of all the reported incidents, only three to four were classified as “serious” crashes. Impressively, in these serious events, the responsibility did not seem to fall on Waymo.
These figures reflect more than 3 million kilometers of driving. The National Highway Traffic Safety Administration (NHTSA) estimates about 6 million reported car crashes annually in the U.S., which means that, statistically, a severe crash happens once every 800,000 kilometers.
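The “once every 800,000 kilometers” figure implies a total annual driving distance. The ~4.8 trillion km used below is a back-solved round number consistent with the text, not a statistic from the article:

```python
# Implied arithmetic behind the "once every 800,000 km" figure.
# The annual-distance total is an assumed round figure used to
# reproduce the article's rate, not a quoted statistic.
crashes_per_year = 6_000_000     # reported US crashes annually
annual_km_driven = 4.8e12        # assumed total km driven per year

km_per_crash = annual_km_driven / crashes_per_year
print(f"{km_per_crash:,.0f} km per reported crash")  # 800,000
```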
Contrasted with human driving, Waymo’s record suggests its technology might already be at par, if not superior.
Data Reveals: Are Waymo's Autonomous Cars Safer than Human Drivers?
A collaboration between Waymo and Swiss Re provides even more clarity. Analyzing 600,000 insurance claims spanning over 200 billion kilometers of human-driven car data, the study compared this to Waymo’s 5.5 million kilometers of autonomous driving. The results are striking. Waymo’s autonomous vehicles in San Francisco and Phoenix reported:
- A 100% decrease in the frequency of bodily injury claims.
- A 76% decrease in the frequency of property damage claims.
These figures are especially significant when considering the reaction times of autonomous vehicles in unexpected situations. Faster response times mean fewer and less severe accidents.
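For readers wondering how a “decrease in claims frequency” is derived, the computation is a simple relative reduction in claims per unit of distance. The per-million-km rates below are hypothetical placeholders, not figures from the Swiss Re study:

```python
# How a "decrease in claims frequency" is computed. The input rates
# here are hypothetical placeholders, not Swiss Re study figures.

def frequency_decrease(baseline_rate: float, av_rate: float) -> float:
    """Relative reduction in claims per unit distance, as a percentage."""
    return (baseline_rate - av_rate) / baseline_rate * 100

# Hypothetical example: 1.25 property-damage claims per million km for
# human drivers vs 0.30 for the AV fleet gives a 76% reduction.
print(f"{frequency_decrease(1.25, 0.30):.0f}%")  # 76%

# Zero bodily-injury claims over the AV mileage gives a 100% reduction.
print(f"{frequency_decrease(0.80, 0.0):.0f}%")   # 100%
```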
Decrypting the Safety Standards for Autonomous Vehicles: When is it Safe Enough?
As many regulators see it, the path forward is twofold: it contains a “must” part and a “should” part.
The “must” encompasses legally binding criteria an AV has to satisfy, and these are already well defined. To legally drive itself without the need for monitoring by an individual, a vehicle must:
- Comply with relevant road traffic rules
- Avoid collisions which a competent and careful driver could avoid
- Treat other road users with reasonable consideration
- Avoid putting itself in a position where it would be the cause of a collision
- Recognize when it is operating outside of its operational design domain
In contrast, the “should” captures socially accepted behaviors, including not blocking traffic or causing disturbances. This second part is by nature more subjective and more difficult to monitor and regulate.
In our opinion, it’s the extent to which legislators will limit this second part that could potentially hinder innovation. The right balance thus needs to be found quickly, and the public should also have a voice in that decision.
We hope this article sheds light on the overarching goal everyone can agree on: leveraging the safety potential of self-driving cars to drastically reduce road injuries and fatalities.
FAQ on Autonomous Vehicle Safety and Developments
What is the difference between ‘safety’ and ‘security’ for self-driving cars?
In the context of self-driving cars, ‘safety’ relates to the vehicle’s ability to operate without causing accidents, responding correctly to its surroundings. In contrast, ‘security’ refers to the protection of the vehicle from external threats like hacking.
What hardware do self-driving cars rely on?
Self-driving cars utilize various hardware, including sensors, Lidar, and cameras. These devices measure parameters, map out surroundings, and capture visual data, aiding in navigation and decision-making.
How advanced is self-driving in Europe in 2023?
By 2023, Europe sees the emergence of truly self-driving features in cars, especially the Automated Lane Keeping System (ALKS) for specific scenarios. However, these features are still in their early stages, especially in the commercial sectors.
Who are the main autonomous vehicle players in the US?
In the US, Cruise (General Motors) and Waymo (Google) are significant players making strides in the autonomous vehicle scene, particularly in the robot-taxi or ride-hailing market.
How often do autonomous vehicles crash?
An analysis up to summer 2023 showed that for nearly 10 million km driven, both companies reported 102 crashes, translating to one crash every 100,000 km. Most of these were minor, low-speed collisions.
Are Waymo’s autonomous cars safer than human drivers?
Based on Waymo’s driving records and a collaboration study with Swiss Re, the data suggests that Waymo’s technology might already be on par with, or possibly even surpass, human driving abilities in terms of safety.
When can an AV legally drive without human monitoring?
Legally, for an AV to drive without human monitoring, it must comply with road traffic rules, avoid avoidable collisions, treat other road users considerately, recognize its operational boundaries, and not cause disturbances.
Sources
- MIT Technology Review – Robotaxis are here. It’s time to decide what to do about them.
- Timothy B. Lee – Driverless cars may already be safer than human drivers
- Swiss Re & Waymo – Setting the standards for the risk assessment of autonomous vehicles
- BVRLA – The Automated Vehicle Driver Responsibility In Vehicle Education Group (AV-DRIVE), Self-driving vehicle communications toolkit