
Why AI Hasn’t Solved Car Crashes Yet

Ten years ago, many futurists predicted that AI would all but eliminate car crashes by 2020, or perhaps 2025 at the latest. Yet here we are, and not much has changed. So what’s going on, and why is progress so disappointingly slow?

Fortunately, this post has some answers. We look at why the technology isn’t quite there yet, and what’s being done about it, if anything. 

Corner Cases

The primary problem with today’s AI is that it isn’t 100% accurate. Many systems are approximate, which is fine for writing blog posts, but it isn’t good enough when people’s lives are on the line.

Engineers call these rare, unusual situations “corner cases” (or “edge cases”). The idea is that existing systems can handle the vast bulk of what happens on the road, but there will always be odd situations that autonomous vehicles can’t react to properly. This is why drivers still have to sit with their hands hovering over the wheel, ready to grab it and take over whenever the vehicle gets confused.
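
To make the idea concrete, here is a minimal sketch of that fallback logic in Python. It is purely illustrative: the function name, the confidence score, and the 0.95 threshold are assumptions for the example, not any manufacturer’s actual code.

```python
# Illustrative sketch of the "corner case" fallback described above:
# the system handles routine scenarios itself and asks the human
# driver to take over when its confidence in the scene drops.

CONFIDENCE_THRESHOLD = 0.95  # assumed value, for illustration only


def plan_next_action(scene_confidence: float, planned_action: str) -> str:
    """Return the planned action, or escalate to the human driver."""
    if scene_confidence >= CONFIDENCE_THRESHOLD:
        return planned_action           # routine case: AI keeps control
    return "REQUEST_HUMAN_TAKEOVER"     # corner case: hand control back


# Example: an unusual, partially obscured obstacle lowers confidence.
print(plan_next_action(0.99, "continue_lane_keeping"))  # continue_lane_keeping
print(plan_next_action(0.62, "continue_lane_keeping"))  # REQUEST_HUMAN_TAKEOVER
```

The catch, of course, is the last line: the whole scheme only works if a human is paying attention and ready to respond.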

Unfortunately, the problem isn’t just limited training data. Engineers are now wondering whether the entire architecture needs rethinking. It may not be possible to rely on traditional machine learning to give cars common sense on the road. They may need what many researchers are calling “world models”: systems that build an internal, human-like understanding of the physical environment.

Sensor Limitations

At the same time, many autonomous car developers are dealing with sensor limitations. LIDAR, radar, and cameras are adequate, but they don’t always work well in rain, fog, or heavy glare. They are also less able than people to make inferences about their surroundings, which is why many vehicles now carry multiple overlapping sensors, pushing up both the cost and the processing demands.
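
The sketch below shows, in simplified form, why combining sensors adds processing work. Each modality gets a reliability weight that depends on conditions, and the readings are fused into one estimate. The weights, function name, and numbers are illustrative assumptions, not real autonomous-vehicle code.

```python
# Hypothetical sensor-fusion sketch: weight each sensor's reading by
# how well that sensor tends to cope with the current conditions,
# then take a weighted average. All weights are made-up examples.

def fuse_distance_estimates(readings: dict, conditions: set) -> float:
    """Fuse per-sensor distance readings (in metres) into one estimate."""
    weights = {"lidar": 1.0, "radar": 1.0, "camera": 1.0}

    # Degrade the sensors that struggle in the current conditions.
    if "rain" in conditions or "fog" in conditions:
        weights["lidar"] *= 0.4
        weights["camera"] *= 0.5
    if "glare" in conditions:
        weights["camera"] *= 0.3

    total = sum(weights[s] for s in readings)
    return sum(readings[s] * weights[s] for s in readings) / total


# Example: in fog, the radar reading counts for more than LIDAR or camera.
print(fuse_distance_estimates(
    {"lidar": 41.0, "radar": 45.0, "camera": 38.0},
    {"fog"},
))
```

Every extra sensor and condition adds more of these checks, which is one reason the hardware and compute bills keep climbing.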

These issues have already led several drivers to ask for help from a car accident lawyer after systems made mistakes. Companies are often on the hook in these cases, which is why they are rolling the technology out so conservatively.

Human-AI Interaction Challenges

Another reason AI hasn’t solved car crashes comes down to confusion. Many drivers simply don’t understand how to use driver-assistance features properly, so they never get the full benefit of them.

One issue is reaction time: drivers may not be able to take the wheel fast enough when the onboard AI gets confused and doesn’t know what to do. Another is complacency: drivers can become so used to relying on the AI that they fail to take over at all, even when prompted.

Regulatory Hurdles

Finally, regulation is slowing AI’s progress on car crashes. Many companies are discovering that it is hard even to set up trials or get vehicles approved for the road.

One of the sticking points is what an AI system should do when it has to make a moral choice. For example, regulators aren’t sure whether a vehicle should be allowed to steer away from one group of people at the cost of endangering another.