Driverless car safety could stop the technology “turning the corner”
It’s believed that around 90% of car crashes are the result of driver error. That’s a grim statistic and one that adds fuel to the view that driverless cars are the logical solution.
The UK Government has earmarked £8.1 million to support the development of self-driving vehicles – also known as connected and autonomous vehicles (CAVs). They are already a common sight on roads around Silicon Valley, and the race is on to manufacture the first commercially viable model.
If you face a sizeable fleet insurance bill each year, the idea of fewer accidents and lower premiums must sound tempting – not least because the salary bill for delivery drivers and other fleet vehicle operators could be slashed.
However, safety issues could still hold development back. Recent headlines about a self-driving bus in Las Vegas that “crashed” within two hours of its launch put the whole issue under the spotlight. (source: https://www.theguardian.com/technology/2017/nov/09/self-driving-bus-crashes-two-hours-after-las-vegas-launch-truck-autonomous-vehicle)
Safety features of self-drive vehicles
Autonomous vehicles can seem a straightforward proposition when you consider how much of driving is already automated, from navigating around traffic jams to parallel parking.
The theory is that a well-programmed self-driving car will assimilate new data, just as a learner driver would grow in experience and confidence.
How safe is safe?
Critics of autonomous vehicles argue that the two are very different: a learner driver gains experience under supervision, whereas a driverless car learning as it moves around puts people at risk. And no amount of testing, they say, will ever equip a driverless car with the same intuition and insight a human driver relies on.
An autonomous vehicle could travel a route many times and gather data on its danger points. But human drivers know that danger comes in many forms: a pedestrian or cyclist behaving erratically, or a driver who looks likely to cut across you in the most inconsiderate way imaginable.
Can data really replicate human instincts, hardwired over the course of evolution?
There is also the public’s own apprehension to weigh against progress in this field of technology. It will only take a few more crashes – or, worse still, a loss of life – for self-driving vehicles to be vilified as a public danger.
Should the industry delay, then, and perfect the self-driving car before letting it loose on the roads? And if so, how many lives would that delay cost?