Developing driverless cars has been AI’s greatest test. Today we can say it has failed miserably, despite the expenditure of tens of billions of dollars in attempts to produce a viable commercial vehicle. Moreover, the recent withdrawal from the market of a leading provider of robotaxis in the US, coupled with the introduction of strict legislation in the UK, suggests that the developers’ hopes of monetising the concept are even more remote than before. The very future of the idea hangs in the balance.
The attempt to produce a driverless car started in the mid-00s with a challenge by a US defence research agency, which offered a $1m prize to whoever could create a vehicle capable of making a very limited journey in the desert. This quickly turned into a race between various tech firms and car companies (OEMs, as they are now known – original equipment manufacturers) to produce what they thought would be the ultimate cash cow: a car that could operate in all conditions without a driver.
Right from the start, the hype far outpaced the technological advances. In 2010, at the Shanghai Expo, General Motors produced a video showing a driverless car taking a pregnant woman to hospital at breakneck speed and, as the commentary assured viewers, safely. It was precisely the promise of greater safety, cutting the terrible worldwide annual road death toll of 1.25 million, that the sponsors of driverless vehicles dangled in front of the public.
And that is now proving their undoing. First to go was Uber, after an accident in which one of its self-driving cars killed Elaine Herzberg in Tempe, Arizona. The car was in autonomous mode, and its “operator” was accused of watching a TV show, meaning they did not notice when the car hit Herzberg, who had stepped on to the highway pushing a bike with bags hanging from its handlebars. Fatally, the computer could not interpret this unfamiliar array of objects.
Until then, Uber’s business model had been predicated on the idea that within a few years it would dispense with drivers and provide a fleet of robotaxis. That plan died with Herzberg, and Uber soon pulled out of all its driverless taxi trials.
Now Cruise, the company bought by General Motors to spearhead its development of autonomous vehicles, is retreating almost as rapidly. The trigger was also an accident, one that by chance proved not to be fatal but caused serious injuries. In October, a woman crossing a road in San Francisco was hit by a human-driven car and knocked into the path of a Cruise robotaxi. The robotaxi ran over the pedestrian and then, having been programmed to pull over to the right when confronted with an unknown situation, dragged her along the road. She survived but will clearly be in line for massive compensation.
Since then, Cruise has been in full damage-limitation mode. After initially holding back details of what happened, it withdrew its robotaxis from all US cities and its CEO quit. It was then revealed that the vehicles were not even truly driverless: remote operators had been intervening roughly every four or five miles. There are now mass redundancies, and the future of the programme is uncertain.
Tesla is also in defensive mode. It has long marketed its driver-assistance software as “full self-driving”, but it is nothing of the sort. Drivers must stay alert and ready to take over, even though the car can drive itself much of the time, particularly on motorways. In the US, where there have been numerous accidents involving Teslas in “full self-driving” mode, the manufacturer is facing several lawsuits.
In the UK, Tesla will fall foul of the legislation introduced into parliament last month, which prevents companies from misleading the public about the capability of their vehicles. Manufacturers will be forced to specify precisely which functions of the car – steering, braking, acceleration – have been automated, and Tesla will have to change its marketing approach in order to comply. Its troubles have been compounded by revelations from the ex-employee Lukasz Krupski, who claims the self-drive capabilities of Teslas pose a risk to the public. So, while the bill has been promoted as enabling the more rapid introduction of driverless cars, meeting its restrictive terms may prove an insuperable obstacle for their developers.
These events highlight the technological difficulties standing in the way of driverlessness, as well as the fragility of the case for developing autonomous cars. Every prediction that the technology would come to fruition within three or four years has proved overoptimistic. Ministers have been sucked in, too: in 2017 Chris Grayling said that self-driving cars would be on the roads by 2021.
The tech companies have constantly underestimated the sheer difficulty of matching, let alone bettering, human driving skills, and this is where the technology has failed to deliver. Artificial intelligence is a fancy name for the much less sexy-sounding “machine learning”, and involves “teaching” the computer to interpret what is happening in the very complex road environment. The trouble is that there is an enormous number of edge cases, ranging from the much-used example of a camel wandering down Main Street to a rock in the road that may turn out to be nothing more than a paper bag. Humans are exceptionally good at instantly assessing these risks, but if a computer has not been told about camels it will not know how to respond. It was the plastic bags hanging on Herzberg’s bike that confused the car’s computer for a fatal six seconds, according to the subsequent investigation.
That is why it is clearly a misplaced priority on the part of the government, headed by tech bro Rishi Sunak, to put forward a bill on autonomous vehicles while sidelining plans to reform the railways or legislate for electric scooters, which are in a legal no man’s land. The future may well not be driverless cars, and meanwhile there is a transport system in desperate need of attention. If this is the best that AI can do, maybe fears about its capabilities and its ability to put humans out of work are misplaced. Certainly, Sunak’s chauffeur can feel secure for now.