So, there you are, sitting at the wheel, daydreaming with a book on your lap, as the car drives itself along the M6 towards your meeting. Isn’t technology just wonderful? All of a sudden, a warning chime starts, red lights flash across the dash, the steering wheel whirls around and a disembodied voice tells you it’s going to hand back control in 10, no eight seconds, no six…
Yikes. Stuff is happening out there, but what? Faster than you can check the mirrors, the dashboard lights or the side views, you’re back in charge, hands clawed on the wheel while the autonomous driving algorithm skulks in its black box. That jack-knifing truck in front is now your problem and, if the latest report from the Law Commission becomes enshrined in UK law, so too is the potential insurance bill…
The Scalextric Complex
Don’t go thinking this is just a bad dream, either. Autonomous cars might not be the tech du jour they were a few years ago, but we’re still fixated on them. From the mechanical automatons of Greek mythology and ancient China, to da Vinci’s 15th-century robot, all the way to R2-D2 and C-3PO from the Star Wars films, mankind has thrilled at the idea of an inner-directed machine that makes its own decisions – but as the Chinese proverb goes: ‘be careful what you wish for’.
As far as the tech industry is concerned, there’s a big financial prize at stake in being the first with a self-driving car; just look at the way Tesla has dominated the electric-car industry while not actually making any money out of it. At times, however, it’s hard not to see the hyperbole as a cross between God-playing fantasy and a Scalextric complex.
And this week the autonomous driving hype began all over again, with the Law Commission report, which has been greeted by some as clearing up ‘grey areas’ for insurers. In fact, it has probably raised more questions than it has answered.
“It’s a good starting point,” says Benjamin Hatton, insurance analyst at data specialists GlobalData, “but such is the nature of this technology that we’re five to ten years off self-driving becoming an issue.”
Nevertheless, autonomous cars are on our roads right now. They’ve been tested for years on US and Japanese roads, and while Tesla’s Autopilot is not a self-driving system, many owners have mistakenly used it as such. Last year the UK government announced its intention to allow limited self-driving on UK motorways, with planned changes to The Highway Code to accommodate the correct use of such systems.
Even modern cars have a limited form of self-driving (SAE Level 2), which allows the vehicle to maintain station in a lane, brake, accelerate and even restart from a standstill. And as the Law Commission report acknowledges, such new-car features will “develop to a point where a vehicle will be able to drive itself, without a human paying attention to the road… [which will have] profound legal consequences.”
Way to go! We’ve been writing about this for at least a decade and the UK Law Commission has only just figured this out. One’s reminded of that sketch from the Eighties TV show Not The Nine O’Clock News, where Rowan Atkinson, playing a high-court judge, asks what a digital watch and a video recorder are, before displaying a remarkable knowledge of a range of blow-up dolls…
Who is in charge of a driverless vehicle?
The report recommends establishing clear lines of responsibility for a self-driving vehicle: when the driver is merely a “user in charge”, they would enjoy a wide range of immunity from driving offences, with legal responsibility falling on the car’s makers and its parts suppliers.
It draws heavily on the J3016:2021 standard established by the Society of Automotive Engineers (SAE), where Level 0 means no driving automation; 1 is some driver assistance; 2 is partial automation; 3 is conditional automation; 4 is a high level of driving automation and 5 is full automation.
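For readers who want the taxonomy at a glance, the six levels can be sketched as a simple lookup. This is a hypothetical illustration of the report’s summary of the standard, not code from SAE or the Law Commission:

```python
# SAE J3016 driving-automation levels, as summarised in the report.
SAE_LEVELS = {
    0: "no driving automation",
    1: "driver assistance",
    2: "partial automation",
    3: "conditional automation",
    4: "high driving automation",
    5: "full automation",
}

def driver_must_monitor(level: int) -> bool:
    """At Levels 0-2 the human must monitor the road at all times;
    from Level 3 upwards the system drives, at least in some conditions."""
    return level <= 2
```

The legally fraught territory in the report sits at the boundary of that function: Level 3, where the car drives until, suddenly, it doesn’t.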
While the report acknowledges that some vehicles will be completely autonomous, with occupants who are simply passengers (called ‘no user in charge’, or NUIC), it doesn’t really deal with the grey area of handing control of the vehicle back to the driver, the scenario raised at the start of this article. Just who is in charge during this period?
Ethical questions such as the Trolley Problem have tended to dominate the self-driving debate. This is where the car’s self-driving algorithms have to choose between two very bad outcomes, often typified by the choice between hitting a crowded bus stop or a mother with a pram. While which-one-is-the-car-going-to-kill thought experiments are intriguing for ethicists (and journalists), such situations are unbelievably rare. Hand back, however, is in the here and now.
How long to hand back control of an autonomous car?
Some car makers think hand back to the driver can be safely accomplished within 10 seconds; others, such as Volvo, have been quoted as saying that it could take as long as two minutes for the driver to fully take back control from the car’s systems. Not everyone thinks it will take that long, but even the most optimistic acknowledge it as an issue.
“I think that 10 sec is quite a long time,” says Dr Gill Pratt, chief executive of the Toyota Research Institute, pointing out that on a motorway you’ll have travelled a long way, while in a city a lot can happen in a very short period. Note here that at 70mph a car will cover over 300 metres in 10 seconds.
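The arithmetic behind that 300-metre figure is easy to check. A quick sketch, assuming the car holds a constant speed through the hand-back window:

```python
def handback_distance(speed_mph: float, seconds: float) -> float:
    """Distance in metres covered during the hand-back period,
    assuming constant speed throughout."""
    metres_per_second = speed_mph * 1609.344 / 3600  # convert mph to m/s
    return metres_per_second * seconds

# At 70mph, a 10-second hand-back covers roughly 313 metres;
# Volvo's two-minute estimate would cover nearly 3.8km of motorway.
```

On that basis, a driver emerging from a book or a doze has the length of three football pitches, at best, in which to work out what the jack-knifing truck is doing.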
Perhaps the bigger question is just why the car is handing back control. If it’s because it has encountered something it’s never seen before, then why is that? Has the decision-making software been properly developed? As Professor Pim van der Jagt, then MD of Ford’s Aachen centre but now technical director of AB Dynamics, once pointed out: “The average driver will travel half a million miles between accidents and as long as he or she is fully focussed they are probably safer than a system, which in certain conditions, might try to take over driving and make the wrong decision.”
Gill Pratt also acknowledged the human problem, saying that predicting human behaviour is the biggest issue for software developers. “If you have a situation where there’s mixed traffic between automated vehicles and human-controlled vehicles,” he said, “or pedestrians or bicyclists, how do you predict what people are going to do?”
There might also be an issue with the lines of responsibility here. If car makers and tier-one software developers know that they will be liable in such evolving situations and accidents, then the tendency will be to hand back control and therefore liability as early as possible.
And will the driver be ready for that? While the report states that when the car is driving itself, the ‘person in charge’ should not sleep or (heaven forfend) use a mobile telephone, it does acknowledge that monitoring the car’s progress as a bystander is a lot more difficult.
That’s been borne out by experience on the test track. Five years ago, Ford’s product development chief, Raj Nair, admitted to Motor Trend magazine that his engineers struggled to maintain ‘situational awareness’ when driving in autonomous vehicles – in other words, they fell asleep.
So, what exactly can you do when your car is driving itself? Concept cars such as Mercedes-Benz’s 2015 F015 – Luxury in Motion had swivelling front seats, so the driver might not even be looking at the road ahead. The implication is that the car’s occupants wouldn’t be monitoring anything except each other, or Netflix.
The other matter not covered by the Law Commissioners is the paradox of the four-way junction (or roundabout), where everyone gives way. In the Darpa Grand Challenge, a series of autonomy trials held by the US Defense Advanced Research Projects Agency in the early part of this century, self-driving vehicles would stop for 10 or 20 minutes, simply making up their electronic minds on what to do. The Darpa contest is now considered a bellwether of self-driving technology, with many of the vehicle-manufacturer and university teams hired by Silicon Valley giants to further develop the systems.
What became clear from it is that, without a bit of assertive goal-orientated behaviour written into their algorithms, self-driving cars will defer to each other to the point where they won’t go anywhere – they need to be a bit pushy. Moreover, pedestrians, cyclists and other road users might learn that they can exploit the high-level safety priorities embodied in the software, and step or pull out in front of autonomous cars (relatively) safe in the knowledge that they will stop.
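The deadlock is easy to reproduce in a toy model. The sketch below is purely illustrative – the two policies and the arrival-order tie-break are assumptions for the sake of the example, not anyone’s production logic. Cars that defer whenever another car is waiting never move; a simple tie-break clears the junction one car per round:

```python
def clear_junction(n_cars: int, assertive: bool, max_rounds: int = 100):
    """Simulate n_cars waiting at a four-way stop.
    Polite policy: a car proceeds only when no other car is waiting.
    Assertive policy: ties are broken by arrival order, one car per round.
    Returns the number of rounds needed, or None if the cars deadlock."""
    waiting = list(range(n_cars))  # cars queued in arrival order
    for round_no in range(1, max_rounds + 1):
        if assertive:
            waiting.pop(0)       # arrival-order tie-break: first car goes
        elif len(waiting) == 1:
            waiting.pop()        # alone at the junction, safe to proceed
        # otherwise every polite car defers and nothing moves
        if not waiting:
            return round_no
    return None                  # mutual deference: nobody ever went
```

Four polite cars sit there forever; four assertive ones are through in four rounds. The “pushiness” the Darpa teams discovered is, in effect, just a tie-breaking rule.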
GlobalData’s Benjamin Hatton agrees that the Law Commission report (if adopted in UK law), “could slow the speed at which these things are released because of the risk – it could hinder roll out, but where we are going, there’s a long way still to go.”
Question is, do we want to go there?
Self-driving has been pushed hard as a way of reducing the horrendous death toll on the world’s roads, which in 2013, according to the World Health Organisation, numbered over 1.25 million. But those deaths and injuries are disproportionately concentrated in developing nations (Africa, India and China accounted for over three quarters of a million), where the roads aren’t always as well maintained, lane and road-edge markings aren’t as well defined as most self-driving systems require, and observance and enforcement of road rules aren’t as stringent as in more developed nations. Self-driving as we understand it is unlikely to work in a place like India at the moment; imagine what would happen if a cow sauntered into the path of an autonomous car.
In fact, before he retired, Kiyotaka Ise, the former R&D head at Toyota, told me that for autonomy to fully work, all vehicles would have to rigidly obey the speed limits, and motorcycles, bicycles and horses might have to be banned from roads. So there might well be freedoms that the driverless car takes from us to enable it to work. To maximise their utility, autonomous cars might require special lanes or road space, and it’s here that the thinking gets hazy. If they confer no marginal environmental benefit, are largely owned by the wealthy middle classes and aren’t conferring positive, Benthamite benefits on the general population, then why should we give the Google car its own darn road?
Moreover, are the Law Commission proposals going to be those adopted by the rest of the world? Autonomy is a global industry and many different countries have their own take on what the priorities should be. If the UK is an outlier in terms of standards, then will the industry bother to write the special code and algorithms just for our roads?
Beyond vapid technology utopians, self-driving still has a lot of dichotomies, contradictions and questions. There’s the cost of all the extra cameras, radar sensors and software, which according to one estimate could add up to €20,000 (nearly £17,000) to the price of each car. The computing requirements could involve the processing of up to 4,000 gigabytes of data a day, according to one source. And it all has to be safe, not just in the way it drives, but also from malicious hacking. How valuable would it be, for example, for a terrorist group to take control of a fleet of trucks from anywhere in the world?
Yet the technological bandwagon rolls on, fuelled by hype and hope. Did we just witness another watershed passed last week? Who knows?