
Comment by John Mellor

Elon Musk with the Tesla Model S

GOOGLE – arguably the world leader in autonomous driving systems – went on the record publicly as long as 12 months ago to urge caution about introducing semi-autonomous driving systems because they breed dangerous overconfidence in drivers.

The warning was timely as Tesla regroups after the high-profile death of one of its drivers, who was using Autopilot at the time, and fields investigators’ questions about what the car promised and what it delivered. Death, in fact.

Google’s caution is a salutary warning to spruikers of autonomous cars who are suggesting the driverless car will be ready for our highways within the next few years.

The online search giant turned autonomous car-maker, which tests its autonomous cars over three million kilometres a day, told a TED conference in June 2015 that it was shocked when it saw footage of its own employees in Google test vehicles performing tasks other than driving and paying no attention to the road ahead while cruising freeways at more than 100 kilometres per hour.

In spite of warnings in detailed pre-drive briefings that the system was still in development, Google’s drivers were looking at phones and laptops, rifling around in the back seat retrieving items from bags and removing their hands from the wheel for long periods of time – all at high speed.

Tesla Model S

The company found that within five minutes “behind the wheel” people developed a completely unjustifiable trust in the car and began to do “ridiculous things” and were effectively “misusing the technology”.

The test footage led the company to sound a warning against car-makers releasing self-driving systems too soon. It said that autonomous cars should not be introduced to the market unless they were foolproof, because anything less bred a false sense of security in drivers.

The warning proved prescient in view of the recent death of a Tesla driver whose Model S slammed into the side of a truck, without braking, while Tesla’s semi-autonomous Autopilot system was engaged. The truck crossed the path of the Model S and it is alleged the driver was watching a video at the time, while the car’s system read the side of the truck as clear sky.

Tesla released Autopilot en masse in October last year, downloading and installing it remotely into all Model S vehicles.

Autopilot engaged: The instrument cluster on the Model S shows the real-time information that the car uses to determine its driving behaviour.

It was described as a beta application, which means it was still under development. Indeed, a feature of Autopilot is that it “learns” as owners use it, and Tesla says the software has now been driven for more than 200 million kilometres.

However, Autopilot caused considerable controversy at the time, with a flurry of videos on social media showing Teslas having near misses before their drivers intervened. It was clear from the videos and the chatter on social media that Tesla owners assumed, albeit incorrectly, that the “Autopilot” software meant just that: a system, similar to the autopilot in a plane, that could take over control of the car.

Tesla founder Elon Musk rejects calls to disable the system and plans, instead, to escalate efforts to educate owners in how Autopilot works by introducing an explanatory blog. He says disclaimers provided with the system are “written in super plain language”.

But Google is clearly worried that rolling out technology before it is ready could give autonomous cars a bad name, and that highly publicised failures costing lives – some are already calling it the automotive version of the “blue screen of death” – could hold back their introduction for years as fearful politicians react by legislating the cars off the roads.

“At stake is a road toll in the United States that is the equivalent of a 737-sized passenger jet crashing every working day – a toll of some 37,000 people a year.”

Google’s warning to the industry came in a presentation at a TED conference when the chief technology officer heading up Google’s self-driving car program, Chris Urmson, questioned the approach of progressively releasing driver assistance systems because of the danger of lulling drivers into a false sense of security.

Mr Urmson said: “Conventional wisdom would say that we’ll just take these driver assistance systems and we’ll kind of push them and incrementally improve them and, over time, they’ll turn into self-driving cars.”

But Google has concluded that the more self-driving technology that is progressively introduced into cars “the less reliable the driver is going to get”.

Mr Urmson warned: “So by just making the cars incrementally smarter, we’re probably not going to see the wins (in reducing road deaths) we really need.”

At stake is a road toll in the United States that is the equivalent of a 737-sized passenger jet crashing every working day of the year – an annual toll of some 37,000 people.

This is an aviation death toll that would be unacceptable to the community and would bankrupt the aviation industry but “goes with the territory” for motorists.

The prospect of autonomous cars holds great hope for slashing deaths on the road, but at this point in the technology’s development there is a moral dilemma: at what point do you make it available?

Presently there are three approaches.

The car-makers are accelerating the introduction of automatic emergency braking, lane keeping, blind-spot monitoring, adaptive cruise control and automatic stability control, which are acknowledged to be making inroads into road trauma by reducing both the number of crashes and their severity.

These are more benign in that they are there to help the driver avoid mistakes or to intervene when a mistake is made. Car-makers are adding these systems rapidly and, in the US, are promising that autonomous emergency braking will be standard on all cars within a few years – well ahead of what legislators want – because rear-end collisions on American freeways are huge killers.

Google’s view, by contrast, is that pooling these technologies under the label of semi-autonomous cars is risky because it creates the illusion that drivers can leave the car to its own devices. So the company is holding back until it knows that its technology, while saving lives, will not lead to accidents through driver complacency.

“Google has concluded that the more self-driving technology that is progressively introduced into cars ‘the less reliable the driver is going to get’.”

Tesla, on the other hand, has pushed the envelope to the point where even the name Autopilot suggests the car can drive itself when it clearly cannot do so with impunity.

But Tesla founder Elon Musk is unrepentant. He pushed hard to roll out Autopilot as soon as he could.

He takes the view that the rollout of semi-autonomous technology at Tesla should be made available as quickly as possible and argues, with some justification, that it would be “morally reprehensible” to sit on safety innovations while people continue to be killed on the roads.

It therefore comes down to the question of when the technology is ready to achieve maximum benefit without itself causing harm, because it can also be argued that it is more than reprehensible to put into the hands of the public an Autopilot system that cannot tell the difference between the sky and the side of a truck!

Meanwhile, dealers in the US, who are fighting legal battles to stop Tesla from selling cars direct to the public from its own stores, might come to regard themselves as fortunate that Tesla chooses to run its own retail operations, as the litigation industry questions the risks and wisdom of Autopilot and goes looking for the deepest pockets associated with the sale of Tesla’s electric cars should those cars kill people.

Tesla at Pause Fest, Federation Square
