The big question the auto industry has long grappled with is now staring it in the face more starkly than ever: can the world afford to let computers drive its cars? With the first reported fatality in a car driving autonomously, serious questions are being raised about the dependability of such electronically led driving systems. A Tesla Model S driver who had given up control of the car's steering, and was apparently playing a Harry Potter movie while the car drove itself, was killed when the car went under a tractor-trailer that turned left across its path.
US safety regulators have launched an investigation into the 25,000 Tesla cars in circulation to determine whether the car's Autopilot system, still in its Beta stage, is safe. If it is found unsafe, the authorities might order Tesla to recall the vehicles.
Tesla, for its part, has said that the accident occurred under "extremely rare circumstances". "The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S," the company said in a statement. The Autopilot system apparently mistook the swathes of white on the trailer, and the space underneath it, for open sky.
Despite Tesla's clarification, the stock market reacted to the incident, with the company's shares falling by 3 per cent. The company, however, stood by its technology, saying this is the first fatality in more than 213 million kilometres of testing and use.
Tesla's Autopilot is a semi-autonomous system for public use. When drivers activate the system, it explains, among other things, that Autopilot "is an assist feature that requires you to keep your hands on the steering wheel at all times". The system also insists that the driver needs to maintain "control and responsibility" for the vehicle while using it. Humans, as we all know, don't like playing by the rules most of the time. Should drivers be given the option to become complacent? That's the question everyone is asking.
Interestingly, Joshua Brown, the deceased driver, had posted YouTube videos lauding the efficiency and utility of the Tesla Autopilot. He called his car Tessy, and loved it for its ability to avert danger even when he wasn't paying attention. Maybe Brown relied on Tessy too much, something its makers have repeatedly warned drivers against.
Here's one of the videos Brown posted on YouTube, praising the car for averting a danger while he wasn't watching. The description of the video states: "The truck tried to get to the exit ramp on the right and never saw my Tesla. I actually wasn't watching that direction and Tessy [the name of my car] was on duty with autopilot engaged. I became aware of the danger when Tessy alerted me with the 'immediately take over' warning chime and the car swerving to the right to avoid the side collision."
In the meantime, the Associated Press contacted the driver of the tractor-trailer, who said that the driver of the Model S was playing a Harry Potter movie inside the car at the time of the accident. The movie was apparently still playing after the crash. Here are excerpts from the Associated Press report:
“Frank Baressi, 62, the driver of the tractor-trailer and owner of Okemah Express LLC, said the Tesla driver was “playing Harry Potter on the TV screen” and driving so quickly that “he went so fast through my trailer I didn’t see him.”
"It was still playing when he died and snapped a telephone pole a quarter mile down the road," Baressi said in an interview from his home in Palm Harbor, Florida. He acknowledged he couldn't see the movie, only heard it.
Tesla Motors Inc. said it is not possible to watch videos on the Model S touch screen”.
Brown was a huge fan of his car's Autopilot system and had posted twenty videos of the system on his YouTube channel. Some of these videos show him listening to audiobooks while driving. We are not sure whether that was the case when this accident happened, though.
We'd like to know what our audience thinks of such autonomous driving systems. Sure, there are ample warnings before these systems are engaged. But does it make sense to roll them out for public use this early in their life? Or should they be rolled out for public use at all? We'd love to know your opinion. Share your thoughts with us through the comments section below, or voice your opinion by tagging us on one of the social media networks.