Autonomous technology: are we ready?!


The safety of autonomous vehicles has been called into question following the first fatality involving a self-driving car, in Florida, USA.

The driver of a Tesla Model S was killed after the car, with ‘Autopilot’ engaged, failed to distinguish a white tractor-trailer crossing the highway against a bright sky, preliminary reports suggest.

The Tesla Model S is not a fully autonomous car. However, news of this fatality (released by Tesla Motors on Thursday) is sure to affect people’s trust in the ever-growing autonomous car industry.

A statement from Tesla reads: “This is the first known fatality in just over 130 million miles where Autopilot was activated. It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains that Autopilot is an assist feature that requires you to keep your hands on the steering wheel at all times, and that you need to maintain control and responsibility for your vehicle while using it.

“Additionally, every time that Autopilot is engaged, the car reminds the driver to always keep their hands on the wheel and be prepared to take over at any time.

“The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again. We do this to ensure that every time the feature is used, it is used as safely as possible.”

The technology is clever: it detects when there are no hands on the steering wheel and, as a safety measure, gradually slows the car down.

Tesla’s full statement can be read here.

Just a few months ago, Google’s autonomous car was responsible for its first crash while self-driving in California. The Google Lexus AV assumed it had right of way when rejoining traffic after avoiding an obstacle in the road. It did not, and it ended up driving into the side of a bus in the lane it wanted to move into.

Google accepted liability for that incident; in the case of the Tesla Model S, however, Tesla has yet to accept responsibility. If autonomous technology is coming to our streets, there needs to be a framework for determining exactly who is responsible in the event of an incident, however rare such incidents may be. As things stand, it is still unclear who would be to blame in an accident involving an AV.

Google’s autonomous car had a blip that resulted in an accident.

Autonomous technology isn’t exactly a ‘new’ thing, but it is more prominent now than ever and is developing at a rapid pace. That pace brings risks. People may trust autonomous technology when it isn’t appropriate: in 1995, pilots put too much trust in the automation and failed to take manual control, even as the autopilot crashed the Airbus A320 they were flying. On the other side, technology can malfunction: the crew aboard the Royal Majesty cruise ship failed to step in when its navigation system failed and the ship drifted off course for 24 hours before running aground.

In 1995, an Airbus A320 crashed after its pilots relied too heavily on the autopilot

A piece published last year by TechCrunch suggested that it could be illegal to drive your own car by 2030, given the surge in driver-assistance and autonomous technology. That is probably (definitely) going too far, but there’s little doubt that the motor industry is changing and that autonomous technology isn’t likely to go away. Manufacturers such as Volvo, Volkswagen and Toyota have all said they hope to have autonomous cars on our streets by 2020.

We’ve all complained about technology at some point in our lives. Whether it’s a mobile phone or your toaster blowing up when you’re in a rush to have breakfast before leaving for work – yes, this really has happened – technology can fail. That’s not to say humans don’t fail. Humans fail every day, through poor decisions, slow reactions and so on. That’s why we see figures like this: worldwide, there is a road fatality approximately every 60 million miles driven.

Don’t get me wrong: I understand the safety net that autonomous technology provides, but should it not be advertised as just that – a safety net? Relying on technology gives drivers an excuse to be lazy or less observant on the road. Preventing accidents should always be the number one priority, and we humans do make mistakes, but technology should really only step in in the event of such a mistake – like Volvo’s emergency braking system, which activates when it gauges that you haven’t braked in time.


Volvo, for example, is in the process of launching the UK’s biggest autonomous driving trial in London. Called Drive Me London, it will see the Swedish car maker supply members of the public with XC90s equipped with autonomous technology. The trial is designed to gather real-world data to inform future software development; initially it will feature semi-autonomous vehicles only, with full autonomy introduced on motorways and A-roads later.

Volvo has introduced a new autonomous car

So, what exactly is autonomous technology? Manufacturers have different opinions on what true autonomy is, but essentially it is technology that can function without being told what to do by a person.

AA president Edmund King told the Press Association: “The accident in the US is a reminder that driverless cars aren’t foolproof in the real world. It can and will enhance safety, but we need more research into the interactions between driverless cars and driver-driven vehicles before we allow all drivers to take their hands off the wheel.”

Whether we are actually ready for this kind of dependence remains to be seen. The fact that Tesla’s autonomous technology still comes with a ‘warning’ is a good thing, and drivers should not be under the impression that they can just lie back and read a book. Being behind the wheel means that your life is in your hands – and not only your life, but the lives of any passengers you may have with you, as well as others on the road. It’s important to remember that technology does fail, and that these systems should be used as a back-up or safety net, not as something to depend on.

As well as legislative changes to allow cars to drive themselves, we are likely to need major infrastructure upgrades to enable true autonomy, with road markings maintained to an agreed standard and common protocols established between different systems. Even then, full autonomy is not expected to appear on our roads before 2025.

Article by Danielle Bagnall


About

With a passion for great content and all things vehicular, Danielle combines her interests to update our blog with the latest and greatest motoring news.