
Why Tesla's Autopilot still needs the driver's full attention

Accidents involving Tesla's Autopilot are a favourite media topic. The technology is new, and many people distrust it. Tests carried out in the UK have shown that it is not yet time to take our hands off the wheel.

If you have ever driven a car with cruise control, you may have experienced a feeling similar to the one Tesla drivers often describe after first switching on Autopilot. It is simply unusual when the vehicle does something you used to do yourself. Gradually you get used to it; after a while you start to take it for granted and learn to trust your car. But one day an unexpected awakening may come.

Tesla has put a lot of energy not only into developing and improving Autopilot, but also into warning owners and explaining exactly what the feature can do, what its limitations are, and above all that responsibility for the vehicle remains with the driver at all times. The driver must stay attentive, watch what is happening around the car, and be prepared to respond to anything that occurs on the road.

Autopilot lulls

But not everyone who drives a Tesla follows the manufacturer's instructions. Worse, the automated driving mode tends to lull the person behind the wheel. When something unexpected happens, the driver's reaction time is longer than in a regular car, which demands a certain level of attention at all times.

In Autopilot mode, a Tesla simply drives, steers, and adapts its speed to the vehicle ahead. Everything seems perfectly smooth, and the driver can feel like they are sitting in the car of the future. It is easy to fall for the illusion that the car has everything under control.

Everything works until something unforeseen happens. A warning signal sounds, and the driver must be ready to assess the situation and intervene immediately. So both hands on the wheel – your car needs you, and now!

Autopilot can deal with a number of situations – it can steer and brake, the only two functions it may need in a critical moment. What it lacks, however, is experience and judgment, especially the ability to respond to unusual events and to understand the reasons behind other drivers' actions.

Limited autopilot capabilities

The British company Thatcham Research specialises in automotive safety and carries out vehicle safety tests for insurance companies. In this spirit, the company decided to test a Tesla in Autopilot mode on a test track in Upper Heyford, Oxfordshire.

It therefore created almost ideal conditions: a straight, clear stretch of road with several lanes on a clear day. The procedure itself was simple: the Tesla follows the car in front of it, but that car suddenly changes lane to avoid another car standing in its path. What will the Tesla do? See for yourself in the BBC video: https://www.bbc.com/news/technology-44439523

This is not an unrealistic situation. What you see in the video can happen in normal traffic – for example, when the car in front of you spots a stationary queue and its driver resolves the situation by moving into the next lane.

All the Tesla managed to do in the sudden situation was hit the brakes, and it remains questionable whether its braking would have been as forceful as the situation required. Any attempt at an avoidance manoeuvre was completely absent. Not to mention that the Autopilot did not react in any way to the lead car changing direction.
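A rough calculation shows why a late reveal like this is so dangerous: once the lead car swerves away, the distance needed to stop is the reaction (or system latency) distance plus the braking distance. The sketch below uses illustrative figures only – the report does not state the test speed, latency, or deceleration, so all values here are assumptions.

```python
def stopping_distance(speed_kmh, reaction_s, decel_ms2=8.0):
    """Total distance (metres) to stop: distance covered during the
    reaction delay, plus braking distance v^2 / (2a).
    A deceleration of ~8 m/s^2 is a typical dry-road emergency figure."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v * reaction_s + v * v / (2 * decel_ms2)

# Illustrative comparison at 70 km/h (assumed, not from the test report):
print(f"prompt reaction (~0.5 s): {stopping_distance(70, 0.5):.1f} m")
print(f"lulled driver (~1.5 s):   {stopping_distance(70, 1.5):.1f} m")
```

Even under these generous assumptions, a one-second-longer reaction costs roughly 20 extra metres at 70 km/h – often the difference between stopping short of a stationary car and hitting it.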

If a human made such a mistake, it could be put down to inexperience or inattention. We would assume that in this situation the Autopilot has at least some information about whether there is free space beside the vehicle and could therefore safely steer out of the lane. But nothing of the sort happened.

The BBC commented on the test: “With Autopilot mode on, the Model S stayed in its lane, slowed down, and stopped when the vehicle ahead stopped at the end of a queue. In the second run, the lead car changed lane at the very last moment, and the Tesla did not react in time and ran into the standing vehicle.” According to research director Matthew Avery, this is a classic demonstration of what happens when a driver relies too heavily on automated systems.

Tesla: Our users are instructed

Asked for comment, Tesla said that Autopilot cannot prevent all possible types of accident. Such accidents would not happen if the driver used Autopilot correctly and did not rely on the system excessively. When asked what over-reliance on the system means, Tesla replied that all users of its cars are aware that if they do not pay enough attention to driving, they may run into trouble (literally and figuratively).

“The feedback we receive from our customers shows that they have a very clear idea of what Autopilot is, how to use it properly, and what features it offers. When Autopilot is activated, drivers are continuously reminded of their responsibility to keep their hands on the steering wheel and to maintain control of the vehicle at all times. This is designed to prevent misuse and is one of the strongest safety measures.”

Thatcham Research notes that the test exposed the underlying problem: drivers rely excessively on the system. Over the last few years we have seen this happen quite often. That is why we keep witnessing accidents in which drivers with Autopilot engaged hit a standing vehicle or another stationary obstacle. It is as if some Tesla owners did not understand that “Autopilot”, at least for now, means only “assisted driving”.


About Joseph Karma

I am Joseph. I cover transportation, infrastructure, and urban logistics in cities and countries, including buses, taxis, cars, ships, planes, and their self-driving and alternative power systems.