Tesla’s Autopilot accidents are a recurring media theme. The technology is new, and that makes many people distrust it. Tests carried out in the UK have shown that it is not yet time to take our hands off the wheel.

If you have ever driven a car with cruise control, you may recognize the feeling Tesla drivers often describe when they first switch on Autopilot. It is simply unusual when the vehicle itself does something you used to do. Gradually you get used to it, and after a while you start to take it for granted and learn to trust your car. But one day an unexpected awakening may come.

Tesla has put a lot of effort not only into developing and improving Autopilot, but also into alerting owners and explaining exactly what the feature can do and what its limitations are, above all the fact that responsibility for driving remains with the driver at all times. The driver must stay attentive, watch what is happening around the car, and be prepared to respond to anything that happens on the road.

Autopilot lulls the driver

But not every Tesla driver follows the manufacturer’s instructions. Worse, automated driving tends to lull the person behind the wheel. When something unexpected happens, the driver’s reaction time is longer than in a regular car, where a certain level of attention is required at all times.

A Tesla in Autopilot mode simply drives, steers, and adapts its speed to the vehicle ahead. Everything seems perfect, and the driver can feel like they are sitting in the car of the future. It is easy to fall for the illusion that the car has everything under control.

Everything works until something unforeseen happens. A warning signal sounds, and the driver must be ready to evaluate the situation and intervene immediately. So both hands on the wheel – your car needs you, now!

Autopilot can handle a number of situations: it can steer and brake, the only two actions it may need in a critical moment. What it lacks, however, is experience and judgment, especially the ability to respond to unusual events and to understand the motives behind other drivers’ actions.

Limited autopilot capabilities

The British company Thatcham Research conducts vehicle safety research and crash tests for insurance companies. In that spirit, it decided to test a Tesla in Autopilot mode on a test track at Upper Heyford in Oxfordshire.

The testers created near-ideal conditions: a straight, clear, multi-lane stretch of road on a clear day. The setup itself was simple: the Tesla followed a lead car, which then suddenly changed lanes to avoid a stationary car in its lane.

This is not an unrealistic scenario. What the video shows can happen in normal traffic: a driver who sees a stationary queue ahead simply moves into the next lane.

All the Tesla managed to do in this sudden situation was hit the brakes, and it remains questionable whether the braking would have been forceful enough for the situation. There was no attempt at an avoidance maneuver. Not to mention that the Autopilot did not react in any way to the lead car changing direction.

If a human had made the same mistake, it could be put down to inexperience or inattention. One would assume that the Autopilot at least knows whether there is free space beside the vehicle and could therefore safely steer around the obstacle. But nothing of the kind happened.

The BBC commented on the test: “With Autopilot mode on, the Model S stayed in its lane, slowed down, and stopped when the vehicle ahead stopped at the end of the queue. In the second run, the lead car changed lanes at the very last moment, and the Tesla did not brake in time and ran into the standing vehicle.” According to Thatcham’s director of research, Matthew Avery, this is a classic demonstration of what happens when a driver relies too much on automated systems.

Tesla: Our users are informed

In response to a request for comment, Tesla said that Autopilot cannot prevent every possible type of accident, though accidents like this would not happen if drivers used Autopilot correctly and did not rely on the system excessively. Asked what over-reliance on the system means in practice, Tesla replied that all its customers know that if they do not pay enough attention to driving, they can run into trouble, both literally and figuratively.

“The feedback we receive from our customers shows that they have a very clear idea of what Autopilot is, how to use it properly, and what features it offers. When Autopilot is activated, drivers are continually reminded of their duty to keep their hands on the steering wheel and maintain control of the vehicle at all times. This is designed to prevent misuse and is among the strongest safeguards of its kind.”

Thatcham Research notes that the test exposed the underlying problem: drivers rely excessively on the system. In recent years this has happened quite often, which is why we keep seeing accidents in which cars in Autopilot mode hit a standing vehicle or other stationary obstacle. It is as if some Tesla owners did not understand that the term “autopilot”, at least for now, means only “assisted driving”.
