With each beta update to Tesla’s Full Self-Driving (FSD), the select owners who serve as unpaid test subjects for the Californian manufacturer experience surprising improvements in their electric cars’ autonomous behavior.
As is well known, neither Tesla’s Autopilot nor FSD (Full Self-Driving) is currently autonomous in any real sense: the driver must remain vigilant at all times, ready to take control of the electric car. While this removes the word “autonomous” from the equation, it does not change the fact that vehicles running the FSD beta show capabilities that hint at a genuinely autonomous future.
James Locke is one of the Tesla owners in the United States testing the most advanced version of Full Self-Driving. On his YouTube channel, he shows the progress and the tests he puts his Tesla Model 3 through.
In November, he drove down a two-lane road that, like a driving-school exercise, ends in a full 180-degree turn. There is no continuation of the road other than a complete turn through a curve more reminiscent of an Alpine mountain pass than a conventional route.
On its first attempt in the inside lane, the car almost came to a complete stop while trying to interpret the unfamiliar situation. Unlike Waymo, Tesla does not rely on previously mapped environments for its autonomous driving.
Locke’s car showed its ‘nervousness’ at the road’s unexpected layout. The only vertical traffic sign is located at the end, simply reading “End,” since there is nothing beyond the guardrail.
After the first unsuccessful attempt, the car received a software update, FSD beta version 2020.44.15.4. With it installed, less than a month later, the protagonist’s Tesla manages to complete the turn without the difficulty seen in the earlier video.
Who has not found a garbage truck blocking the road on a day they were in a hurry? What would an autonomous car do while the crew does its work? Another Tesla, running FSD beta version 2020.44.15.3 — an earlier, less-trained build than the one in the example above — has faced exactly this situation.
Amid the complicated maneuvering the car performs in its attempt to overtake the garbage truck on a busy road, we can see how the vehicle imitates human behavior throughout the maneuver.
On several occasions, the electric car edges out as if to overtake, to get a full view of the situation ahead. In autonomous driving, at least for the moment, the vehicle is intended to reproduce the human behavior we are all used to. This is a technique to gain the trust of the human behind the wheel, who might not otherwise accept actions they would consider unnatural.
The encounter with the truck was unplanned, as it occurred on the way to a Supercharger. It is possible that the neural network is not yet well trained for these tasks, since the number of cars running this software is still small.
In the video in question, although the driver takes control for a few moments, you can see how the Tesla ends up autonomously overtaking the garbage truck while it is stopped picking up some bins from a house, at a moment when no cars are passing in the opposite direction.