Tesla appears to be running into trouble with its Full Self-Driving features. John Bernal, a former Tesla employee, was let go after he posted a video online showing the company's Full Self-Driving Beta system hitting a cone in San Francisco. Since leaving the firm, Bernal has intensified his efforts to expose the flaws in Tesla's self-driving software.
Tesla’s object detection and collision prevention system has some serious issues
In a recent video posted on his YouTube channel, AI Addict, Bernal puts his own car on the line.
“Recently, I was going through my phone when I noticed that the IIHS had selected Tesla as a Top Safety Pick+ option for its AI vision and its capacity to recognize, slow down for, and avoid things. In light of that, I reasoned that because my Tesla uses AI vision, I should test its capabilities by tossing items in front of it.”
Bernal gathers a number of objects to place in front of the Tesla to put its self-driving system to the test. These include a complete truck, an office chair, a garbage can, a beer keg, an orange bucket, a shipping pallet, and a barbecue.
It is important to note up front that this is Tesla FSD rather than FSD Beta, which gives beta testers access to more sophisticated capabilities that are still under development. When Tesla dismissed him, Bernal lost his access to FSD Beta.
The testing gets off to a bad start. The main screen of Bernal’s Tesla makes it plain that the car has detected a roadside object: the orange bucket appears on the display as a traffic cone. Nevertheless, despite having detected it, the Tesla makes no attempt to stop and slams into the bucket.
That is not good for Tesla FSD. Bernal confirms that his car’s Autosteer and automatic emergency braking are both turned on. The fact that the car makes no attempt to stop or avoid the bucket therefore points to a systemic failure.
Unfortunately, the result repeats itself throughout the rest of the testing
Bernal moves through the items one by one, positioning each in the same location in the middle of the road. He also tests a pallet both stationary and in motion, with the pallet tumbling over as he approaches it.
Even as the items get bigger, and even though the Tesla detects them and displays them on the screen, it does nothing to avoid them during the tests. To avoid hitting the larger objects and seriously damaging his car, Bernal has to swerve out of the way himself.
One of the more unexpected failures is the Tesla’s complete failure to detect Bernal’s friend’s Ford Ranger pulling out in front of him. Bernal has to slam on the brakes himself to stop the car. Even worse, the Tesla repeatedly identifies objects on the road as pedestrians during the testing while still making no attempt to stop.
Many people have commented that this is a result of Tesla's shift away from radar to a camera-only AI vision system. Elon Musk, CEO of Tesla, has said that the AI vision system outperformed the radar-based systems in testing. This video, however, appears to demonstrate the contrary.