Another YouTuber Attempted To Fool Tesla’s FSD With A Fake Wall

Tesla's so-called Full Self-Driving was put to the test using a fake wall. The video is a follow-up to Mark Rober's controversial clip in which a Model Y slams through a painted wall. This time, creator Kyle Paul used two Tesla EVs, with mixed results.

Wile E. Coyote is popular again. That's thanks to former NASA engineer and YouTuber Mark Rober, who posted a video a couple of weeks ago showing a Tesla Model Y driving through a fake wall while the so-called Full Self-Driving (FSD) software was supposedly active.

To quickly recap, the experiment was designed to trick Tesla's controversial vision-only driver assistance system. While similar systems from many brands use a combination of cameras, radar and, in some cases, LiDAR to sense their surroundings, Tesla exclusively uses cameras. The trouble with that approach is that if something doesn't look like an obstacle, it may not be detected, which is what Rober's video tried to demonstrate.

However, controversy quickly surrounded the whole thing, with many pointing out that Rober used basic Autopilot, not FSD, during the test, and that the use of multiple takes smelled funny. The presence of a vehicle supplied by Luminar, a company that develops LiDAR systems, also raised questions about the objectivity of the test.

But now there's a much more straightforward video from creator Kyle Paul that puts two Tesla EVs to the same test. This is important because, as you'll see in the video below, if you drive a Tesla that's just a couple of years old, Autopilot and FSD currently perform noticeably worse than on newer cars.

The video was shot on a closed course, and FSD was enabled during all the attempts. The first runs were made with a Tesla Model Y crossover equipped with the previous-generation Hardware 3 computer running FSD version 12.5.4.2.
As NotATeslaApp pointed out, that's not the latest software version for this hardware (that would be 12.6), but it should not make a significant difference in this particular test. The Model Y failed the test despite having FSD engaged. The driver made multiple attempts, but every time he had to slam on the brakes manually because the car did not detect the wall. Unlike in Mark Rober's video, where the painted wall was freestanding, the fake wall in Kyle Paul's video was backed by a truck, so driving through it would have totaled the Tesla.

After concluding that the Model Y with HW3 did not detect the obstacle even with FSD enabled, it was time to bring out a Tesla Cybertruck. The electric pickup has the latest hardware, dubbed HW4, and runs the newer FSD V13 software. As a reminder, Tesla's advanced driver assistance systems (ADAS) use cameras to see the world around the vehicle, and cars with HW4 have more advanced cameras that can pick up more detail. The latest software can also process the video feeds at full resolution.

The difference between the older Model Y and the newer Cybertruck is evident from the get-go. When approaching the fake wall with FSD enabled, the obstacle appears on the car's touchscreen early on, and the EV comes to a full stop without human intervention. The test was run several times, and the result was the same.

For what it's worth, this sort of situation is extremely unlikely to happen in the real world. However, it brings up a valid discussion about Tesla's reluctance to use anything but video cameras for its driver aids. That said, it looks like the company has massively improved its game with the latest hardware and software. Older cars with HW3 might get FSD V13 sometime in the future, which could improve their obstacle-detection abilities, but we wouldn't hold our breath.