A test of the Tesla Model Y by YouTuber Mark Rober, in which the vehicle running in Autopilot mode failed to distinguish a fake road scene and crashed into it, has sparked online debate. Tesla also offers a Full Self-Driving (FSD) mode for automated driving, but that mode was not tested.
In response, content creator Kyle Paul revisited the issue by running further tests in FSD mode, using a Tesla Model Y with HW3 hardware on software version 12.5.4.2 and a Tesla Cybertruck with HW4 hardware on software version 13.2.8.
Paul repeated the test several times to verify whether the software could recognize a fake road scene. The Model Y with HW3 hardware failed to distinguish the scene, detecting it only when the vehicle was very close. In contrast, the Cybertruck, with the latest hardware and software, recognized the fake scene and stopped successfully.
This suggests that Tesla’s FSD mode, running on the latest AI4 (HW4) hardware with software version 13, can effectively distinguish real roads from fake road scenes. The Model Y, by contrast, may need to be retested once its software is updated to version 13, although HW3 could be the limiting factor, as Elon Musk has previously noted.
The full test clip is available at the end of this article.
Source: Not a Tesla App
TLDR:
A test of the Tesla Model Y in Autopilot mode raised concerns about its ability to detect fake road scenes, prompting follow-up testing by Kyle Paul. The results showed that HW4 hardware running FSD version 13 identified the fake scene and stopped, while the Model Y on HW3 hardware struggled. Tesla’s FSD mode with AI4 (HW4) hardware proved effective at distinguishing real from fake road scenes, suggesting the Model Y may perform better after a software update.