NVIDIA’s Director of Robotics, Jim Fan, says Tesla’s latest Full Self-Driving (Supervised) v14 feels so natural that he “couldn’t tell if a neural net or a human drove” him home. He describes it as the first artificial intelligence system to pass what he calls the “Physical Turing Test,” a new bar for machine intelligence based on action in the physical world instead of a text chat.
What Jim Fan experienced with FSD v14
Jim Fan, who leads work on embodied AI and heads NVIDIA’s Project GR00T, only recently became a Tesla owner but was among the earliest to try FSD v14. He shared his thoughts on X after using the software for his commute.
Fan wrote that after a long workday, he could press a button, lean back, and find that he “couldn’t tell if a neural net or a human drove you home.” He also said that, even though he fully understands how robot learning works, watching the steering wheel move on its own still feels “magical” at first. Over time, it turns into something routine, and he added that “like the smartphone, taking it away actively hurts.”
Fan argues that this is how society slowly adapts to very capable technologies, as they move from novelty to daily habit. He framed FSD v14 as a key example of this shift, because it handles more of the driving task end-to-end without the driver feeling constant tension or surprise.
What the “Physical Turing Test” means here
Alan Turing proposed his original test in 1950 as a way to judge if a machine could imitate human conversation so well that a person could not reliably tell it apart from another person. That test focuses on text-based dialogue and natural language.
Today, large language models can already produce humanlike chat, so many experts say they meet or come close to that classic bar. Still, holding a good conversation is very different from making safe, real-time decisions with a car in traffic. Jim Fan’s “Physical Turing Test” shifts the focus to real-world problem-solving and physical interaction.
In this version, an AI system passes when, over a sustained period, its behavior in physical tasks feels indistinguishable from a skilled human’s. For Fan, Tesla FSD v14 meets that mark during routine driving: after pressing the FSD button, the ride home felt so natural that the identity of the driver, human or neural network, no longer mattered to him.
How Tesla FSD v14 has evolved
Tesla’s FSD (Supervised) v14 and its follow-on build v14.2.2 arrive after years of incremental updates. Yet this release stands out because it leans deeper into Tesla’s end-to-end neural network approach, where a single large model handles perception, planning, and control with far fewer hand-coded rules.
Tesla has also added new behavior focused on how a car finishes a trip. An “arrival options” feature lets the vehicle pick more precise drop-off spots, like a driveway or a curb that makes sense for passengers, closer to how an attentive human driver would think about the last meters of a journey.
These updates, taken together, aim to reduce the number of edge cases that force the driver to intervene. And although Tesla still markets the feature as “Supervised” and requires hands on the wheel, early user feedback on v14.2.2 and v14.2.2.1 has praised the system for feeling more composed and predictable than earlier builds.
Elon Musk responded to Fan’s post on X and agreed with the “Physical Turing Test” framing. He said that with FSD v14, “you can sense the sentience maturing,” and he called Tesla’s stack the best “real-world AI” available today.
This exchange drew attention in part because NVIDIA and Tesla occupy key positions in the AI ecosystem. Tesla trains its driving models on massive compute clusters powered by NVIDIA GPUs, while NVIDIA points to Tesla’s progress as a high-profile example of advanced embodied AI in action. Jim Fan’s public praise adds extra weight to the idea that FSD v14 marks a meaningful step forward in automated driving.