Tesla’s latest Full Self-Driving version 14 is impressing users and critics alike. In a video filmed at a Costco parking lot, the car behaved in a way that made many people do a double take. It noticed a person signaling that he was leaving, waited for him to drive off, and then parked neatly in the empty spot. That scene made the system seem more “alive” than ever before.
Video:

Tesla FSD (Supervised) v14 feels sentient in a Costco parking lot:
- Understands a man's gesture indicating he's leaving
- Waits patiently for him to pull out and the spot to clear
- Parks perfectly in the empty space
pic.twitter.com/tU1Ls2UV1A https://t.co/kkYCYid4Ae
— The Tesla Newswire (@TeslaNewswire) October 25, 2025
The new FSD v14 update gives Tesla vehicles more awareness of their surroundings. It’s not just about sensing obstacles anymore. The car can now read human gestures and predict intent. For example, if someone is loading a trunk or waving to indicate they’re pulling out, the vehicle processes that as part of its driving plan. It stops and waits patiently instead of rushing into the space.
This awareness extends to other traffic in parking lots. The car recognizes pedestrians, carts, and moving vehicles and finds paths that avoid conflicts. These changes come from improvements to Tesla's vision-based neural network, which can now process more frame data per second and make decisions faster.
Drivers can tell the system where to park: in a lot, in a driveway, at the curb, or in a garage. The feature, called “Arrival Options,” adds custom control and reduces the need for human intervention.
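Tesla has not published how FSD's planner represents this behavior, but the wait-then-park pattern can be described as a simple decision rule. The Python below is a toy sketch under that assumption; `PedestrianIntent`, `SpotObservation`, and `parking_action` are invented names for illustration, not Tesla code.

```python
from dataclasses import dataclass
from enum import Enum, auto

class PedestrianIntent(Enum):
    """Hypothetical intent labels a perception stack might assign to people near a spot."""
    LEAVING_SPOT = auto()   # loading a trunk, waving "I'm pulling out"
    CROSSING = auto()
    UNKNOWN = auto()

@dataclass
class SpotObservation:
    """Toy snapshot of one candidate parking spot."""
    spot_id: str
    occupied: bool
    nearby_intent: PedestrianIntent

def parking_action(obs: SpotObservation) -> str:
    """Decide whether to park, wait, or keep searching based on observed intent."""
    if not obs.occupied:
        return f"park in {obs.spot_id}"
    if obs.nearby_intent is PedestrianIntent.LEAVING_SPOT:
        # Treat the gesture as part of the plan: hold position instead of rushing in.
        return f"wait for {obs.spot_id} to clear, then park"
    return "keep searching for another spot"

# The Costco scenario from the video: an occupied spot whose driver signals he is leaving.
print(parking_action(SpotObservation("A12", occupied=True,
                                     nearby_intent=PedestrianIntent.LEAVING_SPOT)))
# -> wait for A12 to clear, then park
```

The point of the sketch is the middle branch: a signalled departure changes the plan from “keep searching” to “hold and wait,” which is exactly what the Costco clip shows.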
Understanding human gestures
One of the biggest steps forward is gesture recognition. Tesla’s engineers trained the system to interpret hand signals from construction workers and pedestrians. The cameras track small movements and body positions.
Drivers see these detected signals on the screen before the vehicle acts. For instance, if a flagger waves “Go ahead,” the car confirms the instruction visually before proceeding.
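Purely as an illustration of that confirm-before-acting flow, a minimal sketch might look like the following, where `Gesture` and `handle_flagger` are hypothetical names and the on-screen message stands in for the visualization the article describes.

```python
from enum import Enum, auto

class Gesture(Enum):
    """Hypothetical gesture classes a camera-based classifier might output."""
    GO_AHEAD = auto()
    STOP = auto()
    UNCLEAR = auto()

def handle_flagger(gesture: Gesture, show_on_screen) -> str:
    """Surface the detected signal to the driver, then choose a cautious action."""
    show_on_screen(f"Detected signal: {gesture.name}")
    if gesture is Gesture.GO_AHEAD:
        return "proceed slowly"
    if gesture is Gesture.STOP:
        return "hold position"
    # Ambiguous signals default to the safe choice.
    return "hold position and wait for a clearer signal"

# Example: a construction flagger waves the car through.
print(handle_flagger(Gesture.GO_AHEAD, show_on_screen=print))
```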
Driving feels more natural
People using FSD v14 describe the driving as calm and confident. The system slows down early when it notices someone at a crosswalk and avoids abrupt braking in most cases.
Tesla achieved this by building a single neural network that integrates both navigation and environment perception. Instead of separating the map system and camera vision, version 14 merges them and adapts in real time when the route changes or something blocks the road.
Beyond parking skills, FSD v14 introduces several new tools and options. It can now detect and pull over for emergency vehicles, reroute around road debris, and identify narrow or obstructed lanes.
Drivers can also choose among new profiles such as “Sloth” for slow, careful driving and “Mad Max” for more assertive moves. Each profile adjusts lane changes, following distance, and reaction speed.
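Tesla does not document the exact parameters behind these profiles, but conceptually each one is a small bundle of tuning knobs. The sketch below is an assumption-only illustration; the field names and numbers are invented, not Tesla's values.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DrivingProfile:
    """Illustrative knobs a profile could tune; names and values are assumptions."""
    name: str
    following_gap_s: float            # time gap kept to the lead vehicle, seconds
    lane_change_assertiveness: float  # 0.0 = reluctant, 1.0 = eager
    reaction_delay_s: float           # how long the planner deliberates before committing

SLOTH = DrivingProfile("Sloth", following_gap_s=3.0,
                       lane_change_assertiveness=0.2, reaction_delay_s=1.0)
MAD_MAX = DrivingProfile("Mad Max", following_gap_s=1.2,
                         lane_change_assertiveness=0.9, reaction_delay_s=0.3)

for profile in (SLOTH, MAD_MAX):
    print(profile)
```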
The vehicle can clean its own front camera using a smart wash nozzle that activates automatically.
The “Sentient” discussion
Elon Musk called FSD v14 “sentient” in interviews. While the claim is exaggerated, the update does seem more aware of human behavior. Testers note that the car hesitates appropriately, follows unspoken social rules, and adapts smoothly to group movement in parking lots. One driver said, “It feels like the car just gets it.”
Still, drivers must stay engaged. Tesla makes clear this remains supervised driving, not full autonomy. The human behind the wheel is always responsible.
Though improved, FSD v14 isn’t flawless. Some users reported longer waits for parking spots in busy areas and occasional confusion when multiple people moved near the same vehicle. But those moments are exceptions.
Most testers said it felt more confident than any earlier version. It handles unprotected turns better, recognizes school buses, and manages edge cases like delivery zones or shopping centers with fewer errors.
Tesla plans to expand FSD v14.1.3 to more users by the end of the month. A lighter version for older vehicles is also being tested. Future updates will likely improve “parking spot selection and quality,” which Tesla lists among the next goals.
FSD v14 might still need a watchful human, but many agree it’s the most “human-like” self-driving experience yet. The car doesn’t just follow signals; it seems to understand them. And in a busy Costco lot, that’s saying a lot.

