Tesla drivers are spotting a new 3D horse icon on their Full Self-Driving and Autopilot screens. The feature appears in software build 2025.45.9, which carries FSD v14.2.2.4 and rolled out more broadly in late January 2026.
Observers say the new icon replaces earlier behavior in which large animals often appeared as generic vehicles or as dogs in the visualization.
October leak did not list horses
Back in October 2025, code and assets from the 2025.38 update surfaced online and outlined 15 new visualization types. That leak covered new 3D models for ambulances, fire trucks, garbage trucks, street sweepers, trains, trams, school buses, golf carts, trailers, and several kinds of vulnerable road users, including people on scooters, skateboards, and in wheelchairs.
Yet horse assets were absent from that catalog, even though it was considered wide-ranging at the time. For that reason, many watchers now read the horse icon as a newer addition that arrived after 2025.38, likely through later software work or a server-side change tied to the v14 line of FSD.
Tesla’s animal detection has been a work in progress for years. Early Autopilot and FSD builds could show a dog, but many four‑legged shapes, such as deer or horses, often triggered the same basic icon, as older demo clips and owner videos make clear.
Company leaders have acknowledged that the system needed more training to separate animal types. Past comments referenced a gap between detecting that “something” is there and identifying whether that shape is a dog, a cat, or a horse. The new horse visualization lines up with that long‑stated goal of moving from broad detection to finer object classes.
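As a purely illustrative sketch of that coarse-to-fine idea, the logic below shows how a display layer might fall back to a generic animal icon when a fine-grained label is uncertain. The class names, icon identifiers, and confidence threshold are assumptions for the example, not details taken from Tesla’s software.

```python
# Hypothetical sketch of coarse-to-fine icon selection.
# All class names, icon identifiers, and the threshold are invented.
FINE_ICONS = {"dog": "dog_3d", "cat": "cat_3d", "horse": "horse_3d"}

def pick_animal_icon(fine_label: str, confidence: float, threshold: float = 0.7) -> str:
    """Return a specific 3D icon when the fine-grained class is confident,
    otherwise fall back to a generic animal placeholder."""
    if confidence >= threshold and fine_label in FINE_ICONS:
        return FINE_ICONS[fine_label]
    return "generic_animal_3d"

print(pick_animal_icon("horse", 0.91))  # horse_3d
print(pick_animal_icon("horse", 0.42))  # generic_animal_3d
```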
Safety reasons behind clearer animal icons
Large animals can create serious risk on highways and on rural routes, and researchers who study autonomous mobility systems have highlighted this point in academic work. For a driver‑assistance system, a horse and rider present a different risk profile than a small dog or a pedestrian, so better labeling can support better behavior on the road.
Tesla is leaning harder into FSD as an AI product grounded in data volume, and recent reporting notes that the company is aiming for about 10 billion real‑world miles to back unsupervised FSD. The fleet has already logged several billion miles, and each improvement in edge‑case classification, including horses, feeds into training for those neural networks.
Reports tied to FSD v13 and v14 indicate that Tesla can leave many assets in the software image and then flip them on or off through server‑side configuration. That setup lets the company roll out new visuals in stages, test behavior on a slice of the fleet, and tune the interface without forcing a full firmware download every time.
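As a minimal sketch of how that kind of gating could work in principle, the snippet below intersects a server-sent allow-list with assets already bundled in the firmware, so a remote flag flip never references a model the car does not have. The config keys and asset names are invented for illustration and are not drawn from Tesla’s software.

```python
# Hypothetical illustration of server-side gating of bundled assets.
# The firmware ships every 3D model; a remotely fetched config decides
# which ones the UI may render. All names here are invented.
BUNDLED_ASSETS = {"horse_3d", "tram_3d", "wheelchair_3d", "golf_cart_3d"}

def enabled_visuals(server_config: dict) -> set[str]:
    """Intersect the server's allow-list with what shipped in firmware."""
    allow_list = set(server_config.get("enabled_visuals", []))
    return allow_list & BUNDLED_ASSETS

# Example: enabling horses for a slice of the fleet via config alone.
config = {"enabled_visuals": ["horse_3d", "tram_3d"]}
print(enabled_visuals(config))  # {'horse_3d', 'tram_3d'}
```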
This is a small feature, yet many drivers treat it as a sign that Tesla is tightening up narrow perception gaps that earlier videos exposed.