
Tesla Full Self Driving requires human intervention every 13 miles

It gave pedestrians room but ran red lights and crossed into oncoming traffic.

An independent automotive testing company has evaluated Tesla FSD, and it found some concerning results. (credit: PonyWang/Getty Images)

Tesla’s controversial “Full Self Driving” is now capable of some quite advanced driving. But that can breed undeserved complacency, according to independent testing. The partially automated driving system exhibited dangerous behavior that required human intervention more than 75 times over the course of more than 1,000 miles (1,600 km) of driving in Southern California, averaging one intervention every 13 miles (21 km).

AMCI Testing evaluated FSD builds 12.5.1 and then 12.5.3 across four different environments: city streets, rural two-lane highways, mountain roads, and interstate highways. And as its videos show, at times FSD was capable of quite sophisticated driving behavior, like pulling into a gap between parked cars to let an oncoming vehicle through, or moving left to give space to pedestrians waiting at a crosswalk for the light to change. AMCI also praised FSD for how it handled blind curves out in the countryside.

“It’s undeniable that FSD 12.5.1 is impressive, for the vast array of human-like responses it does achieve, especially for a camera-based system,” said Guy Mangiamele, director of AMCI Testing.
