
I drove Teslas for 4 years and recently went back to a humble Toyota. I really don't think pure-camera FSD will ever work properly. Between a Model 3 and a Model X I've seen them do so many goofy things on the road, and get stumped by very common environmental conditions such as fog, or strong sun with snow on the ground (glare). I've seen FSD get confused by poorly marked roads and swerve into exit lanes, or drive on the completely wrong side of the road. I stopped trusting it, and I think cameras alone are just not fault tolerant enough.


RGB cameras aren't, but there are other kinds of vision they could be using, like IR and polarization cameras or event cameras for motion.

It's certainly not good that they've picked a sensor suite strictly worse than human vision.


I suspect the greater shortcoming isn't in the sensor suite but in the intelligence department. Either way, what Tesla is currently putting on public roads seems criminally negligent, and I simply don't understand why no TLA has intervened yet.


How is it "criminally negligent" when the driver has the ability and the obligation to take over if the car starts to do something weird?


You'd have to be actively ignoring reality to not know of Tesla's problems in the autonomy department.

Just search for phantom braking:

https://www.youtube.com/watch?v=6HDbDXeRSPw

That's not even getting into the insanity of having random members of the public LARP as FSD QA on public roads, shared with other random, unwitting members of the public who, given the choice, wouldn't want any of this happening around them while driving their kids to school.


I think it's because we see everyone texting while driving. No one wants to enforce safe driving.



