Self-Driving Tesla Nearly Hits Oncoming Train, Raises New Concern On Car's Safety (lemmy.zip)

Craig Doty II, a Tesla owner, narrowly avoided a collision after his vehicle, in Full Self-Driving (FSD) mode, allegedly steered towards an oncoming train.

Nighttime dashcam footage from earlier this month in Ohio captured the harrowing scene: Doty's Tesla rapidly approaching a train with no apparent deceleration. He insisted his Tesla was in Full Self-Driving mode when it barreled towards the train crossing without slowing down.

gardylou ,

"Sorry about your dead husband, trains weren't in the training data. Our bad. Anyway, his loss is not in vain, as now that our engineers are aware that trains can be a potential driving hazard, we are going to fix this soon in a future software update."

uebquauntbez ,

Hyperloops .... hype ... oops

lemmyhavesome ,

Full Self Demolition

captain_aggravated ,

I hope Tesla owners only get themselves killed.

Fades ,

What a horrible thing to say, especially since Elon and Tesla have only relatively recently turned to absolute shit. There are a lot of Tesla drivers that don't support what he has done to the company and all that.

Here you are advocating for the death of people because they purchased a vehicle. A lot of people bought Teslas as they were one of the better EVs at the time during Tesla's climb to their peak (which they have since fallen very far from). They too deserve death?

captain_aggravated ,

Here you are advocating for the death of people because they purchased a vehicle.

No; I'm expressing the same sentiment that I express for motorcycle riders that refuse to wear a helmet. I really, genuinely don't care if they beat their brains out on the front bumper of a Hyundai, but I don't think they get to force a Hyundai driver to hose brains off their car.

Teslas are death traps. Their owners can make that choice for themselves but I don't think they get to make it for others, which is what they try to do every time they turn on that self-driving feature.

Ozymati ,

Guess it gained self awareness and realised it was a tesla

enleeten ,

"Your father's a Cybertruck!"

Etterra ,

TFW even your Tesla thinks your Elon fanboy tweets are insufferable.

Honytawk ,
Jakeroxs ,

This is showing it works, or no? I can't tell and there isn't audio; it seems like it stopped correctly.

SaltySalamander ,

Definitely shows it working.

Jakeroxs ,

Yeah, I was thinking maybe the weird flashing lights on the screen were being pointed to as not working right, or something to that effect? Idk lol, no context provided at all

noxy ,

Feels like these things were more capable a decade ago when they had radar.

Not that they should be called "full self driving" either then or now, but at least radar can deal with fog better than regular ass cameras.

Furbag ,

Oh boy, and they just removed "steering wheel nag" in a recent update. I can't imagine that will have any unintended consequences.

WoahWoah ,

Not really. They just removed unprompted nag. If you're not constantly keeping a hand on the wheel and looking at the road, it nags more and will pull you over if you ignore it.

If you turn off the internal driver monitoring camera, you can't engage FSD or even use lane assist.

ArtemisimetrA ,

Let them earn their Darwin awards

explodicle ,

I'd rather see some Free Market Darwinism™ in the form of a lawsuit.

ArtemisimetrA ,

Yeah I guess I'd be ok with that. I may have lost faith in our judicial system to get that shit done

n3m37h , (edited )

10x safer than a human!

// added a fun link

angelmountain ,

I don't want to disagree, but I would like a source to support this claim

raspberriesareyummy ,

That exclamation point in the comment you replied to should be your hint that it's sarcasm.

n3m37h ,

No, Musk said this at one point at some press conference

raspberriesareyummy ,

Not the person I was replying to but okay... what has that got to do with the sarcasm of the comment I was referring to?

n3m37h ,

Look at the original comment. Musk has stated many a time that FSD is safer than human drivers, and I'm pretty sure at one point he said it was 10x safer... No sarcasm in that statement.

Oh here

n3m37h ,

Musk has multiple times stated that FSD is safer than human driving. I'm not gonna bother finding the vids as I'm at work

n3m37h ,

https://motherfrunker.ca/fsd/

Here ya go someone else did it for me

Gsus4 ,

*a drunken human

MacStache ,

Those trains sure are weird and confusing, with their back and forth on those tracks and all. Makes you wonder about train safety, it does!

itsonlygeorge ,

Tesla opted not to use LIDAR as part of its sensor package and instead relies on cameras, which are not enough to determine accurate location data for other cars, trains, etc.

This is what you get when billionaires cheap out on their products.

skyspydude1 ,

Not only that, but they took out the radar, which, while it has its own flaws, would have had no issue seeing the train through the fog. They claimed it was because they had "solved vision" and didn't need it anymore, but that's bullshit, and their engineering team knew it. They were in the middle of sourcing a new radar, but they hit supply chain limitations with both their old and potential new suppliers (like everyone in 2021), and waiting would have broken their "infinite growth" narrative and cost fElon his insane pay package. They knew for a fact removing it would significantly degrade performance, but did it anyway so line could go up.

While no automotive company's hands are particularly clean, the sheer level of willful negligence at Tesla is absolutely astonishing. I have seen and heard so many stories about their shitty engineering practices that the only impressive thing is how relatively few people have died as a direct result of their lax attitude towards basic safety practices.

Imalostmerchant ,

I never understood Musk's reasoning for this decision. From my recollection it was basically "how do you decide who's right when lidar and camera disagree?" It felt so insane to say that the solution to conflicting data was not to figure out which is right, but to only listen to one.
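For what it's worth, sensor disagreement has a standard textbook answer: you don't pick a winner, you weight each estimate by its confidence, as in a one-step Kalman update. A minimal sketch (purely illustrative, not anything from Tesla's actual stack; the function and all numbers are made up):

```python
def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> tuple[float, float]:
    """Inverse-variance weighted fusion of two independent estimates.

    Each sensor reports a value and a variance (how uncertain it is);
    the fused result leans toward the more confident sensor.
    """
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # fused estimate is more certain than either input
    return fused, fused_var

# Camera thinks the obstacle is 40 m away but is unsure in fog (variance 25);
# lidar says 25 m with tight error bars (variance 1).
dist, var = fuse(40.0, 25.0, 25.0, 1.0)
print(round(dist, 1))  # 25.6 -- dominated by the more confident sensor
```

The point is that "which one is right?" is a false dilemma: fusing both always yields a lower-variance estimate than either sensor alone.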

Jakeroxs ,

Also, LIDAR is more expensive than cameras, which means a higher end-user price, as far as I remember.

Imalostmerchant ,

I wasn't sure if he admitted that as being the reason (even though it obviously is)

Jakeroxs ,

I thought that was his main justification, idk tho, I don't listen to the earnings calls or interviews myself lol

Cornpop ,

All about saving a buck.

wirehead ,

I mean, I think he's a textbook example of why not to do drugs and why we need to eat the rich, but I can understand the logic here.

When you navigate a car as a human, you are using vision, not LIDAR. Outside of a few edge cases, you aren't even using parallax to judge distances. Ergo, a LIDAR is not going to see the text on a sign, the reflective stripes on a truck, etc. And it gets confused differently than the eye, absorbed by different wavelengths, etc. And you can jam LIDAR if you want. Thus, if we were content to wait until the self-driving-car is actually safe before throwing it out into the world, we'd probably want the standard to be that it navigates as well as a human in all situations using only visual sensors.

Except, there's some huge problems that the human visual cortex makes look real easy. Because "all situations" means "understanding that there's a kid playing in the street from visual cues so I'm going to assume they are going to do something dumb" or "some guy put a warning sign on the road and it's got really bad handwriting"

Thus, the real problem is that he's not using LIDAR as harm reduction for a patently unsafe product, where the various failure modes of the LIDAR-equipped self-driving cars show that those aren't safe either.

smonkeysnilas ,

I mean the decision was stupid from an engineering point of view, but the reasoning is not entirely off. Basically it follows the biological example: if humans can drive without Lidar, only using their eyes, then this is proof that it is possible somehow. It's only that the current computer vision and AI tech is way worse than humans. Elon chose to ignore this, basically arguing that it is merely a software problem for his developers to figure out. I guess in reality it is a bit more complex.

Wrench ,

LIDAR would have similarly been degraded in the foggy conditions that this occurred in. Lasers are light too.

While I do think Tesla holds plenty of responsibility for their intentionally misleading branding in FSD, as well as cost saving measures to not include lidar and/or radar, this particular instance boils down to yet another shitty and irresponsible driver.

You should not be relying on FSD over train tracks. You should not be allowing FSD to be going faster than conditions allow. Dude was tearing down the road in thick fog, way faster than was safe for the conditions.

WoahWoah ,

Well said.

FreddyDunningKruger ,

One of the first things you learn to get your driver's license is the Basic Speed Law: you must not drive faster than the driving conditions allow. If only Full Self Driving followed that law and reduced its max speed accordingly.

Rekorse ,

If you were to take that rule strictly, you should not allow FSD to drive at all, as at any speed it's more dangerous than the person driving it (given an average driver who's not intoxicated).

Rekorse ,

A Tesla driver might get the impression that the car's "opinion" is better than their own, which could cause them to hesitate before intervening, or to allow the car to drive in a way they are uncomfortable with.

The misinformation about the car reaches the level of negligence, because even smart people are being duped by this.

Honestly, I think some people just don't believe someone could lie so publicly, loudly, and often; they assume it must be something else besides a grift.

Pazuzu ,

Maybe it shouldn't be called full self driving if it's not fully capable of self driving

nifty ,

For now, cars need more than computer vision to navigate, because adding cameras by itself doesn’t help a car spatially orient itself in its environment. What might help? I think the consensus is that the cameras need a 360 deg view of the surroundings, and the car needs a method for making sense of these inputs without that understanding being the focus of attention.

It seems Teslas do add sensors in appropriate locations to be able to do that, but there’s some disconnect in reconciling the information: https://www.notateslaapp.com/news/1452/tesla-guide-number-of-cameras-their-locations-uses-and-how-to-view. A multi-modal sensing system would bypass reliance on getting everything right via CV.

Think of focusing on an object in the distance and moving toward it: while you’re using your eyes to look at it, you’re subconsciously computing relative distance and speed as you approach. It’s your subconscious memory of your 3D spatial orientation that helps you make corrections and adjustments to your speed and approach. Short of better hardware that can reconcile these different inputs, relying on multiple sensor inputs makes for the most robust approach for autonomous vehicles.

Humans essentially keep track of their body in 3D space and time without thinking about it, and actually most multicellular organisms have learned to do this in some manner.
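That subconscious "am I closing on this object?" computation has a classic vision-only analogue: time-to-contact, estimated from how fast an object grows in the image. A toy sketch (hypothetical function and numbers, just to illustrate the idea; note it yields closing time, not absolute distance, which is part of why vision-only ranging is hard):

```python
def time_to_contact(size_prev: float, size_now: float, dt: float) -> float:
    """Estimate seconds until contact from an object's apparent size
    (in pixels) across two frames taken dt seconds apart.

    time-to-contact ("tau") = current size / rate of size growth.
    """
    growth_rate = (size_now - size_prev) / dt  # pixels per second
    if growth_rate <= 0:
        return float("inf")                    # shrinking or static: not approaching
    return size_now / growth_rate

# An obstacle grows from 100 px to 110 px over 0.1 s:
print(round(time_to_contact(100.0, 110.0, 0.1), 2))  # 1.1 -- about a second to impact
```

A camera alone can tell you *when* you'll hit something, but not reliably *how far away* it is; depth sensors like radar or lidar supply the missing absolute range.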

nyan ,

Are there any classes of object left that Tesla FSD has not either hit or almost hit? Icebergs, maybe?

WhiskyTangoFoxtrot ,
Buffalox ,

Obvious strong blinking red light ahead, obvious train passing ahead...

Tesla FSD: Hmmm let's not even slow down, I don't see any signs of problems.

FSD is an acronym for Fool Self Driving.
