Self-Driving Tesla Nearly Hits Oncoming Train, Raises New Concern On Car's Safety ( lemmy.zip )

Craig Doty II, a Tesla owner, narrowly avoided a collision after his vehicle, in Full Self-Driving (FSD) mode, allegedly steered towards an oncoming train.

Nighttime dashcam footage from earlier this month in Ohio captured the harrowing scene: Doty's Tesla rapidly approaching a train with no apparent deceleration. He insisted his Tesla was in Full Self-Driving mode when it barreled towards the train crossing without slowing down.

helpmyusernamewontfi ,

what's so "new" about this concern? I'd probably be able to afford a house if I had a dollar for every article I saw about Teslas wrecking or nearly wrecking because of FSD.

0x0 ,

TL;DR Tesla driver almost won Darwin award.

TammyTobacco ,

It was so foggy that I'm not surprised the car couldn't figure out what was happening. The guy also said his car had driven towards trains twice before, so he's definitely a dumbass for continuing to use self-driving, especially in heavy fog.

skyspydude1 ,

If only there was some sort of sensing technology that wasn't purely optical, that'd be pretty neat. Maybe even using something like radio, for detection and ranging. Too bad no one's ever come up with something like that.

Akasazh ,
@Akasazh@feddit.nl avatar

I don't see any information about the crossing. Was it a crossing without gates? The sensors must've picked those up when driving towards it. If so, it's a huge oversight not to put up gated crossings nowadays, certainly on busy roads, regardless of the performance of self-driving cars.

Woovie ,

The video in the article shows lowered arms flashing. Very visible with plenty of time to stop despite the foggy conditions. It just didn't.

Akasazh ,
@Akasazh@feddit.nl avatar

Ah. I've read it, but I have media turned off, so I didn't see the video. Thanks for the clarification!

Woovie ,

Yep of course!

ElderWendigo ,
@ElderWendigo@sh.itjust.works avatar

Not being able to identify a railroad crossing without a gate is a failing of the car not the train. Gated crossings are not guaranteed, nor should they be because they don't make sense for every situation in which roads and tracks cross.

Akasazh ,
@Akasazh@feddit.nl avatar

True, but it would be an exceptional failure if the car missed a gated crossing, as it turns out it was.

pirat ,

they don't make sense for every situation in which roads and tracks cross.

https://www.youtube.com/watch?v=peXry-_B87g

sp3tr4l , (edited )

This is horrible!

Obviously the Tesla's cold gas thrusters must be malfunctioning! The Fully Autonomous Only Non Insane Car AutoPilot was clearly going to jump over the train Speed Racer style!

Thank goodness the driver realized the thruster failure light was on and was able to avoid the worst.

Edit: This is sarcasm! I hate using /s because I'd like to believe people can tell, but I guess not this time...

buddascrayon ,

When you look at the development notes on self-driving at Tesla, anyone with a brain wouldn't trust that shit, not even a little bit. Most of what they did was placate Musk's petty whims and delusions. Any real R&D issues were basically glossed over or given quick software fixes.

Scolding7300 ,

Are there notes available to read?

buddascrayon ,

Here's one of the few articles that wasn't paywalled when I pulled up a Google search on it. It's really not hard to find; the stories are all over the place.

https://www.businessinsider.com/elon-musk-tesla-autopilot-fsd-engineers-unsettled-2021-8

VirtualOdour ,

Demonstrate what you mean because it really sounds like you're describing what you feel should be true to justify your emotions about the bad Twitter man.

And to be clear, I mean link the documents you claim to have read and the parts of them you claim demonstrate this.

buddascrayon ,

Just need to Google "Tesla self-driving development engineers and Elon Musk", and you'll find lots of articles. Here's one of the few that wasn't paywalled.

https://www.businessinsider.com/elon-musk-tesla-autopilot-fsd-engineers-unsettled-2021-8

WoahWoah ,

This is an article from 2021 about a book researched in 2019.

buddascrayon ,

Yeah, during development of the Tesla self driving system.

WoahWoah , (edited )

Read the development notes from the first years of any technology you use. The research you're "referencing" is six years old at this point.

What's next? You going to criticize an iPod Nano to make a point about the broken screen on your iPhone 8? Criticize Google assistant from 2019 to harangue OpenAI?

Look at what six years of development means: https://youtu.be/qTDlRLeDxxM?si=dFZzLcO_a8wfy2QS

buddascrayon ,

It's not about how well or badly it worked when they were developing it, it's about the developing process. It's about the fact that they had to appease Elon Musk's ego in all aspects of developing the self drive system. To a disastrous degree.

And again, there is a world of difference between the iPhone or OpenAI or Google Assistant not working right and a car driving itself not working right, because when those other things don't work, nobody dies or gets hurt. But a car can maim and kill people extremely easily.

VirtualOdour ,

That's a very old article about even older opinions, now totally outdated, as shown by statements like:

Almost five years on, Tesla still hasn't completed its autonomous road trip — no car company has even come close.

You're using unsubstantiated statements from the start of development, which is totally different from what you claimed before being asked for a source.

Current development FSD has hit huge milestones which competitors have not.

misterundercoat ,

He fortunately avoided the train, but unfortunately still owns a Tesla.

werefreeatlast ,

Oh! As a token of ah....of...aah.. a knowledge mental acknowledgement, we the US people would like to gift this here Tesla to you all, Putin, and Iran leadership. You get a Tesla and you get a Tesla....and you get a Tesla!

Buffalox ,

Obvious strong blinking red light ahead, obvious train passing ahead...

Tesla FSD: Hmmm let's not even slow down, I don't see any signs of problems.

FSD is an acronym for Fool Self Driving.

nyan ,

Are there any classes of object left that Tesla FSD has not either hit or almost hit? Icebergs, maybe?

nifty ,
@nifty@lemmy.world avatar

For now, cars need more than computer vision to navigate because right now adding cameras by themselves doesn’t help a car spatially orient itself in its environment. What might help? I think the consensus is that the cameras need to get a 360 deg view of the surroundings and the car needs a method for making sense of these inputs without that understanding being the focus of attention.

It seems Teslas do add sensors in appropriate locations to be able to do that, but there’s some disconnect in reconciling the information: https://www.notateslaapp.com/news/1452/tesla-guide-number-of-cameras-their-locations-uses-and-how-to-view. A multi-modal sensing system would bypass reliance on getting everything right via CV.

Think of focusing on an object in the distance and moving toward it: while you're using your eyes to look at it, you're subconsciously computing relative distance and speed as you approach it. It's your subconscious memory of your 3D spatial orientation that helps you make corrections and adjustments to your speed and approach. Outside of better hardware that can reconcile these different inputs, relying on different sensor inputs would make for the most robust approach for autonomous vehicles.

Humans essentially keep track of their body in 3D space and time without thinking about it, and actually most multicellular organisms have learned to do this in some manner.
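The "reconciling the information" step being described is basically sensor fusion. A minimal sketch (hypothetical numbers and function names, nothing to do with Tesla's actual pipeline) of inverse-variance weighting two noisy distance estimates:

```python
# Minimal sensor-fusion sketch: combine two noisy distance estimates
# (e.g. camera depth and radar range) by inverse-variance weighting.
# All numbers are illustrative.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> tuple[float, float]:
    """Return the fused estimate and its variance (lower variance = more trust)."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Camera says 48 m but is unreliable in fog (high variance);
# radar says 42 m with much tighter error bars.
dist, var = fuse(48.0, 25.0, 42.0, 1.0)
print(round(dist, 2), round(var, 2))  # → 42.23 0.96, leaning heavily on radar
```

The point is that neither sensor is simply "right": each reading carries an uncertainty, and the fused estimate is both closer to the more reliable sensor and more certain than either input alone.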

itsonlygeorge ,

Tesla opted not to use LIDAR as part of its sensor package and instead relies on cameras which are not enough to determine accurate location data for other cars/trains etc.

This is what you get when billionaires cheap out on their products.

skyspydude1 ,

Not only that, but they took out the radar, which, while it has its own flaws, would have had no issue seeing the train through the fog. While they claimed it was because they had "solved vision" and didn't need it anymore, that's bullshit, and their engineering team knew it. They were in the middle of sourcing a new radar, but because of supply chain limitations (like everyone in 2021) with both their old and potential new suppliers, waiting would have broken their "infinite growth" narrative and fElon wouldn't have gotten his insane pay package. They knew for a fact it would negatively affect performance significantly, but did it anyway so the line could go up.

While no automotive company's hands are particularly clean, the sheer level of willful negligence at Tesla is absolutely astonishing. I have seen and heard so many stories about their shitty engineering practices that the only impressive thing is how relatively few people have died as a direct result of their lax attitude towards basic safety practices.

Imalostmerchant ,

I never understood Musk's reasoning for this decision. From my recollection it was basically "how do you decide who's right when lidar and camera disagree?" And it felt so insane to say that the solution to conflicting data was not to figure out which is right, but only to listen to one.

Jakeroxs ,

Also, LIDAR is more expensive than cameras, which means a higher end-user price, as far as I remember.

Imalostmerchant ,

I wasn't sure if he admitted that as being the reason (even though it obviously is)

Jakeroxs ,

I thought that was his main justification, idk tho, I don't listen to the earnings calls or interviews myself lol

Cornpop ,

All about saving a buck.

wirehead ,

I mean, I think he's a textbook example of why not to do drugs and why we need to eat the rich, but I can understand the logic here.

When you navigate a car as a human, you are using vision, not LIDAR. Outside of a few edge cases, you aren't even using parallax to judge distances. Ergo, a LIDAR is not going to see the text on a sign, the reflective stripes on a truck, etc. And it gets confused differently than the eye, absorbed by different wavelengths, etc. And you can jam LIDAR if you want. Thus, if we were content to wait until the self-driving-car is actually safe before throwing it out into the world, we'd probably want the standard to be that it navigates as well as a human in all situations using only visual sensors.

Except, there's some huge problems that the human visual cortex makes look real easy. Because "all situations" means "understanding that there's a kid playing in the street from visual cues so I'm going to assume they are going to do something dumb" or "some guy put a warning sign on the road and it's got really bad handwriting"

Thus, the real problem is that he's not using LIDAR as harm reduction for a patently unsafe product, where the various failure modes of the LIDAR-equipped self-driving cars show that those aren't safe either.

smonkeysnilas ,

I mean, the decision was stupid from an engineering point of view, but the reasoning is not entirely off. Basically it follows the biological example: if humans can drive without lidar, using only their eyes, then that's proof it's possible somehow. It's just that current computer vision and AI tech is way worse than humans. Elon chose to ignore this, basically arguing that it's merely a software problem for his developers to figure out. I guess in reality it's a bit more complex.

Wrench ,

LIDAR would have similarly been degraded in the foggy conditions that this occurred in. Lasers are light too.

While I do think Tesla holds plenty of responsibility for their intentionally misleading branding in FSD, as well as cost saving measures to not include lidar and/or radar, this particular instance boils down to yet another shitty and irresponsible driver.

You should not be relying on FSD over train tracks. You should not be allowing FSD to be going faster than conditions allow. Dude was tearing down the road in thick fog, way faster than was safe for the conditions.

WoahWoah ,

Well said.

FreddyDunningKruger ,

One of the first things you learn to get your driver's license is the Basic Speed Law, you must not drive faster than the driving conditions would allow. If only Full Self Driving followed the law and reduced its max speed based on the same.

Rekorse ,

If you were to take that rule strictly, you should not allow FSD to drive at all, as at any speed it's more dangerous than the person driving it (given an average driver who's not intoxicated).

Rekorse ,

A Tesla driver might get the impression that the car's "opinion" is better than their own, which could cause them to hesitate before intervening, or to allow the car to drive in a way they're uncomfortable with.

The misinformation about the car reaches the level of negligence because even smart people are being duped by this.

Honestly, I think some people just don't believe someone could lie so publicly and loudly and often, so it must be something else besides a grift.

Pazuzu ,

Maybe it shouldn't be called full self driving if it's not fully capable of self driving

MacStache ,

Those trains sure are weird and confusing, with their back and forth on those tracks and all. Makes you wonder about train safety, it does!

n3m37h , (edited )

10x safer than a human!

// added a fun link

angelmountain ,

I don't want to disagree, but I would like a source to support this claim

raspberriesareyummy ,

That exclamation point in the comment you replied to should be your hint that it's sarcasm.

n3m37h ,

No, Musk said this at one point at some press conference

raspberriesareyummy ,

Not the person I was replying to but okay... what has that got to do with the sarcasm of the comment I was referring to?

n3m37h ,

Look @ original comment, Musk has stated many a time that FSD is safer than human drivers and I'm pretty sure at one point he said it was 10x safer... No sarcasm in that statement.

Oh here

n3m37h ,

Musk has multiple times stated that FSD is safer than human driving. I'm not gonna bother finding the vids as I'm at work.

n3m37h ,

https://motherfrunker.ca/fsd/

Here ya go someone else did it for me

Gsus4 ,
@Gsus4@mander.xyz avatar

*a drunken human

ArtemisimetrA ,

Let them earn their Darwin awards

explodicle ,

I'd rather see some Free Market Darwinism™ in the form of a lawsuit.

ArtemisimetrA ,

Yeah I guess I'd be ok with that. I may have lost faith in our judicial system to get that shit done

Furbag ,

Oh boy, and they just removed "steering wheel nag" in a recent update. I can't imagine that will have any unintended consequences.

WoahWoah ,

Not really. They just removed unprompted nag. If you're not constantly keeping a hand on the wheel and looking at the road, it nags more and will pull you over if you ignore it.

If you turn off the internal driver monitoring camera, you can't engage FSD or even use lane assist.
