Self-Driving Tesla Nearly Hits Oncoming Train, Raises New Concerns About Car Safety (lemmy.zip)

Craig Doty II, a Tesla owner, narrowly avoided a collision after his vehicle, in Full Self-Driving (FSD) mode, allegedly steered towards an oncoming train.

Nighttime dashcam footage from earlier this month in Ohio captured the harrowing scene: Doty's Tesla rapidly approaching a train with no apparent deceleration. He insisted his Tesla was in Full Self-Driving mode when it barreled towards the train crossing without slowing down.

werefreeatlast ,

Oh! As a token of ah....of...aah.. a knowledge mental acknowledgement, we the US people would like to gift this here Tesla to you all, Putin, and Iran leadership. You get a Tesla and you get a Tesla....and you get a Tesla!

misterundercoat ,

He fortunately avoided the train, but unfortunately still owns a Tesla.

buddascrayon ,

When you look at the development notes on self-driving at Tesla, anyone with a brain wouldn't trust that shit, not even a little bit. Most of what they did was placate Musk's petty whims and delusions. Any real R&D issues were basically glossed over and given quick software fixes.

Scolding7300 ,

Are there notes available to read?

buddascrayon ,

Here's one of the few articles that wasn't paywalled when I pulled up a Google search on it. It's really not hard to find; the stories are all over the place.

https://www.businessinsider.com/elon-musk-tesla-autopilot-fsd-engineers-unsettled-2021-8

VirtualOdour ,

Demonstrate what you mean because it really sounds like you're describing what you feel should be true to justify your emotions about the bad Twitter man.

And to be clear, I mean link the documents you claim to have read and the parts of them you claim demonstrate this.

buddascrayon ,

Just need to Google "Tesla self-driving development engineers and Elon Musk", and you'll find lots of articles. Here's one of the few that wasn't paywalled.

https://www.businessinsider.com/elon-musk-tesla-autopilot-fsd-engineers-unsettled-2021-8

WoahWoah ,

This is an article from 2021 about a book researched in 2019.

buddascrayon ,

Yeah, during development of the Tesla self-driving system.

WoahWoah , (edited )

Read the development notes from the first years of any technology you use. The research you're "referencing" is six years old at this point.

What's next? You going to criticize an iPod Nano to make a point about the broken screen on your iPhone 8? Criticize Google Assistant from 2019 to harangue OpenAI?

Look at what six years of development means: https://youtu.be/qTDlRLeDxxM?si=dFZzLcO_a8wfy2QS

buddascrayon ,

It's not about how well or badly it worked when they were developing it; it's about the development process. It's about the fact that they had to appease Elon Musk's ego in all aspects of developing the self-driving system. To a disastrous degree.

And again, there is a world of difference between the iPhone or OpenAI or Google Assistant not working right and a car driving itself not working right, because when those other things don't work, nobody dies or gets hurt. But a car can maim and kill people extremely easily.

VirtualOdour ,

That's a very old article about even older opinions, now totally outdated, as shown by statements like:

Almost five years on, Tesla still hasn't completed its autonomous road trip — no car company has even come close.

You're using unsubstantiated statements from the start of development, which is totally different from what you claimed before being asked for a source.

Current FSD development has hit huge milestones which competitors have not.

sp3tr4l , (edited )

This is horrible!

Obviously the Tesla's cold gas thrusters must be malfunctioning! The Fully Autonomous Only Non Insane Car AutoPilot was clearly going to jump over the train Speed Racer style!

Thank goodness the driver realized the thruster failure light was on and was able to avoid the worst.

Edit: This is sarcasm! I hate using /s because I'd like to believe people can tell, but I guess not this time...

Akasazh ,

I don't see any information about the crossing. Was it a crossing without gates? The sensors must've picked those up when driving towards it. If so, it's a huge oversight not to put up gated crossings nowadays, certainly on busy roads, regardless of the performance of self-driving cars.

Woovie ,

The video in the article shows the lowered arms flashing. They're very visible, with plenty of time to stop despite the foggy conditions. It just didn't.

Akasazh ,

Ah. I've read it, but I have media turned off, so I didn't see the video. Thanks for the clarification!

Woovie ,

Yep of course!

ElderWendigo ,

Not being able to identify a railroad crossing without a gate is a failing of the car, not the train. Gated crossings are not guaranteed, nor should they be, because they don't make sense for every situation in which roads and tracks cross.

Akasazh ,

True, but it would be an exceptional failure if the car missed a gated crossing, as it turns out this one was.

pirat ,

they don't make sense for every situation in which roads and tracks cross.

https://www.youtube.com/watch?v=peXry-_B87g

0x0 ,

TL;DR Tesla driver almost won Darwin award.

TammyTobacco ,

It was so foggy that I'm not surprised the car couldn't figure out what was happening. The guy also said his car had driven towards trains twice before, so he's definitely a dumbass for continuing to use self-driving, especially in heavy fog.

skyspydude1 ,

If only there was some sort of sensing technology that wasn't purely optical, that'd be pretty neat. Maybe even using something like radio, for detection and ranging. Too bad no one's ever come up with something like that.

helpmyusernamewontfi ,

what's so "new" about this concern? I'd probably be able to afford a house if I had a dollar for every article I saw on Tesla's wrecking or nearly wrecking because of FSD.

FangedWyvern42 ,

Every couple of months there's a new story like this. And yet we're supposed to believe this system is ready for use…

darki ,

It is ready because Musk needs it to be ready. Watch out, this comment may bring morale down, and Elron will be forced to... cry like a baby 😆

Buffalox ,

Didn't he recently claim Tesla robotaxi is only months away?
Well I suppose he didn't say how many months, but the implication was less than a year, which has been his claim every year since 2016.

dustyData ,

He said that Teslas were an investment worth hundreds of thousands of dollars because owners would be able to use them as robotaxis when they weren't using the car themselves and charge a small fee, by next year… in 2019. Back then he promised 1 million robotaxis nationwide in under a year. Recently he gave the date August 8 to reveal a new model of robotaxi. So, by Cybertruck estimates, I would say a Tesla robotaxi is a possibility by late 2030.

He is just spewing shit to keep the stock price afloat, as usual.

dual_sport_dork ,

He also said they were ready to manufacture the 2nd-generation Tesla Roadster "now," which was back in 2014. No points for guessing that as of yet (despite taking in millions of dollars in preorders) they have not produced a single one.

Given this very early and still quite relevant warning, I'm astounded that anyone is dumb enough to believe any promise Elon makes about anything.

Thorny_Insight ,

In what way is it not ready to use? Do cars have some other driver-assistance features that are foolproof? You're not supposed to blindly trust any of those. Why would FSD be an exception? The standards people are applying to it are quite unreasonable.

ammonium ,

Because it's called Full Self Drive and Musk has said it will be able to drive without user intervention?

Thorny_Insight ,

It's called Full Self Driving (Supervised)

Yeah, it will be able to drive without driver intervention eventually. At least that's their goal. Right now, however, it's Level 2, and no one is claiming otherwise.

In what way is it not ready to use?

noxy ,

Full Self Driving (sike!)

dream_weasel ,

The naming is poor, but in no way does the car represent to you that no intervention is required. It also constantly asks you for input and even watches your eyes to make sure you pay attention.

Honytawk ,

The car maybe not, but the marketing sure does

dream_weasel ,

Marketing besides the naming we have already established, and Elon himself masturbating to it? Is there some other marketing that pushes this narrative? Because I certainly have not seen it.

Holyginz ,

No, the standards people are applying to it are the bare minimum for a full self-driving system like what Musk claims it is.

Thorny_Insight ,

It's a Level 2 self-driving system, which by definition requires driver supervision. It's even stated in the name. What are the standards it doesn't meet?
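
For reference, here's a quick sketch of the SAE automation levels this argument keeps invoking (descriptions paraphrased from SAE J3016; the Level 2 placement of FSD (Supervised) is as described in this thread):

```python
# SAE J3016 driving-automation levels, paraphrased as a quick reference.
from enum import IntEnum

class SAELevel(IntEnum):
    L0 = 0  # No automation: the human does everything.
    L1 = 1  # Driver assistance: steering OR speed support (e.g. cruise control).
    L2 = 2  # Partial automation: steering AND speed, but the human must
            # supervise constantly -- where this thread places FSD (Supervised).
    L3 = 3  # Conditional automation: system drives; human takes over on request.
    L4 = 4  # High automation: no human needed within a limited operating domain.
    L5 = 5  # Full automation: no human needed anywhere.

# The whole argument in this thread is the gap between these two:
print(SAELevel.L2 < SAELevel.L5)  # True
```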

piranhaphish ,

It's unreasonable for FSD to see a train? One that's 20 ft tall and a mile long? Am I understanding you correctly?

Foolproof would be great, but I think most people would set the bar at least as high as not getting killed by a train.

Thorny_Insight ,

Did you watch the video? It was insanely foggy there. It makes no difference how big the obstacle is if you can't even see 50 meters ahead of you.

Also, the car did see the train. It just clearly didn't understand what it was or how to react to it. That's why the car has a driver who does. I'm sure this exact edge case will be added to the training data so that this doesn't happen again. Stuff like this takes ages to iron out. FSD is not a finished product. It's under development, receives constant updates, and keeps improving. That's why it's classified as Level 2 and not Level 5.

Yes. It's unreasonable to expect brand-new technology to be able to deal with every possible scenario a car can encounter in traffic. Just because the concept of a train in fog makes sense to you as a human doesn't mean it's obvious to the AI.

piranhaphish ,

In what way is it not ready to use?

To me it seems you just spent three paragraphs answering your own question.

can't even see 50 meters ahead

didn't understand what it was and how to react to it

FSD is not a finished product. It's under development

doesn't mean it's obvious to the AI

If I couldn't trust a system not to drive into a train, I don't feel like I would trust it to do even the most common tasks. I would drive the car like a fully attentive human and not delude myself into thinking the car is driving me with "FSD."

Thorny_Insight ,

You can't see 50 meters ahead in that fog.

piranhaphish ,

Completely true. And I would dictate my driving characteristics based on that fact.

I would drive at a speed and in a manner that would allow me to not almost crash into things. But especially trains.

Thorny_Insight ,

I agree. In fact I'm surprised the vehicle even lets you enable FSD in that kind of poor visibility, and based on the video it seemed to be going quite fast as well.

Honytawk ,

LIDAR can

Thorny_Insight ,

Yeah, there's a wide range of ways to map the surroundings. Road infrastructure, however, is designed for vision, so I don't see why cameras alone wouldn't be sufficient. The issue here is not that it didn't see the train - it's on video, after all - but that it didn't know how to react to it.

assassin_aragorn ,

You’re not supposed to blindly trust any of those. Why would FSD be an exception?

Because that's how Elon (and by extension Tesla) market it. Full self driving. If they're saying I can blindly trust their product, then I expect it to be safe to blindly trust it.

And if the fine print says I can't blindly trust it, they need to be sued or put under legal pressure to change the term, because it's incredibly misleading.

Thorny_Insight ,

Full Self Driving (Beta), nowadays Full Self Driving (Supervised)

Which of those names invokes trust to put your life in its hands?

It's not in fine print. It's told to you when you purchase FSD, and the vehicle reminds you of it every single time you enable the system. If you're looking at your phone, it starts nagging at you, eventually locking you out of the feature. Why would they put a driver-monitoring system in place if you're supposed to put blind faith in it?

That is such an old, beat-up strawman argument. Yes, Elon has said it would be fully autonomous in a year or so, which turned out to be a lie, but nobody today is claiming it can be blindly trusted. That simply is not true.

assassin_aragorn ,

Unfortunately, companies also have to make their products safe for idiots. If the system is in beta or must be supervised, there should be an inherently safe design that prevents situations like this from happening even if an idiot is at the wheel.

Thorny_Insight ,

ESP is not idiot-proof either, just to name one such feature that's been available for decades. It assists the driver but doesn't replace them.

Hell, cars themselves are not idiot proof.

SaltySalamander ,

Hell, cars themselves are not idiot proof.

Yup, almost always there's an idiot in the driver's seat.

VirtualOdour ,

Yeah, and cars should have a system to stop idiots doing dumb things. The best we have is a license, so if that's good enough for cars without added safety features, it's good enough for cars with them.

Honytawk ,

It isn't Full Self Driving if it is supervised.

It's especially not Full Self Driving if it asks you to intervene.

It is false advertising at best, deadly at worst.

Thorny_Insight ,

It's misleading advertising for sure. At no point have I claimed otherwise.

The meaning of what qualifies as "full self driving" is still up for debate, however. There are worse human drivers on the roads than the current version of FSD. It's by no means flawless, but it's much better than most people realize. It's a vehicle capable of self-driving, even if not fully.

noxy ,

Of what words is FSD an acronym?

dream_weasel ,

Every couple of months you hear about every issue like this, just like you hear about every airline malfunction. It ignores the base rate of accurate performance, which is very high.

FSD is imperfect but still probably more ready for use than a substantial fraction of human drivers.

buddascrayon ,

This isn't actually true. The Tesla Full Self-Driving issues we hear about in the news are the ones that result in fatal and near-fatal accidents, but the forums are chock-full of reports from owners of the thing malfunctioning on a regular basis.

dream_weasel ,

It IS actually true. It does goofy stuff in some situations, but on the whole it's a little better than your typical relatively inexperienced driver. It sometimes gets it wrong about when to be assertive and when to wait, it thinks there's enough space for a courteous merge when there isn't (it does some Chicago-style merges), it follows the lines on the road like they're gospel, and it doesn't always properly judge how to come to a smooth and comfortable stop. These are annoying things, but not outrageous, provided you are paying attention like you're obliged to do.

I have it, I use it, and I make lots of reports to Tesla. It is way better than it used to be and still has plenty of room to improve, but a Tesla can't reboot without having a disparaging article written about it.

Also fuck elon, because I don't think it gets said enough.

bane_killgrind ,

typical relatively inexperienced driver

Look at the rates at which teenagers crash; this is an indictment.

provided you are paying attention

It was advertised as fully autonomous, dude. People wouldn't have this much of a hard-on for trashing it if it weren't so oversold.

Thorny_Insight ,

This fully-autonomous argument has been beaten to death already. Every single Tesla owner knows you're supposed to pay attention and be ready to take over when necessary. That is such a strawman argument. Nobody blames the car when automatic braking fails to see the car in front of it. It might save your ass if you're distracted, but ultimately it's always the driver who's responsible. FSD is no different.

Rekorse ,

You realize it can be true both that the driver is at fault when they crash and that the crash was more likely to happen because Elon constantly contradicts his own marketing team and confuses people.

He literally would take reporters in his car and take his hands off the wheel. He just fundamentally doesn't care about safety now, and probably won't care about safety later; he just saw a way to make some money.

Pazuzu ,

If it's not fully capable of self-driving, then maybe they shouldn't call it Full Self Driving.

Thorny_Insight ,

Sure. Make them change the name to something different. I'm fine with that.

Though I still don't know what most people actually mean by full self driving and how it's different from what FSD can do right now.

buddascrayon ,

Seriously, you sound like a Mac user in the '90s. "It only crashes 8 or 9 times a day; it's so much better than it used to be. It's got so many great features that I'm willing to deal with a little inconvenience…" The difference being that when a Mac crashes, it just loses some data and has to reboot, but when a Tesla crashes, people die.

dream_weasel ,

These are serious rate differences, man.

Every driver, and even Tesla, will tell you it's a work in progress, and you'd be hard-pressed to find someone who has had an accident with it. I'd be willing to bet money that IF you find someone who has had an accident, they have a driving record that's shitty without it too.

If you want to talk stats, let's talk stats, but "It seems like Tesla is in the news a lot for near crashes" is a pretty weak metric, even from your armchair.

Rekorse ,

Is 200-ish crashes and 6 deaths per year too many?

I know it's an absolute number, but we're asking whether it's worth sacrificing people for the potential of safer driving later.

Can you explain why you are so confident that this will all be worth it in the end?

Evidence that Teslas are more dangerous than other cars: https://www.thedrive.com/news/tesla-drivers-have-the-highest-crash-rate-of-any-brand-study

Evidence for the 200 crashes and 6 deaths a year claim for FSD: https://www.theverge.com/2024/4/26/24141361/tesla-autopilot-fsd-nhtsa-investigation-report-crash-death

frostysauce ,

Seriously you sound like a ~~Mac user in the '90s~~ Linux user today.

FTFY

lolcatnip ,

You hear so much about the people Jeffrey Dahmer murdered, but never anything about all the people he didn't murder!

dream_weasel ,

Cute.

Here's some actual information

People are terrible at probability estimation, and even with two fatal accidents a month, FSD is likely still safer than most of the people on the road per million miles driven.
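
To make that concrete, here's a minimal sketch of the per-million-miles comparison; every input below is a placeholder assumption, not a verified crash or mileage figure, and the conclusion flips depending on which denominators you trust:

```python
# All numbers are illustrative assumptions, not verified Tesla/NHTSA data.

def fatal_rate_per_million_miles(fatal_crashes: int, miles: float) -> float:
    """Fatal crashes per million miles driven."""
    return fatal_crashes / (miles / 1_000_000)

# Assumption: ~2 fatal FSD crashes/month over an assumed annual fleet mileage.
fsd = fatal_rate_per_million_miles(24, 500_000_000)
# Assumption: rough US-wide figures (~40k deaths, ~3.2 trillion miles/year).
human = fatal_rate_per_million_miles(40_000, 3_200_000_000_000)

print(f"FSD (assumed):   {fsd:.4f} per million miles")
print(f"Human (assumed): {human:.4f} per million miles")
# Which rate is lower depends entirely on the inputs you plug in.
```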

lolcatnip ,

I see you've decided to be condescending, and also made a falsifiable claim. This is the part where you bring some actual data or STFU.

dream_weasel , (edited )

Whatever you say, Mr. Dahmer-joke-instead-of-content. I see that was really all in good faith, and maybe I unintentionally hurt your feelings by citing a source on base-rate biases?

What data would you like me to bring to the discussion, since you've been so open thus far? Do you want me to bring some data showing that Teslas spend more time not having accidents than having accidents? I'm happy to go do some homework to enrich this interaction.

It's not as though you can just ask Tesla for every case of an FSD crash. The falsifiable claim is just me tossing out a number; the point is that memorable bad press and actual statistics are not the same.

Rekorse ,

Why would you think there isn't data on Tesla crashes? Are they hiding their broken cars from bystanders and police or something now?

riodoro1 ,

What a bunch of morons people were in 1912 to believe a ship could be unsinkable. Amirite guys?

Pazuzu ,

The Titanic probably wouldn't have sunk if it had hit the iceberg head-on. Clearly the Tesla simply mistook the train for an iceberg and itself for an ocean liner and opted for a more ideal collision. The driver should have disabled 'sea mode' if they didn't want that behavior; it's all clearly spelled out in the owner's manual.

cestvrai ,

As a frequent train passenger, I’m not overly concerned.

Seems a bit too weak to derail the train; probably only delay it.

ElPenguin ,

As someone with more than a basic understanding of technology and how self-driving works, I would think the end user would take special care driving in fog, since the car relies on cameras to identify roads and objects. This is clearly user error.

danny801 ,

Lol

Linkin.park-numb.mp3.exe

tb_ ,

This is clearly user error.

When it's been advertised to the user as "full self driving", is it?

Furthermore, can't the car recognize that visibility is low and alert the user and/or refuse to go into self-driving?

Maddier1993 ,

When it's been advertised to the user as "full self driving", is it?

I wouldn't believe an advertisement.

tb_ , (edited )

I wouldn't trust Musk with my life either.

But, presumably, we have moved beyond the age of advertising snake oil and miracle cures; advertisements have to be somewhat factual.

If a user does as advertised and something goes wrong, I do believe it's the advertiser who is liable.

0x0 ,

But, presumably, we have moved beyond the age of advertising snake oil and miracle cures and advertisements have to be somewhat factual.

Keyword: presumably.

tb_ ,

Right. But can you blame the user for trusting the advertisement?

0x0 ,

At the dealership? Kinda, yeah; it's a dealership, and news like this pops up every week.

On the road? I wouldn't trust my life to any self-driving in this day and age.

SaltySalamander ,

I mean, yes. I blame anyone who falls for marketing hype of any kind.

jaybone ,

If the product doesn’t do what it says it does, that’s the product / manufacturers fault. Not the users fault. Wtf lol how is this even a debate.

helpmyusernamewontfi ,

Problem is, most people do. Anybody remember Watch Dogs?

darganon ,

There are many quite loud alerts when FSD is active in subpar circumstances about how it is degraded, and the car will slow down. That video was pretty foggy; I'd say the dude wasn't paying attention.

I came up on a train Sunday evening in the dark, which I hadn't had happen in FSD before, so I decided to just hit the brakes. It saw the crossing arms as blinking stoplights, so it probably wouldn't have stopped?

Either way that dude was definitely not paying attention.

noxy ,

Leaving room for user error in this sort of situation is unacceptable at Tesla's scale and with their engineering talent, as hamstrung as it is by their deranged leadership.

SaltySalamander ,

If you are in the driver's seat, you are 100% responsible for what your car does. If you let it drive itself into a moving train, that's on you.

noxy ,

I cannot fathom how anyone can honestly believe Tesla is entirely faultless in any of this, completely and totally free of any responsibility whatsoever.

I'm not gonna say they're 100% responsible but they are at least 1% responsible.

SaltySalamander ,

If Tesla is at fault for an inattentive driver ignoring the myriad warnings he got to remain attentive when he enabled FSD and allowing the two-ton missile he's sitting in to nearly plow into a train, then Dodge has to be responsible for the Challenger being used to plow into those protesters in Charlottesville.

God fucking damn it, why do you people insist on making me defend fucking Tesla?!

noxy ,

Not defending Tesla is free, you can just immediately enjoy the immense benefits of not defending Tesla.

solrize ,

The good news is that we can finally see the light at the end of the tunnel...

Tronn4 ,

"It's just a freight train coming your waaaaayyyyyyy!" -metalilica

shortwavesurfer ,

Seriously, you just had to throw that pun in there. LOL.

cyberpunk007 ,

"new" concerns lol. There are so many of these articles with self driving cars crashing.

admin , (edited )

Counterpoint: we don't get many articles about human drivers crashing, because we're so used to it. That doesn't make it a good metric for judging their safety.

Edit: Having said that, this wasn't even an article. Just an unsourced headline with a photo. One should strongly consider the possibility of selection bias at work here.
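
Here's a toy simulation of that selection bias, with made-up exposure and reporting probabilities: even with an identical true crash rate per mile, very different odds of making the news produce similar headline counts from wildly different crash totals:

```python
import random

random.seed(42)

# Made-up numbers: equal true risk per mile, very unequal news coverage.
HUMAN_MILES, SD_MILES = 1_000_000, 10_000
CRASH_RATE = 1e-4                        # same true crash risk per mile
P_REPORT_HUMAN, P_REPORT_SD = 0.01, 0.9  # odds a given crash makes the news

def simulate(miles: int, p_report: float) -> tuple[int, int]:
    crashes = sum(random.random() < CRASH_RATE for _ in range(miles))
    reported = sum(random.random() < p_report for _ in range(crashes))
    return crashes, reported

for label, miles, p in [("human", HUMAN_MILES, P_REPORT_HUMAN),
                        ("self-driving", SD_MILES, P_REPORT_SD)]:
    crashes, reported = simulate(miles, p)
    print(f"{label}: {crashes} crashes, {reported} make the news")
# Headline counts look comparable even though human crashes are ~100x more common.
```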

Thorny_Insight ,

80 people die every single day in traffic accidents in the US alone, and we're focusing on the leading company trying to solve this issue when one of their cars almost hits a train.

cyberpunk007 ,

Let's pretend it's 50/50 human-driven cars and self-driving cars. The numbers would be a lot higher. It's not really a fair comparison.

SaltySalamander ,

Unfounded conjecture. You can't spout your feelings as if they're objective fact.

Cornelius_Wangenheim ,

Tesla is not remotely close to being the leading company. That would be Google/Waymo.

Thorny_Insight ,

What makes them the leader? You can't even buy a car from them, and I would be willing to bet that the number of kilometers driven on Autopilot/FSD in Teslas is orders of magnitude greater than the competition's, and rapidly increasing each day. Even the most charitable view would place them on par with Tesla at best. Waymo and Cruise both have remote operators who help when their vehicles get stuck. Even the MB Drive Pilot will ask the driver to take over when needed. They're no more fully functional self-driving vehicles than Teslas are.

piranhaphish ,

I've never hit a train. And I've also never almost hit a train. I think I could go my entire life never almost hitting trains, and I would still consider that the bare minimum for a mammal with two eyes and a brain.

SaltySalamander ,

Congratulations. Want a cookie? People drive into trains all the time. You can literally find dozens of videos online showing this very thing.

ohwhatfollyisman ,

looks like the engineers misunderstood what "training mode" was supposed to do.

they would want to improve their track record after this, otherwise the public would just choo them up.

jaybone ,

Usually with ML, you separate your train and test data.
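
For anyone who hasn't seen it, that split looks like this in practice; a minimal sketch with toy data (using scikit-learn):

```python
# Toy data; nothing Tesla-specific here.
from sklearn.model_selection import train_test_split

X = [[m] for m in range(100)]    # toy feature, e.g. miles driven
y = [m % 2 for m in range(100)]  # toy label

# Hold out 20% as test data the model never sees during training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
print(len(X_train), len(X_test))  # 80 20
```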

assassin_aragorn ,

Oh, the engineers probably understood perfectly well what was going on. But they don't have the ability to correct Musk when he's spewing bullshit.

Ethically speaking, though, they're supposed to refuse to sign off on the work and blow the whistle on the issues, so they aren't free of guilt.

Right now at my work we have a gap in our safety analysis with a contractor's product, and we've had to fight the VP to explain why we can't just say "it's their problem, so we won't deal with it" if it's part of our product. One of my colleagues had to go over a head to inform the head of safety that we were having issues. It's still an ongoing fight, but I cannot in good conscience allow the product to be finalized when we know there's a safety issue that needs to be addressed.

Don't get me wrong, it isn't an easy thing to do, and I'm really grateful that my coworker is very steadfast on this. But engineers aren't supposed to approve of any work they know is unsafe.

Rentlar ,

Speculation here, but I wonder if the ditch lights and a general light placement different from a normal car's confused the self-driving module into thinking it was on the wrong side of the road…?

https://www.railpictures.net/images/d2/2/6/4/7264.1457189772.jpg

•w•

Snowpix ,

The locomotive had long passed in the video of the incident; the car just kept driving towards the train until it swerved and nearly took out a crossing signal. Funny as that would be, I doubt the ditch lights were the cause.

Rentlar ,

Hmm, yeah, it seems the oncoming movement from the side may throw it off somehow.

One of the many foreseeable problems with relying solely on cameras and visual processing.

0x0 ,

Not in this case, but it may have been a factor when two drivers killed two motorcyclists.

Cornelius_Wangenheim , (edited )

The dash cam footage is linked in the article. Looks to be mostly foggy conditions and their system completely ignoring the warning lights.

You999 ,

I work for a railroad and also own a Tesla. FSD doesn't actually know what a train is at all. If you watch the visualization while in FSD, you'll see trains rendered as a long string of semi trucks, and it sees the crossing arms as flashing red stoplights (i.e., treat like a stop sign).
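
Here's a hypothetical sketch (not Tesla's actual logic) of why "flashing red light means stop-sign behavior" is the wrong rule at a crossing: a stop sign means stop and then proceed, while an active crossing means hold until the train has passed:

```python
# Hypothetical policy sketch; the labels and rules are invented for illustration.
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str             # e.g. "flashing_red" or "stop_sign" (invented labels)
    crossing_active: bool

def naive_policy(d: Detection) -> str:
    # Stop-sign semantics: stop, then proceed when the way looks clear.
    return "stop_then_proceed"

def safer_policy(d: Detection) -> str:
    # Treat an active rail crossing differently: hold until the hazard clears.
    if d.kind == "flashing_red" and d.crossing_active:
        return "hold_until_clear"
    return "stop_then_proceed"

crossing = Detection(kind="flashing_red", crossing_active=True)
print(naive_policy(crossing))   # stop_then_proceed  <- the dangerous default
print(safer_policy(crossing))   # hold_until_clear
```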

Rentlar ,

trains as a long string of semi-trucks

So many long distance delivery trucks take the same route across the country. Why don't we just string them all together, then have one big-ass truck engine in the front pulling it all? And to save on how big the motor needs to be, we'll have steel on steel contact to reduce friction. Whoops, you've got a train all of a sudden. 😅

flashing red lights treated like a stop sign

Seems kind of dangerous to treat as a rule. Not just at railroad crossings; a stopped car with its hazards on or a firetruck might confuse the self-driving module…

You999 ,

So many long distance delivery trucks take the same route across the country. Why don't we just string them all together, then have one big-ass truck engine in the front pulling it all? And to save on how big the motor needs to be, we'll have steel on steel contact to reduce friction. Whoops, you've got a train all of a sudden. 😅

Get out of here with your crazy ideas...

https://sh.itjust.works/pictrs/image/c5dd8677-ae7a-41df-b319-ba3a557da9b2.jpeg
