Robot cars can be crashed with tinfoil and painted cardboard ( www.theregister.com )

A team of researchers from prominent universities – including SUNY Buffalo, Iowa State, UNC Charlotte, and Purdue – was able to turn an autonomous vehicle (AV) running the open-source Apollo driving platform from Chinese web giant Baidu into a deadly weapon by tricking its multi-sensor fusion system, and suggests the attack could apply to other self-driving cars.
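To see why "tricking the multi-sensor fusion system" is the key phrase, here's a toy sketch (not Apollo's actual code; the function names, weights, and threshold are all hypothetical) of how a fused detector can stay robust when one sensor is fooled, yet fail when cheap props attack every modality at once – tinfoil scattering the LiDAR return while painted cardboard fools the camera:

```python
# Toy illustration of multi-sensor fusion (MSF) failure.
# NOT Apollo's real pipeline; names/weights/threshold are made up.

def fused_obstacle_confidence(camera_conf: float, lidar_conf: float,
                              camera_weight: float = 0.5,
                              lidar_weight: float = 0.5) -> float:
    """Combine per-sensor detection confidences into one fused score."""
    return camera_weight * camera_conf + lidar_weight * lidar_conf

THRESHOLD = 0.5  # the planner reacts to an obstacle only above this score

# One sensor fooled: LiDAR still sees the object, so the fused
# score stays above threshold and the car brakes as intended.
print(fused_obstacle_confidence(camera_conf=0.2, lidar_conf=0.9))  # 0.55

# Both sensors attacked at once (tinfoil + painted cardboard):
# the fused score drops below threshold and the obstacle "vanishes".
print(fused_obstacle_confidence(camera_conf=0.1, lidar_conf=0.2))  # 0.15
```

The point of the sketch: fusion only adds safety when sensor failures are independent. An attacker who degrades all modalities simultaneously turns the redundancy into a single point of failure.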

MeatPilot,
EvilBit,

https://xkcd.com/1958/

TL;DR: faking out a self-driving system will always be possible, just as faking out human drivers is. But doing so is basically attempted murder, which is why an exploit like this is neither interesting nor new. You could also cut the brake lines or rig a bomb to the car.

ArbitraryValue,

People seem to hold computers to a higher standard than they hold other people performing the same task.

Fedizen,

I think human responses vary too much: could you follow a strategy that reliably makes 50% of human drivers crash? Probably. Could you follow a strategy that reliably makes 100% of autonomous vehicles crash? Almost certainly.

Infynis,

This is the real reason Elon Musk doesn't want people tracking his plane. If we know where he is, Wile E. Coyote could catch up to him and trick his car into crashing into a brick wall by painting a tunnel on it.
