Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.

Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.

The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!

  • conicalscientist@lemmy.world · 4 hours ago

    Anyone with half a brain could tell you plain cameras are a non-starter. This is nearly a Juicero-level blunder. Tesla is not a serious car company, nor a serious tech company. If markets were rational, this would have been the end of Tesla.

    • LifeInMultipleChoice@lemmy.dbzer0.com · 3 hours ago (edited)

      Austin should just pull the permits until all the taxis have lidar installed and tested. Or write a bill that fines the manufacturer $100 billion for any self-driving car that kills a person, with the proceeds split 50% to the family and 50% to infrastructure. The first rule of robotics was always about not harming humans.

  • happydoors@lemm.ee · 3 hours ago

    I love that one of the largest YouTubers is the one who did this. Surely somebody near our federal government will throw a hissy fit if he hears about this, but Mark’s audience is ginormous.

    • buddascrayon@lemmy.world · 32 minutes ago

      Honestly, I think Mark should be more scared of Disney coming after him for mapping out their Space Mountain ride.

    • KayLeadfoot@fedia.io (OP) · 1 hour ago

      He is studiously apolitical; the only political comment I could find from him was the very sensible advice that we need to tone down our hyperpartisanship :)

      https://x.com/MarkRober/status/1641487680168153089?lang=en

      For me, I criticize any vehicle that is objectively crappy… and some vehicles that I find subjectively crappy… and I hope folks don’t assume I’m doing that because of my political leanings.

  • rational_lib@lemmy.world · 5 hours ago (edited)

    The rain test was far more concerning because it’s a much more realistic scenario. Both a normal person and the lidar would have seen the kid and stopped, but the cameras and image processing just aren’t good enough to make out a person in the rain. That’s bad. The test portrays it as a person in the middle of a straight road, but I don’t see why the same thing wouldn’t happen at a crosswalk or anywhere else pedestrians are often in the path of a vehicle. If an autonomous system cannot reliably make out pedestrians in the rain, that alone should be enough to prevent these vehicles from being legal.
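
    A minimal sketch of the fusion argument (all function names and thresholds here are hypothetical, not any vendor’s actual stack): camera confidence degrades in rain, but a lidar range return mostly doesn’t, so a simple OR over the two sensors still catches the pedestrian.

    ```python
    # Hypothetical fusion rule: brake if EITHER sensor is confident.
    # Every name and threshold is illustrative, not production code.

    def should_brake(camera_confidence: float, lidar_cluster_points: int,
                     range_m: float, closing_speed_mps: float) -> bool:
        camera_sees_pedestrian = camera_confidence >= 0.8  # rain pushes this down
        lidar_sees_object = lidar_cluster_points >= 20     # rain barely affects this
        if not (camera_sees_pedestrian or lidar_sees_object):
            return False
        # Brake when time-to-collision drops under 2 seconds.
        return range_m / max(closing_speed_mps, 0.1) < 2.0

    # Heavy rain: camera confidence collapses to 0.3, but lidar still sees
    # the cluster, so the fused system brakes; camera-only would sail through.
    print(should_brake(0.3, 45, range_m=25.0, closing_speed_mps=17.0))  # True
    ```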

    • LifeInMultipleChoice@lemmy.dbzer0.com · 3 hours ago

      The question there would be whether Austin has crosswalks without traffic lights. Many places put a light at every crosswalk, but not all. Most beach towns don’t have one at every crosswalk; they just have laws that if someone is in or entering the crosswalk, you have to stop for them. (By your reasoning, all of those pedestrians would be at risk.)

      • Tot@lemmy.world · 3 hours ago

        Not every pedestrian follows the lights, though. And not every pedestrian makes it across the road before the light changes from red to green.

  • King3d@lemmy.world · 6 hours ago

    This is like the crash on the San Francisco Bay Bridge, where a Tesla went through a tunnel and, moving from bright daylight into darkness, apparently wasn’t sure what to do: it suddenly changed lanes, then immediately stopped, causing a multi-car pileup.

    • fallingcats@discuss.tchncs.de · 6 hours ago

      You’d think they’d have cameras with higher dynamic range and faster auto-exposure in their cars by now. Nope, still penny-pinching.
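
      For what it’s worth, high dynamic range is off-the-shelf tech. A minimal sketch using OpenCV’s Mertens exposure fusion (file names are placeholders, and this is obviously not Tesla’s actual pipeline):

      ```python
      import cv2

      # Three frames of a tunnel entrance bracketed at different shutter
      # speeds (the file names are illustrative placeholders).
      frames = [cv2.imread(p) for p in ("under.jpg", "mid.jpg", "over.jpg")]

      # Mertens fusion needs no exposure metadata and keeps detail in both
      # the sunlit entrance and the dark interior.
      fused = cv2.createMergeMertens().process(frames)

      # Output is float in [0, 1]; scale back to 8-bit to save it.
      cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
      ```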

      • Mic_Check_One_Two@reddthat.com · 2 hours ago

        Yeah, pulling radar from the cars was the beginning of the end. Early Teslas had radar, and that’s what led to all of the “car sees something three vehicles ahead and brakes to avoid a pileup that hasn’t even started yet” collision-avoidance videos. Pulling radar was a cost-cutting move; then Elon demanded they pull the ultrasonic sensors too, and that’s when their crash numbers skyrocketed.
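
        That “brakes before the pileup even starts” behavior is conceptually just a time-to-collision check over radar tracks, which can include reflections from vehicles beyond the one directly ahead. A toy sketch with invented numbers, not production firmware:

        ```python
        from dataclasses import dataclass

        @dataclass
        class RadarTrack:
            range_m: float            # distance to the tracked object
            closing_speed_mps: float  # positive means we're gaining on it

        def earliest_ttc(tracks: list[RadarTrack]) -> float:
            """Smallest time-to-collision across all tracks, in seconds."""
            ttcs = [t.range_m / t.closing_speed_mps
                    for t in tracks if t.closing_speed_mps > 0]
            return min(ttcs, default=float("inf"))

        # The lead car looks fine (30 m / 0.5 m/s = 60 s TTC), but a radar
        # bounce shows a car two positions ahead braking hard
        # (55 m / 20 m/s = 2.75 s TTC), so the check fires early.
        tracks = [RadarTrack(30.0, 0.5), RadarTrack(55.0, 20.0)]
        if earliest_ttc(tracks) < 3.0:
            print("brake: collision forming ahead")
        ```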

  • rumba@lemmy.zip · 6 hours ago

    Mark Rober is about to be listed as FBI public enemy #1 :(

  • fubarx@lemmy.world · 8 hours ago

    There’s a very simple solution to autonomous vehicles plowing into walls, cars, or people:

    Congress will pass a law that makes NOBODY liable, as long as a human wasn’t involved in the decision-making process during the incident.

    This will be backed by car makers, software providers, and insurance companies, who will lobby hard for it. After all, no SINGLE person or company made the decision to swerve into oncoming traffic. Surely they can’t be held liable. 🤷🏻‍♂️

    Once that happens, Level 4 driving will come standard and will likely be the default mode on most cars. Best of luck, everyone else!

    • hedge_lord@lemmy.world · 5 hours ago

      Kids already have experience playing hopscotch, so we can just have them jump between the roofs of moving cars to cross the street! It will be so much more efficient, and they can pretend they’re action heroes. The ones who survive will make great athletes, too.

    • chilicheeselies@lemmy.world · 7 hours ago

      There is no way insurance companies would go for that. What is far more likely is that policies simply won’t cover accidents due to autonomous systems. I’m honestly surprised they would cover them now.

      • ThePantser@sh.itjust.works · 5 hours ago

        If it’s a feature of the car when you bought it, and the insurance company insured the car, then anything the car does by design must be covered. The only way out for an insurer is to make the insured sign a statement that using the feature voids their policy, the same way they can with rideshare apps if you don’t disclose that you drive for one. They can also refuse to insure the car unless the feature is disabled. I can see insurers in the future demanding features be disabled before they’ll write a policy; they could demand the giant screens go blank, or show only simplified content, while the car is in motion, too.

      • P03 Locke@lemmy.dbzer0.com · 7 hours ago

        > What is far more likely is that policies simply won’t cover accidents due to autonomous systems.

        If the risk is that insurance companies won’t pay for accidents, leaving people on the hook for hundreds of thousands of dollars in medical bills, then people won’t use autonomous systems.

        This cannot go both ways. Either car makers are legally responsible for their AI systems, or insurance companies are legally responsible for paying those damages. Somebody has to foot the bill, and if it’s the general public, they will avoid the risk.

      • rumba@lemmy.zip · 6 hours ago

        Not sure how it plays out for Tesla, but for Waymo, accidents per mile driven are WAY below the human-driven baseline. Insurance companies would LOVE to charge a surcharge for automated-driving coverage while paying out on fewer incidents.
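
        The economics are easy to sketch with made-up numbers: if automation cuts claim frequency to a fifth while premiums barely move, the insurer keeps the difference.

        ```python
        # Back-of-envelope insurer economics. Every number here is invented
        # purely to illustrate the incentive, not real actuarial data.

        miles_per_year = 12_000
        avg_claim_cost = 18_000                # dollars per claim

        human_claims_per_mile = 1 / 250_000
        auto_claims_per_mile = human_claims_per_mile / 5  # "WAY below"

        def expected_payout(claims_per_mile: float) -> float:
            return miles_per_year * claims_per_mile * avg_claim_cost

        print(expected_payout(human_claims_per_mile))  # 864.0  dollars/year
        print(expected_payout(auto_claims_per_mile))   # 172.8  dollars/year
        # Charge even a modest surcharge on a premium priced for human
        # risk, and that gap becomes the insurer's margin.
        ```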

    • Ulrich@feddit.org · 6 hours ago

      > Once that happens, Level 4 driving will come standard

      Uhhhh absolutely not. They would abandon it first.

    • Korhaka@sopuli.xyz · 7 hours ago

      If no one is liable, then it’s tempting to deliberately confuse them into crashing.

  • pjwestin@lemmy.world · 8 hours ago

    To be fair, the Road Runner it was following somehow successfully ran into the painting.