In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning, with slow-motion clips showing the car not only crashing through the styrofoam wall but also plowing through a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

  • comfy@lemmy.ml · +72 · 11 hours ago

    I hope some of you actually skimmed the article and got to the “disengaging” part.

    As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

    It’s a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

  • madcaesar@lemmy.world · +121/-1 · 17 hours ago

    My 500$ robot vacuum has LiDAR, meanwhile these 50k pieces of shit don’t 😂

    • Animal@lemmy.world · +24 · 12 hours ago

      Holy shit, I knew I’d heard this word before. My Chinese robot vacuum cleaner has more technology than a tesla hahahahaha

    • rumba@lemmy.zip · +8/-49 · 16 hours ago

      Vacuum doesn’t run outdoors and accidentally running into a wall doesn’t generate lawsuits.

      But, yes, any self-driving cars should absolutely be required to have lidar. I don’t think you could find any professional in the field that would argue that lidar is the proper tool for this.

      • rmuk@feddit.uk · +49/-1 · edited · 16 hours ago

        …what is your point here, exactly? The stakes might be lower for a vacuum cleaner, sure, but lidar - or a similar time-of-flight system - is the only consistent way of mapping environmental geometry. It doesn’t matter if that’s a dining room full of tables and chairs, or a pedestrian crossing full of children.

        • rumba@lemmy.zip · +2/-50 · edited · 16 hours ago

          I think you’re suffering from not knowing what you don’t know.

          Let me make it a bit clearer so you can give a fair answer.

          Take a 0.25 mW lidar sensor off a vacuum, take it outdoors, and scan an intersection.

          Will that laser be visible to the sensor?

          Is it spinning fast enough to track a kid moving into an intersection when you’re traveling at 73 feet per second?
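The spin-rate question above can be put in rough numbers. A sketch with hypothetical figures (scan rates are illustrative assumptions, not specs of any particular sensor; robot-vacuum lidars commonly spin at a few Hz, automotive units faster):

```python
# How far does a car travel during one full revolution of a spinning lidar?
# Hypothetical scan rates: ~5 Hz for a vacuum-grade unit, ~20 Hz automotive.

def meters_per_scan(speed_mps: float, scan_hz: float) -> float:
    """Distance the vehicle covers while the lidar completes one sweep."""
    return speed_mps / scan_hz

car_speed = 73 * 0.3048  # the comment's 73 ft/s, converted to m/s (~22.3 m/s)
print(meters_per_scan(car_speed, scan_hz=5))   # vacuum-grade: ~4.5 m between sweeps
print(meters_per_scan(car_speed, scan_hz=20))  # automotive-grade: ~1.1 m between sweeps
```

At vacuum-grade scan rates the car moves several meters between sweeps, which is the gap the commenter is gesturing at.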

          • Forbo@lemmy.ml · +53 · edited · 15 hours ago

            You’re mischaracterizing their point. Nobody is saying take the exact piece of equipment, put it in the vehicle and PRESTO. That’d be like asking why the vacuum battery can’t power the car. Because duh.

            The point is if such a novelty, inconsequential item that doesn’t have any kind of life safety requirements can employ a class of technology that would prevent adverse effects, why the fuck doesn’t the vehicle? This is a design flaw of Teslas, pure and simple.

            • rumba@lemmy.zip · +3/-23 · 15 hours ago

              But they do, there are literally cars out there with lidar sensors.

              The question was why can’t I have a lidar sensor on my car if my $150 vacuum has one. The lidar sensor for a car is more than $150.

              You don’t have one because they’re expensive at that size and update frequency. Sensors capable of outdoor mapping at high speed cost the price of a small car.

              The manufacturers suspect, probably rightfully so, that people don’t want to pay an extra 10–30 grand for an array of sensors.

              The technology readily exists; Rober had one in his video that he used to scan a roller coaster. It’s not some conspiracy that you don’t have it on cars, and it’s not like it can’t be done, because Waymo does it all the time.

              There’s a reason Waymo doesn’t use smaller sensors: they use the minimum of what works well. That’s expensive, people looking at a mid-range car don’t want to take on the extra cost, and hence it’s not available.

              • Echo Dot@feddit.uk · +4 · 2 hours ago

                Good God it’s like you’re going out of the way to intentionally misunderstand the point.

                Nobody is saying that the lidar on a car should cost the same as the lidar on a vacuum cleaner. What everyone is saying is that if a company that makes vacuum cleaners thinks it’s important enough to put lidar on, then surely a company that makes cars should think it’s important enough to put lidar on too.

                Stop being deliberately dense.

              • Lemmyoutofhere@lemmy.ca · +25 · edited · 13 hours ago

                Only Tesla does not use radar in its control systems. Every single other manufacturer mixes radar with the camera system. The Tesla system is garbage.

                • rumba@lemmy.zip · +4 · 11 hours ago

                  Yeah, you’d think they’d at least use radar. That’s cheap AF. It’s like someone there said, “I have this hill to die on; I bet we can do it all with cameras.”

          • rmuk@feddit.uk · +33/-5 · edited · 15 hours ago

            I think you’re suffering from not knowing what you don’t know.

            and I think you’re suffering from being an arrogant sack of dicks who doesn’t like being called out on their poor communication skills and, through either a lack of self-awareness or an unwarranted overabundance of self-confidence, projects their own flaws onto others. But for the more receptive types who want to learn more, here’s Syed Saad ul Hassan’s well-written 2022 paper on practical applications, titled Lidar Sensor in Autonomous Vehicles, which I found also serves as a neat primer on lidar in general.

            • racemaniac@lemmy.dbzer0.com · +2/-3 · 5 hours ago

              Wow, what’s with all the hostility against him.

              Maybe it’s because I also know a bit about lidar that his comment was clear to me (“ha, try putting a vacuum lidar in a car and see if it can do anything useful outside at the speeds & range a car needs”).

              Is it that much of an issue if someone is a bit snarky when pointing out the false equivalence of “my 500$ vacuum has a lidar, but a tesla doesn’t? harharhar”.

              • rmuk@feddit.uk · +1 · edited · 28 minutes ago

                But, yes, any self-driving cars should absolutely be required to have lidar.

                So they think self-driving cars should have lidar, like a vacuum cleaner. They agree, and think it’s a good idea, right?

                I don’t think you could find any professional in the field that would argue that lidar is the proper tool for this.

                …then in the next sentence goes on to say that lidar is not the correct tool. In the space of a paragraph they make two points which directly contradict one another. Hence my response:

                What is your point here, exactly?

                They could have said “oops, typo!” or something but, no, instead they went full on-condescending:

                I think you’re suffering from not knowing what you don’t know.

                I stand by my response:

                arrogant sack of dicks

                And while I’m not naive enough to believe that upvotes and downvotes are any kind of arbiter of objective truth, they at least seem to suggest, in this case, that my interpretation is broadly in line with the majority.

            • rumba@lemmy.zip · +3/-38 · 15 hours ago

              Well, look at you, being an adult and using big words instead of just insulting people. I’m not even going to waste time on people like you. I’m going to block you and move on, and hope that everyone else does the same so you can sit in your own quiet little world wondering why no one likes you.

  • buddascrayon@lemmy.world · +41 · 15 hours ago

    It’s a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

    So, who’s the YouTuber that’s gonna test this out? Since Elmo has pushed his way into the government in order to quash any investigation into it.

    • bay400@thelemmy.club · +10 · 11 hours ago

      It basically already happened in the Mark Rober video, it turns off by itself less than a second before hitting

  • /home/pineapplelover@lemm.ee · +29/-5 · 14 hours ago

    To be fair, if you were to construct a wall and paint it exactly like the road, people would run into it as well. That being said, Tesla shouldn’t rely on cameras.

    • Echo Dot@feddit.uk · +2 · 35 minutes ago

      Watch the video: it’s extremely obvious to a human driver that there is something wrong with the view ahead. It’s even pointed out in the video that humans use additional visual cues when a situation is ambiguous.

      The cars don’t have deduction and reasoning capabilities, so they need additional sensors to give them more information to compensate for their lack of brains. So it’s not really sensible to compare self-driving systems to humans: humans have limited sensory input but compensate with reasoning abilities, while self-driving cars lack reasoning abilities but compensate with enhanced sensory input.

    • utopiah@lemmy.world · +5 · edited · 6 hours ago

      I’d take that bet. I imagine at least some drivers would notice something sus’ (due to depth perception, which should be striking as you get close, or lack of ANY movement or some kind of reflection) and either

      • slow down
      • use a trick, e.g. flicking lights or driving a bit to the sides and back, to try to see what’s off

      or probably both. But anyway, as others already said, it’s being compared to other autopilot systems, not human drivers.

    • FuglyDuck@lemmy.world · +21 · edited · 10 hours ago

      To be fair, if you were to construct a wall and paint it exactly like the road, people will run into it as well.

      this isn’t being fair. It’s being compared to the other, better autopilot systems that use both LIDAR and radar in addition to daylight and infrared optical sensors to sense the world around them.

      Teslas only use daylight and infrared. LIDAR and radar systems both would not have been deceived.

    • comfy@lemmy.ml · +11 · edited · 10 hours ago

      The video does bring up human ability too with the fog test (“Optically, with my own eyes, I can no longer see there’s a kid through this fog. The lidar has no issue.”) But, as they show, this wall is extremely obvious to the driver.

        • T156@lemmy.world · +3 · 8 hours ago

          They already have enough trouble with trucks carrying traffic lights, or with speed limit signs printed on them.

            • Echo Dot@feddit.uk · +1 · 33 minutes ago

              I have seen trucks with landscape scenes painted on the side and I’ve never crashed into one of those thinking that it was a portal to a random sunlit field.

    • TorJansen@sh.itjust.works · +3 · 12 hours ago

      Yeah, the Roadrunner could easily skip by such barriers, frustrating the Coyote to no end. Tesla is not a Roadrunner.

  • Banana@sh.itjust.works · +35 · 17 hours ago

    And the president is driving one of these?

    Maybe we should be purchasing lots of paint and cement blockades…

    • LeninOnAPrayer@lemm.ee · +16 · edited · 15 hours ago

      When he was in the Tesla asking if he should go for a ride I was screaming “Yes! Yes Mr. President! Please! Elon, show him full self driving on the interstate! Show him full self driving mode!”

    • Chewget@lemm.ee · +18 · 17 hours ago

      The president can’t drive by law unless on the grounds of the White House and maybe Camp David. At least while in office. They might be allowed to drive after leaving office…

      • FuglyDuck@lemmy.world · +3 · edited · 9 hours ago

        This isn’t true at all. I can’t tell if you’re being serious or incredibly sarcastic, though.

        The reason presidents (and generally ex presidents, too) don’t drive themselves is because the kind of driving to escape an assassination attempt is a higher level of driving and training than what the vast majority of people ever have. There’s no law saying presidents are forbidden from driving.

        In any case, I would be perfectly happy if they let him drive a CT and it caught fire. I’d do a little jig, and I wouldn’t care who sees it.

          • FuglyDuck@lemmy.world · +3 · 9 hours ago

            you’re gonna have to drop a source for that.

            because, no, they’re not. the Secret Service provides a driver specially trained for the risks a president might face, and very strongly insists, but they’re not “prohibited” from driving simply because they’re presidents.

            to be clear, the secret service cannot prohibit the president from doing anything they really want to do. Even if it’s totally stupid for them to do that. (This includes, for example, Trump’s routine weekend round of golf at Turd-o-Lardo)

            • alcoholicorn@lemmy.ml · +2 · edited · 3 hours ago

              to be clear, the secret service cannot prohibit the president from doing anything they really want to do

              Was Trump lying when he said the SS wouldn’t take him back to the Capitol on Jan 6?

              I could definitely see him lying about that so he doesn’t look like he abandoned his supporters during the coup, but I could also see the driver being like “I can’t endanger you, Mr. President” and ignoring his requests.

              • FuglyDuck@lemmy.world · +2 · 2 hours ago

                Was Trump lying when he said the SS wouldn’t take him back to the Capitol on Jan 6?

                Definitely not. There is no way in hell the secret service would have taken the president to that shit show. Doesn’t mean that they would have physically arrested him if he insisted going on his own, however.

      • rumba@lemmy.zip · +3 · 16 hours ago

        The real question is, in a truly self-driving car, (not a tesla) are you actually driving?

  • MochiGoesMeow@lemmy.zip · +66/-7 · 23 hours ago

    If you get any strong emotions on material shit when someone makes a video…you have 0 of my respect. Period.

    • doomcanoe@sh.itjust.works · +19/-2 · edited · 17 hours ago

      Saw a guy smash a Stradivarius on video once. definitely had strong emotions on that one.

      Really torn up about not having your respect tho…

      • njm1314@lemmy.world · +8 · 16 hours ago

        I think you could argue that that’s not just material stuff though. That’s historical and significant culturally.

    • Tilgare@lemmy.world · +6 · 16 hours ago

      I have no clue what you’re trying to say, but the amount of outrage I suddenly saw explode on Twitter a day or two later was mind-boggling. Couldn’t tell if it was bots or morons, but either way, people are big mad about the video.

  • FuglyDuck@lemmy.world · +274/-1 · 1 day ago

    As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

    This has been known.

    They do it so they can evade liability for the crash.

    • fibojoly@sh.itjust.works · +29/-2 · edited · 22 hours ago

      That makes so little sense… It detects it’s about to crash, then gives up and lets you sort it out?
      That’s the opposite of my Audi, which detects when I’m about to hit something and either gives me a warning or just actively hits the brakes if I don’t have time to react.
      If this is true, it’s so fucking evil it’s kinda amazing it could have reached anywhere near prod.

      • Red_October@lemmy.world · +23 · 19 hours ago

        The point is that they can say “Autopilot wasn’t active during the crash.” They can leave out that autopilot was active right up until the moment before, or that autopilot directly contributed to it. They’re just purely leaning into the technical truth that it wasn’t on during the crash. Whether it’s a courtroom defense or their own next published set of data, “Autopilot was not active during any recorded Tesla crashes.”

      • FuglyDuck@lemmy.world · +2 · edited · 13 hours ago

        Even your Audi is going to dump to human control if it can’t figure out what the appropriate response is. Granted, your Audi is probably smart enough to be like “yeah, don’t hit the fucking wall,” but eh… it was put together by people who actually know what they’re doing, and care about safety.

        Tesla isn’t doing this for safety or because it’s the best response. The cars are doing this because they don’t want to pay out for wrongful death lawsuits.

        If this is true, this is so fucking evil it’s kinda amazing it could have reached anywhere near prod.

        It’s musk. he’s fucking vile, and this isn’t even close to the worst thing he’s doing. or has done.

    • NotMyOldRedditName@lemmy.world · +12 · edited · 13 hours ago

      Any crash within 10 seconds of a disengagement counts as the system being on, so you can’t just do this.

      Edit: added the time unit.

      Edit2: it’s actually 30s not 10s. See below.

      • FuglyDuck@lemmy.world · +3 · 14 hours ago

        Where are you seeing that?

        There’s nothing I’m seeing as a matter of law or regulation.

        In any case, liability (especially civil liability) is an absolute bitch. It’s incredibly messy and likely will never be so cut and dried.

        • NotMyOldRedditName@lemmy.world · +4 · edited · 13 hours ago

          Well it’s not that it was a crash caused by a level 2 system, but that they’ll investigate it.

          So you can’t hide the crash by disengaging it just before.

          Looks like it’s actually 30 seconds, not 10. Or maybe it was 10 once upon a time and they changed it to 30?

          The General Order requires that reporting entities file incident reports for crashes involving ADS-equipped vehicles that occur on publicly accessible roads in the United States and its territories. Crashes involving an ADS-equipped vehicle are reportable if the ADS was in use at any time within 30 seconds of the crash and the crash resulted in property damage or injury

          https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf
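The quoted reporting rule boils down to a time-window check. A simplified sketch (the function and parameter names are mine, and the real General Order has more conditions than this):

```python
# Simplified model of the NHTSA Standing General Order test quoted above:
# a crash is reportable if the ADS/Level 2 system was in use at any time
# within 30 seconds of the crash and there was property damage or injury.

REPORTING_WINDOW_S = 30.0

def is_reportable(disengage_t: float, crash_t: float, harm: bool) -> bool:
    """True if the system was active within the window and harm occurred."""
    return harm and (crash_t - disengage_t) <= REPORTING_WINDOW_S

# Disengaging a fraction of a second before impact doesn't dodge the rule:
print(is_reportable(disengage_t=99.9, crash_t=100.0, harm=True))  # True
```

This is why a last-moment disengagement still counts as an ADS-involved crash for reporting purposes.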

          • FuglyDuck@lemmy.world · +3 · 13 hours ago

            Thanks for that.

            The thing is, though, the NHTSA generally doesn’t make a determination on criminal or civil liability. They’ll make a report about what happened, keep it to the facts, and let the courts sort out who’s at fault. They might not even investigate a crash unless it comes to that. It’s just saying “when your car crashes, you need to tell us about it,” and they kinda assume companies comply.

            Which Tesla doesn’t want to comply with, and it’s one of the reasons Musk/DOGE is going after them.

            • NotMyOldRedditName@lemmy.world · +3 · edited · 12 hours ago

              I knew they wouldn’t necessarily investigate it, that’s always their discretion, but I had no idea there was no actual bite to the rule if they didn’t comply. That’s stupid.

      • FuglyDuck@lemmy.world · +2 · 13 hours ago

        So, as others have said, it takes time to brake. But also, generally speaking autonomous cars are programmed to dump control back to the human if there’s a situation it can’t see an ‘appropriate’ response to.

        What’s happening here is the “oh shit, there’s no action that can stop the crash” case, because braking takes time (hell, even coming to that decision takes time, and activating the whoseitwhatsits that activate the brakes takes time). The normal thought is, if there’s something it can’t figure out on its own, it’s best to let the human take over. It’s supposed to make that decision well beforehand, though.

        However, as for why tesla is doing that when there’s not enough time to actually take control?

        It’s because liability is a bitch. Given how many teslas are on the road, even a single ruling of “yup it was tesla’s fault” is going to start creating precedent, and that gets very expensive, very fast. especially for something that can’t really be fixed.

        for some technical perspective, I pulled up the frame rates on the camera system (I’m not seeing frame rate on the cabin camera specifically, but it seems to either be 36 in older models or 24 in newer.)

        14 frames @ 24 fps is about 0.6 seconds; @ 36 fps, it’s about 0.4 seconds. For comparison, the average human reaction time to just see a change and click a mouse is about 0.3 seconds. If you add in needing to assess the situation… that’s going to take significantly more time.
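The frame arithmetic in that last paragraph can be spelled out directly (a quick sketch using the commenter's frame counts and rates):

```python
# How long is a run of video frames at a given frame rate?

def frames_to_seconds(frames: int, fps: int) -> float:
    """Duration covered by `frames` consecutive frames at `fps` frames/second."""
    return frames / fps

print(round(frames_to_seconds(14, 24), 2))  # 0.58, i.e. "about 0.6 seconds"
print(round(frames_to_seconds(14, 36), 2))  # 0.39, i.e. "about 0.4 seconds"
```

Either way, the handover window is at or below a bare see-and-click reaction time, before any situation assessment.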

      • NotMyOldRedditName@lemmy.world · +2 · edited · 13 hours ago

        AEB was originally designed not to prevent a crash, but to slow the car when an unavoidable crash was detected.

        It’s since gotten better and can also prevent crashes now, but slowing the speed of the crash was the original important piece. It’s a lot easier to predict an unavoidable crash, than to detect a potential crash and stop in time.

        Insurance companies offer a discount for having any type of AEB as even just slowing will reduce damages and their cost out of pocket.
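The "even just slowing reduces damages" point follows from kinetic energy scaling with the square of speed. A sketch (the speeds are illustrative numbers, not data from any real AEB system):

```python
# Impact energy scales with v^2, so shaving speed before impact pays off
# disproportionately.

def energy_ratio(v_impact: float, v_initial: float) -> float:
    """Fraction of the original kinetic energy remaining at impact."""
    return (v_impact / v_initial) ** 2

# Hypothetical: AEB cuts a 50 mph approach to 35 mph before contact.
print(round(energy_ratio(35, 50), 2))  # 0.49, i.e. about half the crash energy
```

Cutting speed by 30% roughly halves the energy that has to go somewhere in the crash, which is why insurers discount for any AEB at all.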

        Not all AEB systems are created equal though.

        Maybe disengaging AP if an unavoidable crash is detected triggers the AEB system? Like maybe for AEB to take over which should always be running, AP has to be off?

      • GoodLuckToFriends@lemmy.today · +7 · 18 hours ago

        Because even braking can’t avoid the crash. Unavoidable crash means bad juju if the ‘self driving’ car image is meant to stick around.

      • Trainguyrom@reddthat.com · +4 · 16 hours ago

        Brakes require a sufficient stopping distance given the current speed, driving surface conditions, tire condition, and the amount of momentum at play. This is why trains can’t stop quickly despite having brakes (and very good ones at that, with air brakes on every wheel): there’s so much momentum at play.

        If Autopilot is being criticized for disengaging immediately before the crash, it’s pretty safe to assume it’s too late to stop the vehicle and avoid the collision.
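For a sense of scale on the stopping-distance point, a constant-deceleration sketch (the friction coefficients are illustrative assumptions, and real braking adds reaction and actuation delays on top):

```python
# Minimum braking distance under constant deceleration: d = v^2 / (2 * mu * g).
# This ignores reaction time and brake actuation lag, so real distances are longer.

G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_m(speed_mps: float, mu: float) -> float:
    """Idealized braking distance on a surface with friction coefficient mu."""
    return speed_mps ** 2 / (2 * mu * G)

v = 22.3  # ~50 mph in m/s
print(round(stopping_distance_m(v, mu=0.9), 1))  # dry asphalt (assumed mu): ~28.2 m
print(round(stopping_distance_m(v, mu=0.4), 1))  # wet/poor surface (assumed mu): ~63.4 m
```

Even under ideal assumptions, highway speeds need tens of meters to stop, so a sub-second disengagement before impact leaves no physically possible out.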

        • filcuk@lemmy.zip · +8 · 16 hours ago

          This autopilot shit needs a regulated audit log in a black box, like what planes and ships have.
          In no way should this kind of manipulation be legal.

    • bazzzzzzz@lemm.ee · +45/-2 · 1 day ago

      Not sure how that helps in evading liability.

      Every Tesla driver would need superhuman reaction speeds to respond in 17 frames, i.e. 680 ms (I didn’t check the recording framerate, but 25 fps is the slowest reasonable), less than a second.

      • orcrist@lemm.ee · +53/-3 · 1 day ago

        They’re talking about avoiding legal liability, not about actually doing the right thing. And of course you can see how it would help them avoid legal liability. The lawyers will walk into court and honestly say that at the time of the accident the human driver was in control of the vehicle.

        And then that creates a discussion about how much time the human driver has to have in order to actually solve the problem, or gray areas about who exactly controls what and when, and it complicates the situation enough that maybe Tesla can pay less money for the deaths they are obviously responsible for.

        • jimbolauski@lemm.ee · +10/-4 · 23 hours ago

          They’re talking about avoiding legal liability, not about actually doing the right thing. And of course you can see how it would help them avoid legal liability. The lawyers will walk into court and honestly say that at the time of the accident the human driver was in control of the vehicle.

          The plaintiff’s lawyers would say the autopilot was engaged, made the decision to run into the wall, and turned off 0.1 seconds before impact. Liability is not going to disappear when there were 4.9 seconds of making dangerous decisions and peacing out in the last 0.1.

          • FuglyDuck@lemmy.world · +6 · 13 hours ago

            The plaintiff’s lawyers would say the autopilot was engaged, made the decision to run into the wall, and turned off 0.1 seconds before impact. Liability is not going to disappear when there were 4.9 seconds of making dangerous decisions and peacing out in the last 0.1.

            these strategies aren’t about actually winning the argument, it’s about making it excessively expensive to have the argument in the first place. Every motion requires a response by the counterparty, which requires billable time from the counterparty’s lawyers, and delays the trial. it’s just another variation on “defend, depose, deny”.

          • michaelmrose@lemmy.world · +15 · 19 hours ago

            They can also claim with a straight face, in public, in ads, etc., that Autopilot has a crash rate that has been artificially lowered, without it technically being a lie.

          • FauxLiving@lemmy.world · +11 · 21 hours ago

            Defense lawyers can make a lot of hay with details like that. Nothing that gets the lawsuit dismissed but turning the question into “how much is each party responsible” when it was previously “Tesla drove me into a wall” can help reduce settlement amounts (as these things rarely go to trial).

      • FuglyDuck@lemmy.world · +62/-1 · 1 day ago

        It’s not likely to work, but them swapping to human control after it determined a crash is going to happen isn’t accidental.

        Anything they can do to mire the proceedings they will do. It’s like how corporations file stupid junk motions to force plaintiffs to give up.

    • Simulation6@sopuli.xyz · +15/-1 · 1 day ago

      If the disengage-to-avoid-legal-consequences feature does exist, then you would think there would be some false-positive incidents where it turns off for no apparent reason. I found some with a search, which are attributed to bad software. Owners are discussing new patches fixing some problems and introducing new ones. None of the incidents caused an accident, so maybe the owners never hit the malicious code.

      • FuglyDuck@lemmy.world
        link
        fedilink
        English
        arrow-up
        4
        ·
        13 hours ago

If it randomly turns off for no apparent reason, people are going to be like “oh, that’s weird” and leave it at that. Tesla certainly isn’t going to admit that their code is malicious like that. At least not until the FBI is digging through their memos to show it was. And maybe not even then.

      • Dultas@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        ·
        20 hours ago

        I think Mark (who made the OG video) speculated it might be the ultrasonic parking sensors detecting something and disengaging.

    • thistleboy@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      9
      ·
      18 hours ago

I believe the outrage is that the video showed that Autopilot was off when they crashed into the wall. That’s what the red circle in the thumbnail is highlighting. The whole thing was apparently a setup for views, like Top Gear faking the Model S breaking down.

      • octopus_ink@slrpnk.net
        link
        fedilink
        English
        arrow-up
        1
        ·
        3 minutes ago

        In addition to the folks pointing out it likes to shut itself off (which I can neither confirm nor deny)

        I am not going to click a link to X, but this article covers that, and links this raw footage video on X which supposedly proves this claim to be false.

        https://www.pcmag.com/news/tesla-on-autopilot-runs-over-mannequin-hits-wall-in-viral-video-but-is

Some skeptical viewers claim Autopilot was not engaged when the vehicle ran into the wall. These allegations prompted Rober to release the “raw footage” in an X post, which shows the characteristic signs of Autopilot being engaged, such as a rainbow road appearing on the dash.

        https://twitter.com/MarkRober/status/1901449395327094898

      • PraiseTheSoup@lemm.ee
        link
        fedilink
        English
        arrow-up
        20
        arrow-down
        1
        ·
        18 hours ago

        Autopilot shuts itself off just before a crash so Tesla can deny liability. It’s been observed in many real-world accidents before this. Others have said much the same, with sources, in this very thread.
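The pattern described here can be sketched in a few lines. This is a purely illustrative example: the log format, field names, and one-second window are invented for the sketch and are not Tesla’s actual telemetry; the idea is that NHTSA-style reporting rules still attribute a crash to driver assistance if the system was engaged in the final moments before impact.

```python
# Hypothetical sketch: flag crashes where driver assistance
# disengaged less than `window_s` seconds before impact.
# All field names and timestamps are invented for illustration.

def flag_last_second_disengagements(crashes, window_s=1.0):
    """Return crashes where assistance shut off within `window_s`
    seconds before impact, i.e. the disengagement happened too late
    for the human driver to plausibly have taken over."""
    flagged = []
    for crash in crashes:
        dt = crash["impact_time"] - crash["disengage_time"]
        if 0 <= dt < window_s:
            flagged.append(crash)
    return flagged

logs = [
    {"id": "A", "disengage_time": 99.3, "impact_time": 100.0},  # 0.7 s before impact
    {"id": "B", "disengage_time": 80.0, "impact_time": 100.0},  # driver had 20 s of control
]
print([c["id"] for c in flag_last_second_disengagements(logs)])  # ['A']
```

The point of the window check is the same one regulators make: a shutoff a fraction of a second before impact is functionally indistinguishable from the system being engaged at the moment of the crash.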

        • melpomenesclevage@lemmy.dbzer0.com
          link
          fedilink
          English
          arrow-up
          7
          arrow-down
          1
          ·
          18 hours ago

          well yes but as long as there’s deniability built into my toy, then YOU’RE JUST A BIG DUMB MEANIE-PANTS WHO HATES MY COOL TOYS BECAUSE YOU DON’T HAVE ONE because there’s no other possible reason to hate a toy this cool.

  • TommySoda@lemmy.world
    link
    fedilink
    English
    arrow-up
    414
    arrow-down
    2
    ·
    2 days ago

    Notice how they’re mad at the video and not the car, manufacturer, or the CEO. It’s a huge safety issue yet they’d rather defend a brand that obviously doesn’t even care about their safety. Like, nobody is gonna give you a medal for being loyal to a brand.

    • merc@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      3
      ·
      9 hours ago

      To be fair, and ugh, I hate to have to stand up for these assholes, but…

Their claim is that the video was a lie and that the results were manufactured. They believe that Teslas are actually safe and that Rober was doing some kind of Elon Musk takedown, trying to profit off the shares getting tanked and to promote a rival company.

      They actually do have a little bit of evidence for those claims:

      1. The wall changes between different camera angles. In some angles the wall is simply something painted on canvas. In other angles it’s a solid styrofoam wall.
2. The inside-the-car view in the YouTube video doesn’t make it clear that Autopilot mode is engaged.
      3. Mark Rober chose to use Autopilot mode rather than so-called Full Self Driving.

      But, he was interviewed about this, and he provided additional footage to clear up what happened.

1. They did the experiment twice, once with a canvas wall, then a few weeks later with a styrofoam wall. The car smashed right into the wall the first time, but it wasn’t very dramatic because the canvas just blew out of the way. They wanted a more dramatic video for YouTube, so they did it again with a styrofoam wall so you could see the wall getting smashed. This included pre-weakening the wall so that when the car hit it, it smashed a dramatic Looney Tunes-looking hole in the wall. When they made the final video, they included various cuts from both the first and second attempts. The car hit the wall both times, but it wasn’t just one single hit like it was shown in the video.

      2. There’s apparently a “rainbow” path shown when the car is in Autopilot mode. [RAinbows1?!? DEI!?!?!?!] In the cut they posted to YouTube, you couldn’t see this rainbow path. But, Rober posted a longer cut of the car hitting the wall where it was visible. So, it wasn’t that autopilot was off, but in the original YouTube video you couldn’t tell.

3. He used Autopilot mode because, from his understanding (as a Tesla owner (this was his personal vehicle being tested)), Full Self-Driving requires you to enter a destination address. He just wanted to drive down a closed highway at high speed, so he used Autopilot instead. In his understanding as a Tesla owner and engineer, there would be no difference in how the car dealt with obstacles in Autopilot mode vs. Full Self-Driving, but he admitted that he hadn’t tested it, so it’s possible that so-called Full Self-Driving would have handled things differently.

Anyhow, these rabid MAGA Elon fanboys did pick up on some minor inconsistencies in his original video. Rober apparently didn’t realize what a firestorm he was wading into. His intention was to make a video about how cool LIDAR is, but with a cool scene of a car smashing through a wall as the hook. He’d apparently been planning and filming the video for half a year, and he claims it just happened to get released right at the height of the period when Teslas were getting firebombed.

      • booly@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        4
        ·
        17 hours ago

        Kinda depends on the fact, right? Plenty of factual things piss me off, but I’d argue I’m correct to be pissed off about them.

        • OnASnowyEvening@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          11 hours ago

          Right. Just because sometimes we have to accept something, doesn’t mean we have to like it.

(Though the other commenter implied people are commonly or always angered by facts, and then we’d have nothing to talk about.)

    • can@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      161
      arrow-down
      2
      ·
      2 days ago

      These people haven’t found any individual self identity.

An attack on the brand is an attack on them. Reminds me of the people who made Star Wars their meaning and crumbled when a certain trilogy didn’t hold up.

      • Billiam@lemmy.world
        link
        fedilink
        English
        arrow-up
        89
        arrow-down
        1
        ·
        1 day ago

        An attack on the brand is an attack on them.

        Thus it ever is with Conservatives. They make $whatever their whole identity, and so take any critique of $whatever as a personal attack against themselves.

        I blame evangelical religions’ need for martyrdom for this.

        • mojofrododojo@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          7 hours ago

          “Mark my word, if and when these preachers get control of the [Republican] party, and they’re sure trying to do so, it’s going to be a terrible damn problem. Frankly, these people frighten me. Politics and governing demand compromise. But these Christians believe they are acting in the name of God, so they can’t and won’t compromise. I know, I’ve tried to deal with them.” ― Barry Goldwater

        • CarbonatedPastaSauce@lemmy.world
          link
          fedilink
          English
          arrow-up
          24
          ·
          1 day ago

          You pretty much hit the nail on the head. These people have no identity or ability to think for themselves because they never needed either one. The church will do all your thinking for you, and anything it doesn’t cover will be handled by Fox News. Be like everyone else and fit in, otherwise… you have to start thinking for yourself. THE HORROR.

      • stoly@lemmy.world
        link
        fedilink
        English
        arrow-up
        39
        ·
        1 day ago

        The term you are looking for is “external locus of identity”. And, yes.

      • jumperalex@lemmy.world
        link
        fedilink
        English
        arrow-up
        19
        arrow-down
        1
        ·
        1 day ago

        So literally every single above average sports fan?

The pathological need to be part of a group so badly that it overwhelms all reason is a feature I have yet to understand. And I say that as someone who can recognize in myself those moments when I feel the pull to be part of an in-group.

        • mic_check_one_two@lemmy.dbzer0.com
          link
          fedilink
          English
          arrow-up
          2
          ·
          16 hours ago

          That’s just tribalism in general. Humans are tribal by nature as a survival mechanism. In modern culture, that manifests as behaviors like being a rabid sports fan.

        • DogWater@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          ·
          22 hours ago

          It’s evolutionary. Humans are social pack animals. The need for inclusion was evolved into us over however many years.

    • rtxn@lemmy.world
      link
      fedilink
      English
      arrow-up
      48
      ·
      1 day ago

      The styrofoam wall had a pre-cut hole to weaken it, and some people are using it as a gotcha proving the video was faked. It would be funny if it wasn’t so pathetic.

      • TommySoda@lemmy.world
        link
        fedilink
        English
        arrow-up
        51
        ·
        1 day ago

        Yeah, but it’s styrofoam. You could literally run through it. And I’m sure they did that more as a safety measure so that it was guaranteed to collapse so nobody would be injured.

        But at the same time it still drove through a fucking wall. The integrity doesn’t mean shit because it drove through a literal fucking wall.

      • scops@reddthat.com
        link
        fedilink
        English
        arrow-up
        26
        ·
        1 day ago

        For more background, Rober gave an interview and admitted that they ran the test twice. On the first run, the wall was just fabric, which did not tear away in a manner that was visually striking. They went back three weeks later and built a styrofoam wall knowing that the Tesla would fail, and pre-cut the wall to create a more interesting impact.

        • Confused_Emus@lemmy.dbzer0.com
          link
          fedilink
          English
          arrow-up
          21
          ·
          1 day ago

          Particularly disappointing part of that interview was Rober saying he still plans to buy a new Tesla. Safety issues aside, why would anyone want to do that?

          • hovercat@lemmy.blahaj.zone
            link
            fedilink
            English
            arrow-up
            8
            ·
            1 day ago

            Knowing the insanity of die-hard Tesla fans, it’s likely to try and protect himself.

“I love my Tesla, but” has been a meme for years now because if you ever went on forums to get help or complain about what a giant heap of shit the car was, and didn’t bookend it with unabashed praise, you’d have people ripping you to shreds calling you a FUDster and Big Oil shill who’s shorting the stock and trying to destroy the greatest company the world has ever known.

            People have learned over the years that even with the most valid of criticism for the company, the only way to even attempt to have it received is by showing just how much you actually love Tesla and Daddy Elon, and your complaints/criticism are only because you care so much about the company and want them to do better. Yes, it’s fucking stupid and annoying, but sadly this is the reality we’ve created for ourselves.

          • NotMyOldRedditName@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            arrow-down
            3
            ·
            edit-2
            20 hours ago

Because the car actually does stop for things that aren’t fake walls made to look like the road ahead, and it does stop for pedestrians, at least as tested by testing agencies.

This is the Euro NCAP testing.

            https://youtu.be/4Hsb-0v95R4

            Note: not all of these cars have lidar, but some do.

        • rtxn@lemmy.world
          link
          fedilink
          English
          arrow-up
          10
          ·
          edit-2
          1 day ago

          Hopefully with a Mythbusters-style remote control setup in case it explodes. And the trunk filled with ANFO to make sure it does.

      • samus12345@lemm.ee
        link
        fedilink
        English
        arrow-up
        15
        ·
        1 day ago

        Yeah, because he knew that thing probably wasn’t gonna stop. Why destroy the car when you don’t have to? Concrete wouldn’t have changed the outcome.

    • Lost_My_Mind@lemmy.world
      link
      fedilink
      English
      arrow-up
      7
      arrow-down
      1
      ·
      1 day ago

      me waving a little handheld flag on a tiny pole that just says “Brand loyalty”

      …what? No medal???