• EurekaStockade@lemmy.world · 1 year ago · +96/−1

    don’t come with a requirement that drivers watch the road

    Seems it’s like every other Mercedes then

  • deafboy@lemmy.world · 1 year ago · +59/−1

    And they managed to do it without us obsessing about their CEO several times a day? I refuse to believe that!

  • cAUzapNEAGLb@lemmy.world · 1 year ago (edited) · +51

    As of April 11, there were 65 Mercedes autonomous vehicles available for sale in California, Fortune has learned through an open records request submitted to the state’s DMV. One of those has since been sold, which marks the first sale of an autonomous Mercedes in California, according to the DMV. Mercedes would not confirm sales numbers. Select Mercedes dealerships in Nevada are also offering the cars with the new technology, known as “level 3” autonomous driving.

    Drivers can activate Mercedes’s technology, called Drive Pilot, when certain conditions are met, including in heavy traffic jams, during the daytime, on specific California and Nevada freeways, and when the car is traveling less than 40 mph. Drivers can focus on other activities until the vehicle alerts them to resume control. The technology does not work on roads that haven’t been pre-approved by Mercedes, including on freeways in other states.
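
For illustration, the activation preconditions just described can be sketched as a simple gate. All the names here are hypothetical, not Mercedes’s actual API; it’s just the published conditions expressed as code:

```python
def drive_pilot_available(speed_mph: float, is_daytime: bool,
                          on_approved_freeway: bool, in_heavy_traffic: bool) -> bool:
    """The Level 3 hand-off is offered only while every published
    precondition holds; otherwise the driver stays responsible."""
    return (
        speed_mph < 40           # traffic-jam speeds only
        and is_daytime
        and on_approved_freeway  # pre-mapped CA/NV freeways
        and in_heavy_traffic
    )
```

The point of a gate like this is that Level 3 shifts responsibility to the car only inside a narrowly fenced operational domain; outside it, the system simply refuses to engage.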

    U.S. customers can buy a yearly subscription to Drive Pilot in 2024 EQS sedans and S-Class models for $2,500.

    Mercedes is also working on developing level 4 capabilities. The automaker’s chief technology officer Markus Schäfer expects that level 4 autonomous technology will be available to consumers by 2030, Automotive News reported.

    • Cosmic Cleric@lemmy.world · 1 year ago · +4/−3

      Paywalled.

      On a different subject, why would someone downvote a one-word comment that accurately describes what the content is behind?

      • stoly@lemmy.world · 1 year ago · +5/−1

        There are people who are pathologically contrarian. I’ve had to end a friendship over it—the endless need to say something negative about literally everything that ever happens and an unwillingness to be charitable to others.

      • moonpiedumplings@programming.dev · 1 year ago · +2/−1

        Because some of us have fat fingers and accidentally downvote when we scroll on mobile.

        One of the things I liked about reddit was that, since it saved downvoted posts, I could go through the list every once in a while and undownvote the accidents.

        Can’t do that here though, and I sometimes notice posts or comments I’ve accidentally downvoted.

        Anyway, people shouldn’t care so much, we don’t have a karma system or the like here anyways, so why does it matter?

        • Cosmic Cleric@lemmy.world · 1 year ago (edited) · +1

          Anyway, people shouldn’t care so much, we don’t have a karma system or the like here anyways, so why does it matter?

          Well, only speaking for myself, I don’t care, it just seemed so weird since it was an accurate single word, so I was curious.

          I also wonder sometimes if it’s a bot system purposely trying to force engagement.

          Lol trust me, I get downvotes all the time for things I say here on Lemmy. If I let them bother me I’d be in the psychiatric system by now.

          Anti Commercial-AI license (CC BY-NC-SA 4.0)

  • merthyr1831@lemmy.world · 1 year ago · +35/−11

    Love how companies can decide who has to supervise their car’s automated driving and not an actual safety authority. Absolutely nuts.

    • Trollception@lemmy.world · 1 year ago · +9/−1

      Who said there was no safety authority involved? I thought it was part of the 4 level system the government decided on for assisted driving.

  • nucleative@lemmy.world · 1 year ago · +18

    Wonder how this works with car insurance. Is there a future where the driver doesn’t need to be insured? Can the vehicle software still be “at fault”, and how will the actuaries deal with assessing this new risk?

    • machinin@lemmy.world · 1 year ago · +18

      I believe Mercedes takes responsibility if there is an accident while driving autonomously.

      • Rinox@feddit.it · 1 year ago · +12/−1

        Will it pull a Tesla and switch off the autopilot seconds before an accident?

          • T156@lemmy.world · 1 year ago (edited) · +2/−4

            If memory serves, that’s not an intentional feature, but more a coincidence, since if the driver thinks the cruise control is about to crash the car, they’ll pop the brakes. Touching the brakes disengages the cruise control by design, so you end up with it shutting down before a crash happens.
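
A toy model of that by-design behaviour (entirely hypothetical code, just to make the coincidence concrete):

```python
class CruiseControl:
    """Minimal model: any brake input disengages the system."""

    def __init__(self):
        self.engaged = True
        self.log = []

    def press_brake(self):
        # Disengaging on brake input is the intended design; the side
        # effect is that a panic stop moments before impact leaves the
        # system logged as "off" at the instant of the crash.
        if self.engaged:
            self.engaged = False
            self.log.append("disengaged: brake input")
```

So a record showing the system disengaged a second before impact can just mean the driver braked, not that the software deliberately handed off blame.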

            • nucleative@lemmy.world · 1 year ago · +5

              That makes perfect sense. If the driver looks up, notices he’s in a dangerous, unfixable situation, and slams the brakes, disconnecting the autopilot (which had been responsible for letting the situation develop), hopefully the automaker can’t simply say “not our fault, the system wasn’t even engaged at the time of the collision”.

      • Hacksaw@lemmy.ca · 1 year ago · +3/−5

        No. I don’t think this is a good solution. Companies will put a price on your life and focus on monetary damage reduction. If you’re about to cause more property damage than your life is worth (to Mercedes) they’ll be incentivized to crash the car and kill you rather than crash into the expensive structure.

        Your car should be your property, and you should be liable for the damage it causes. The car should prioritise your life over monetary damage. If a software problem is causing the cars to crash, you need to be able to sue Mercedes through a class-action lawsuit to recover your losses.

    • Hugin@lemmy.world · 1 year ago · +11

      Berkshire Hathaway owns GEICO, the car insurance company. In one of his annual letters, Buffett said that autonomous cars are going to be great for humanity and bad for insurance companies.

      “If [self-driving cars] prove successful and reduce accidents dramatically, it will be very good for society and very bad for auto insurers.”

      Actuaries are by definition bad at assessing new risk, but as data gets collected they adjust quickly. There are a lot of cars on the road, so if driverless cars become even a few percent of them, insurers will quickly be able to build good actuarial tables.

  • daikiki@lemmy.world · 1 year ago · +18/−4

    According to who? Did the NTSB clear this? Are they even allowed to clear this? If this thing fucks up and kills somebody, will the judge let the driver off the hook 'cuz the manufacturer told them everything’s cool?

    • Trollception@lemmy.world · 1 year ago (edited) · +13/−16

      You do realize humans kill hundreds of other humans a day in cars, right? Is it possible that autonomous vehicles may actually be safer than a human driver?

      • KredeSeraf@lemmy.world · 1 year ago · +16

        Sure. But no system is 100% effective, and all of those questions are legitimate and important to answer. If I got hit by one of these tomorrow, I’d want to know that the processes for fault, compensation, and improvement are already settled, not something my accident is going to become the landmark case for.

        But that being said, I was a licensing examiner for 2 years and quit because they kept making it easier to pass and I was forced to pass so many people who should not be on the road.

        I think this idea is sound, but that doesn’t mean there aren’t things to address around it.

        • Trollception@lemmy.world · 1 year ago · +3/−6

          Honestly I’m sure there will be a lot of unfortunate mistakes until computers and self driving systems can be relied upon. However there needs to be an entry point for manufacturers and this is it. Technology will get better over time, it always has. Eventually self driving autos will be the norm.

          • KredeSeraf@lemmy.world · 1 year ago · +8/−2

            That still doesn’t address all the issues surrounding it. I’m unsure if you’re just young and unaware of how these things work, or terribly naive. But companies will always cut corners to keep profits. Regulation forces a certain level of quality control (ideally). Just letting them do their thing because “it’ll eventually get better” is a gateway to absurd amounts of damage. Also, not all technology gets better; plenty of it just gets abandoned.

            But to circle back: if I get hit by a car tomorrow and all these things you think are unimportant are unanswered, does that mean I might not get legal justice or compensation? If there isn’t clearly codified law, I might not, and you might be callous enough to say you don’t care about me. But what about you? What if you got hit by an unmonitored self-driving car tomorrow and were then told you’d have to go through a long, expensive court battle to determine fault, because no one had settled it yet? You’d be in and out of a hospital recovering, draining all of your money on bills both legal and medical, to eventually, hopefully, get compensated for something that wasn’t your fault.

            That is why people here are asking these questions. Few people actually oppose progress. They just need to know that reasonable precautions are taken for predictable failures.

            • Trollception@lemmy.world · 1 year ago (edited) · +2/−3

              To be clear, I never said I didn’t care about an individual’s safety; you inferred that from my post, and frankly that’s quite disrespectful. I simply stated that autonomous vehicles are here to stay and that the technology will improve with time.

              The legal implications of self-driving cars are still being determined, as this is literally one of the first approved technologies available. Tesla doesn’t count, since it’s not an SAE Level 3 autonomous driving system. There are some references in the liability section of the wiki.

              https://en.m.wikipedia.org/wiki/Regulation_of_self-driving_cars

          • stoly@lemmy.world · 1 year ago · +2/−3

            You’re deciding to prioritize economic development over human safety.

      • stoly@lemmy.world · 1 year ago · +1/−2

        Only on closed courses. The best AI lacks the basic heuristics of a child and you simply can’t account for all possible outcomes.

  • Ultragigagigantic@lemmy.world · 1 year ago · +15/−2

    If it can drive a car, why wouldn’t it be able to drive a truck?

    I’m surprised companies don’t just build their own special highway for automated trucking and use people for last mile stuff.

  • elrik@lemmy.world · 1 year ago · +13/−4

    How is this different from the capabilities of Tesla’s FSD, which is considered level 2? It seems like Mercedes just decided they’ll take on liability to classify an equivalent level 2 system as level 3.

    • rsuri@lemmy.world · 1 year ago (edited) · +8/−1

      According to the Mercedes website, the cars have radar and lidar sensors. Teslas used to include radar, but Tesla apparently decided to move away from it towards optical-only sensing; I’m not sure whether radar currently plays any role in FSD.

      That’s important because FSD relies on optical sensors alone to tell not only where an object is, but that it exists at all. Based on videos I’ve seen of FSD, I suspect that if it hasn’t ingested the data to recognize, say, a plastic bucket, it won’t know the bucket isn’t just part of the road (or at best will notice the road looks a little weird). A radar or lidar sensor, though, directly measures distance, building 3-D data about the world without needing to recognize objects. That means the car can say “hey, there’s something there I don’t recognize, time to hit the brakes and alert the driver about what to do next”.
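
As a hypothetical sketch of that fallback logic (not any vendor’s actual code): a ranging sensor can trigger a conservative response even when the vision stack can’t classify what it sees.

```python
def plan_action(lidar_range_m, camera_label):
    """Decide a response from a direct range measurement plus an
    optional vision classification (None = nothing / unrecognized)."""
    if lidar_range_m is None:
        return "continue"          # lidar reports nothing ahead
    if camera_label is None:
        # Something is physically there but the vision stack can't
        # classify it: brake and hand control back to the driver.
        return "brake_and_alert"
    if lidar_range_m < 20.0:
        return "slow_down"         # recognized obstacle, close range
    return "continue"
```

The key difference from a camera-only system is the second branch: an unclassified return still produces a safe action instead of being treated as road surface.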

      Of course this still leaves a number of problems, like understanding at a higher level what happened after an accident. My guess is there will still be problems.

    • philpo@feddit.de · 1 year ago · +1/−1

      It’s not about the sensors, it’s about the software. That’s the solution.

      • skyspydude1@lemmy.world · 1 year ago · +1

        Please tell me how software will be able to detect objects in low- or no-light conditions if, say, the cameras have poor dynamic range and no low-light sensitivity.

  • WamGams@lemmy.ca · 1 year ago · +6/−1

    Had this been BMW I would have assumed it was an onion article.