Tesla reportedly asked highway safety officials to redact information about whether driver-assistance software was in use during crashes

Elon Musk’s Tesla has faced investigations into Autopilot, including an ongoing NHTSA probe of more than 800,000 Teslas after several crashes.

  • Eideen@lemmy.world · 3 years ago

    I don’t see the problem for Tesla. Regardless of whether the autopilot is active or not, the driver is responsible.

    • qaz@lemmy.world · 3 years ago (edited)

      Legally, yes, but in my opinion, when you market your car as self-driving, you share a certain level of responsibility when it self-drives into an accident.

    • UlrikHD@programming.dev · 3 years ago

      The driver is fully responsible, but Tesla is also making big money with misleading marketing about how good its driver assistance is. So it’s more profitable to keep people unaware of its actual capabilities.

      • Eideen@lemmy.world · 3 years ago

        Am I confusing Autopilot and FSD?

        There has been little progress in the Autopilot software, while FSD is much more feature-rich.

        Most mid- to high-end car manufacturers are making money on L2 driver assistance.