After years of promising investors that millions of Tesla robotaxis would soon fill the streets, Elon Musk debuted his driverless car service in a limited public rollout in Austin, Texas. It did not go smoothly.

The 22 June launch initially appeared successful enough, with a flood of videos from pro-Tesla social media influencers praising the service and sharing footage of their rides. Musk celebrated it as a triumph, and the following day, Tesla’s stock rose nearly 10%.

What quickly became apparent, however, was that the same influencer videos Musk promoted also depicted the self-driving cars appearing to break traffic laws or struggle to properly function. By Tuesday, the National Highway Traffic Safety Administration (NHTSA) had opened an investigation into the service and requested information from Tesla on the incidents.

Let me tell you how thrilled we all are to have a new hazard added to Austin streets.

  • bloup@lemmy.sdf.org · 2 months ago

    With good integrated design, the LIDAR could be practically invisible. So weird to think the average person actually would care about the details of how something works, or maybe Elon Musk just literally cannot imagine a car that uses LIDAR without it having a big assembly on the roof.

      • nemith@programming.dev · 2 months ago

        I have a theory on this: they use a lot of ML for FSD and Autopilot, and a radical redesign would change the model's inputs enough that they'd have to retrain from scratch. So they keep the facelifts extremely minor.

        Just a theory.

        • vinniep@beehaw.org · 2 months ago

          Teslas used to have radar, not LiDAR - Tesla disabled it in software and stopped adding it to newer models.

  • darkpanda@lemmy.ca · edited · 2 months ago

    Our eyes are not perfect organs so why pretend like they are? Our eyes fail us:

    • when it’s too dark
    • when it’s too bright
    • when there’s fog
    • when there’s too much rain and snow
    • when there’s glare from the sun
    • when there’s obstructions
    • when there’s sensory overload
    • when there’s something covering our eyes like dirt and mud
    • when what we need to see lies outside the visible spectrum

    Why wouldn’t we want more incoming data to account for these shortcomings? Vision-only solutions are incomplete because our eyes are incomplete. I can’t see that a car is stopped dead in the road 10 feet ahead of me in thick fog, but an advanced set of telemetry sensors can. My eyes are not better than the sensor technology we’ve built over the past few decades, and I’ve been practicing with them for 46 years. Give me a helmet that includes LIDAR and infrared and night vision and sonar and satellite telemetry and GPS and weather tracking and god knows what else, and I’ll be much less likely to rear-end that car in the fog.

    We humans invent technology to make up for our shortcomings, so why go with the idea that “if it’s good enough for biological evolution, it’s good enough for these multi-ton contraptions hurtling down highways a few metres apart at 100 km/h every second of every day”? It sounds ludicrous on its face. We can choke on a peanut because our swallowing tube sits right next to the breathing tube, ffs. We can do better.
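    The redundancy argument above can be put as a toy sketch. The sensor names and blind spots below are made up for illustration and do not reflect any real vehicle's software; the point is only that sensors with different failure modes cover for each other when fused.

```python
# Toy sketch of sensor redundancy. Sensor names and failure modes
# are illustrative assumptions, not a real autonomy stack.

FOG, GLARE, DARK, CLEAR = "fog", "glare", "dark", "clear"

# Conditions under which each hypothetical sensor returns no usable reading.
BLIND_SPOTS = {
    "camera": {FOG, DARK, GLARE},  # vision fails in fog, darkness, glare
    "lidar": {FOG},                # lidar degrades badly in dense fog
    "radar": set(),                # radar sees through all of these
}

def fused_distance(readings, condition):
    """Closest obstacle distance from any sensor that can still see.

    readings: {sensor_name: distance_in_metres}; None if all are blind.
    """
    usable = [d for sensor, d in readings.items()
              if condition not in BLIND_SPOTS[sensor]]
    return min(usable) if usable else None

# A car stopped ~10 m ahead in thick fog: camera and lidar are blind,
# but the radar reading still comes through.
readings = {"camera": 10.0, "lidar": 10.0, "radar": 10.2}
print(fused_distance(readings, FOG))    # 10.2 (radar only)
print(fused_distance(readings, CLEAR))  # 10.0 (all sensors usable)
```

    A camera-only stack is the degenerate case: drop the other keys from `readings` and any condition in the camera's blind-spot set leaves the fused result empty.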

  • sculd@beehaw.org · 2 months ago

    Unless we figure out who is “personally” liable when an accident happens, we should not have any self-driving cars on the street.

    Right now, if someone crashes into my car, I know who is liable.

    If a self-driving car crashes into my car, are you telling me to sue Tesla? Lol, as if that is feasible.