Tesla under investigation by California attorney general over Autopilot safety, marketing

The California attorney general is investigating Tesla over the electric car company’s driver assistance technology, CNBC has learned.
Tesla autopilot is not safe, it’s killing people: https://www.washingtonpost.com/technology/2023/06/10/tesla-autopilot-crashes-elon-musk/
Just look at this insane pileup and tell me that you want to drive this crap: https://www.businessinsider.com/tesla-stops-tunnel-pileup-accidents-driver-says-fsd-enabled-video-2023-1?op=1
About the pileup case, don’t get me wrong: if the Tesla hadn’t stopped, it wouldn’t have caused the accident, that I agree… and it’s terrible and the feature should be disabled until these issues are solved.
But in the end most of the pileup was caused by people being people. It could have been a normal car, or even a Tesla that stopped for a good reason, and the pileup would have happened anyway… The first cars stopped fine, but then other cars that didn’t keep a safe distance, were going too fast, or were distracted started crashing.
You can see the pileup in the video. The Tesla phantom-brakes for no reason, and the cars driving behind slam into it.
They wouldn’t have slammed into it if they’d kept a safe distance, as @XTornado@lemmy.ml wrote. I’m in no way defending Tesla’s “Autopilot”; it should be banned until they pass a very difficult test proving true self-driving capability and multiple layers of fail-safes (which they can’t right now). But examples mixing an Autopilot Tesla doing something stupid with other people making human errors are disingenuous: if somebody drops their cigarette and brakes unexpectedly, and the cars behind don’t keep their distance and slam into them, the reason they have an accident is not the cigarette but their unsafe following distance.
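To put rough numbers on the following-distance point, here’s a back-of-envelope sketch (the 1.5 s reaction time and the two-second rule are my own assumed figures, not anything from the thread) of how far a car travels before the driver even touches the brake:

```python
def reaction_distance_m(speed_kmh: float, reaction_s: float = 1.5) -> float:
    """Metres travelled during the driver's reaction time, before braking starts."""
    return speed_kmh / 3.6 * reaction_s  # km/h -> m/s, times seconds

def two_second_gap_m(speed_kmh: float) -> float:
    """Following gap implied by the common 'two-second rule'."""
    return speed_kmh / 3.6 * 2.0

# At 100 km/h you cover ~42 m just reacting; the two-second rule
# says keep a gap of ~56 m behind the car ahead.
print(round(reaction_distance_m(100), 1))  # 41.7
print(round(two_second_gap_m(100), 1))     # 55.6
```

If your gap is shorter than your reaction distance, you hit whatever stops in front of you no matter why it stopped.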
It’s also not just Teslas that have automatic braking that can fail. Even basic cars have that “feature” now. I fucking hate it. My entire family was in the car and we almost got rear-ended super hard because ours kicked in for the right reasons, but did it super far back from the incident and fast as hell. The person behind us had less than a second to react; luckily there was no one in the next lane and they were able to swerve around. I’ve been terrified to drive it ever since.
I agree no one should be tailgating, but the algorithm needs to factor that into the situation when it happens. I knew exactly how I wanted to handle the situation, but the stupid car prevented me from doing it the safer way.
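For what “factor that in” could mean concretely: a purely hypothetical toy policy (no real AEB system is this simple, and all the names and thresholds here are made up) that brakes hard only when the collision ahead is imminent, and ramps in gently when a tailgater is close behind:

```python
def aeb_decel_g(front_ttc_s: float, rear_gap_s: float,
                hard_g: float = 1.0, soft_g: float = 0.4) -> float:
    """Toy AEB policy, in units of g.

    front_ttc_s: time-to-collision with the obstacle ahead (seconds)
    rear_gap_s:  time gap to the vehicle behind (seconds)
    """
    if front_ttc_s < 1.0:
        return hard_g            # collision ahead imminent: brake fully regardless
    if rear_gap_s < 1.0:
        return soft_g            # tailgater close, threat not yet critical: ease in
    return hard_g if front_ttc_s < 2.0 else 0.0
```

The point is only that braking intensity could be a function of rear traffic as well as the forward threat, rather than always slamming on at full force.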
Human drivers aren’t safe either, and they’re killing people; the FUD around this is concentrating scrutiny on the wrong target. The real shame here is leaving it all up to Tesla. Governments should be throwing huge piles of money at R&D, best-practice standards, and breaking down barriers to testing and deployment. Thousands of lives are being lost because we are not moving fast enough with safe self-driving tech. We have the ability to do this, and it’s a shame it’s going so slowly at the cost of driver and pedestrian lives.