Tesla Cybertruck Drives Itself into a Pole, Owner Says 'Thank You Tesla'

  • I've owned a Model 3 for years now, and FSD is scary as hell. We haven't paid for it -- and we won't -- but every time we get a free trial (most recently this past fall), I give it a whirl, and I end up turning it off. Why? Because it does weird shit like slow down at an intersection with a green light. I don't feel like I can trust it at all, and it makes me more anxious than just using standard auto-steer and cruise control (which still ghost-brakes sometimes). I don't get why anyone uses FSD.

  • > Thank you @Tesla for engineering the best passive safety in the world. I walked away without a scratch.
    
    I walked away without a scratch. This could easily have killed an innocent pedestrian or bicyclist. How is this the best safety engineering? If FSD failed, there should have been a secondary system to detect the imminent collision and apply the brakes.

  • Post gets 8M views as of this writing. The owner doesn't want the message to go viral, because then Tesla gets flak for it. Takes the blame. Wants to share the message of FSD's fallibility with everyone. Praises Tesla for safety.

    My head hurts with how oxymoronic this is. My best guess is he wants to critique Tesla without triggering the ego and arrogance of its owner. "Thank you sir for doing great work and for fixing this problem in the future."

  • Worth reading the actual tweet, not just the article's truncation of it

    > Soooooo my @Tesla @cybertruck crashed into a curb and then a light post on v13.2.4.

    > Thank you @Tesla for engineering the best passive safety in the world. I walked away without a scratch.

    > It failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down or turn until it had already hit the curb.

    > Big fail on my part, obviously. Don't make the same mistake I did. Pay attention. It can happen. I follow Tesla and FSD pretty closely and haven't heard of any accident on V13 at all before this happened. It is easy to get complacent now - don't.

    > @Tesla_AI how do I make sure you have the data you need from this incident? Service center etc has been less than responsive on this.

    > I do have the dashcam footage. I want to get it out there as a PSA that it can happen, even on v13, but I'm hesitant because I don't want the attention and I don't want to give the bears/haters any material.

    > Spread my message and help save others from the same fate or far worse.

    https://x.com/MrChallinger/status/1888546351572726230

  • The cybertruck owner is clearly only interested in their own safety. Luckily, in my country cybertrucks are not allowed on the road, for other people's safety.

  • Some cars, when I see photos of them smashed up, I get very sad. NA Miata, Corvette C4, etc. A totaled Cybertruck, honestly, good riddance. It is an extraordinarily difficult vehicle to love.

    Very glad to hear no pedestrians got hit. Really hope the driver takes some kind of lesson away from this experience.

  • This is because Tesla's implementation hasn't worked, doesn't work, can't work, won't work, won't ever work, and has been a decade-long intentional fraud from a con artist, designed to pump up a meme stock. THAT worked.

  • More proof that self-driving cars with human backups should never be allowed on public roads. They will be used unsafely, because the arrangement encourages such behaviour.

  • At some point these reckless drivers need to start going to jail. I realize it's not going to happen in the US because the government has been captured, but there's clearly some missing messaging where these drivers don't get the point that they need to be paying attention and not tweeting on their phone while their car drives into a lamppost.

  • Interesting comments from X:

    Snowball: "So FSD failed but you still managed to find a way to praise Tesla. You failed too for not taking over in time. But your concern isn't for the lives of third parties that You and FSD endangered. No, you are worried about Tesla getting bad publicity. You have misplaced priorities."

    Jonathan Challinger (the driver who crashed): "I am rightly praising Tesla's automotive safety engineers who saved my life and limbs from my own stupidity, which I take responsibility for. [...]"

    Fair points from both sides, I think.

  • There was a post on BlueSky about this the other day. Someone linked a picture of the intersection: https://bsky.app/profile/pickard.cc/post/3lhtkghk6q224

    It is worth noting that this picture is a reply to a screenshot of someone saying the following:

      > I've lived in 8 different states in my life and most roads I've seen do everything they can to prevent human error (or at least they do once the human has shown them what they did wrong). The FSD should not have been fooled this easily, but the environment was the worst it could have been, also.
    
      - Tweet source: https://x.com/MuscleIQ2/status/1888695047044124989
    
    I point this out because I think the biggest takeaway here is how often people will bend over backwards to reach the conclusion they want, rather than update their model on the new data (akin to Bayesian updating, for you math nerds; a one-line refresher follows below). While this example is egregious, I think we should all take a hard look at ourselves and ask where we do this too. There's not one among us who isn't resistant to changing our beliefs, yet doing so is probably one of the most important things we can do if we want to improve things. If we have any hope of not being easily fooled by hype, of differentiating real innovation from cons, of avoiding Cargo Cults, then this seems to be a necessity. It's easy to poke fun at this dude, but are we all certain that we're so different? I would like to think so, but I fear that making such a claim repeats the same mistake I/we are calling out.
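
    For reference, the update rule in question is just Bayes' theorem, stated minimally (H is a belief, e.g. "FSD v13 is safe"; D is new data, e.g. this crash):

      P(H | D) = P(D | H) · P(H) / P(D)

    Motivated reasoning, in these terms, is refusing to let an unflattering P(D | H) drag the posterior down: the evidence gets explained away ("the environment was the worst it could have been") so the prior survives untouched.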

  • > And rather than being upset with Tesla for selling him a smart dumpster on wheels, the driver took blame for the incident, saying he should have been paying attention.

    In all fairness, he really should have been paying attention.

    You don't get to abdicate your responsibility to Team Elon because reasons. At the end of the day you will be sitting in the defendant's chair while Tesla will just quietly settle out of court.

  • I think it would be wise to physically test as many corner cases as possible under extreme conditions. At night, in the snow, going down a hill, birds flying across the road at the same moment a baby robot crawls onto it.

  • Was this statement made before or after the driver was contacted by Tesla?

  • Musk is currently claiming that Tesla will have driverless robotaxis on the road in June 2025.[1]

    Be afraid. Be very afraid.

    Tesla is in a bind. They've been promising self-driving Real Soon Now since 2016, with occasional fake demos. Meanwhile, Waymo slowly made it work, and is taking over the taxi and car service industry, city by city.

    This is a huge problem for Tesla's stock price and Musk's net worth. Now that everybody in automotive makes electric cars, that's not a high-margin business any more. Tesla is having ordinary car company problems - market share, build quality, parts, service, unsold inventory. Tesla pretends they are a special snowflake and deserve a huge P/E ratio, but that's no longer the reality.

    Tesla doesn't want to test in California because of "regulation". This is bogus. The California DMV is rather lenient on testing driverless cars, and California was the first state to allow them. There was no new legislation, so the DMV just copied the procedures for human drivers with a few mods. The tiers:

    - Testing with a safety driver: a "learner's permit", easy to get; quite a few companies have done that.

    - Testing without a safety driver: comparable to a regular driver's license. It's harder to get, and there are tests. About a half dozen companies have reached that point. No driving for hire at this level.

    - Deployment: like a commercial driver's license, hard to get and keep. Waymo and Zoox have it. Cruise had it, but it was revoked after a crash where someone was killed.

    That's what really scares Tesla. The California DMV can and will revoke or suspend an autonomous driving license just like they'd revoke a human one. Tesla can't just pay off everyone involved and go on.

    Waymos are all over San Francisco and Los Angeles, dealing with heavy traffic, working their way around double-parked cars, dodging bikes, skateboarders, and homeless crazies, backing out when faced with an oncoming truck in a one lane street, and doing OK in complex urban settings. Tesla has never demoed that level of performance. Not even close.

    [1] https://www.reuters.com/technology/tesla-robotaxis-by-june-m...

  • The front end of the truck is impressively smashed. It must have been going quite fast?

  • Your defective autopilot will drive you into a post and you will be grateful.

  • "Big fail on my part, obviously."

    Hello? Whether it's Full Self-Driving or not, it's always your fault.

  • I've had a 2017 Model X since new that came with FSD. I had Tesla upgrade the FSD computer (for free), and drove like a granny during the FSD trial period, when you needed a certain "score" (mostly dictated by not cornering or braking hard) to be eligible for FSD.

    I try it every major release, and am disappointed every time. In situations where I'd be confident, it is overly cautious. In situations where I'd be cautious, it's overly confident and dangerous.

    I think its best use is to keep the car in the lane while I'm distracted by something (pulling out a sandwich to eat, etc). And it seems like newer Teslas have eye tracking, so it might not even be useful for that.

  • Is there a term for when society shifts the frame of reference on all dimensions, so that anything that would once have been deemed odd or wrong now naturally ends up positive?

  • I've seen the tweet before, and the issue I have is: one person is claiming FSD crashed their car. No video, no other evidence.

    I'm not saying it wasn't FSD, but it is a possibility FSD wasn't even enabled.

  • As a general rule, a person who knows the patch level of FSD in their Cybertruck is a person who is about to say something unimaginably stupid.

  • “Think of how stupid the average person is, and realize half of them are stupider than that.”

      ― George Carlin

  • I often wonder how far Tesla would have progressed FSD if it weren't for the zealous hatred of LIDAR.

  • Why is this death trap still legal?

  • “It failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down or turn until it had already hit the curb. Big fail on my part, obviously.”

  • This is a particularly extreme case of ‘I love my Tesla, but…’

  • I’m mostly impressed that the pole held up.

  • "Full Self" Driving

  • Obviously he was... taking a nap.

  • one spooky sub is https://old.reddit.com/r/CyberStuck

    FSD is clearly not even beta quality

    people keep saying it's "trying to commit suicide"

    and it's being fixed on the fly, at the cost of everyone else's lives

    But now they are removing federal reporting requirements so buyers will NEVER know