> "requires a fully attentive driver and will display a series of escalating warnings requiring driver response."
I understand the reasoning behind it, but watching the video () of the test shows that the car did not warn the driver, and even if it had, it was going too fast, leaving the driver almost no time to respond
Disclaimer: I have never used FSD before.
() https://dawnproject.com/the-dawn-project-and-tesla-takedowns...
The real problem is that it didn't recognize and stop for the stop sign on the school bus. The child is basically an afterthought, designed to appeal to the emotions of those whom logic fails. Even if no kid had materialized, the way a bus stop works (the bus stops, then kids cross) means that detecting the kid really shouldn't be the primary trigger for stopping in this situation; the stop sign needs to be recognized and acted upon. Any ability to detect and stop for pedestrians is secondary to that.
Ignoring a stop sign, not even slowing down, that's a major safety flaw.
I am wondering if there is a safety certification body for self-driving technology. If not, one is needed, because consumers can't be expected to be aware of all the limitations of the latest update they have installed.
There must be basic safety standards these systems need to meet, a disclaimer can't be the solution here.
I think the idea of self-driving needs to be strongly re-evaluated. There are countless videos of people in their Teslas driving down the road... from the back seat. FSD is simply not feasible right now, but it seems that when people let Tesla take the wheel, they are easily duped into assuming it will always work - when it doesn't.
Until there are federal standards and rigorous testing of self-driving vehicle technologies, they should not be installed or advertised.
Regardless of one's stance on Tesla, it's sad to see this post flagged.
Before anyone says it's the driver's responsibility: that's only true while there is still a driver.
https://www.reddit.com/r/teslamotors/comments/1l84dkq/first_...
This has done the rounds on other platforms. A couple of important points:
- the failure here is that the car didn't stop for the bus on the other side of the road with the extended stop sign. (Obviously a kid running out from behind a car this late is fairly difficult for any human or self-driving system to avoid)
- the FSD version for the robotaxi service is private and wasn't the one used for this test. The testers here only have access to the older public version, which is supposed to be used with human supervision
- the Dawn Project is a long-time Tesla FSD opponent that acts in bad faith - they are probably relying on a false equivalence of FSD beta vs robotaxi FSD
Nevertheless this is a very important test result for FSD Supervised! But I don't like that the Dawn Project is framing this as evidence for why FSD robotaxi (a different version) should not be allowed, without noting that they tested a different version.
How much of Tesla's FSD is learned from watching a lot of recordings of human drivers and how much of it is explicitly coded rules?
I'd expect school bus stop signs to be challenging to learn because of how context-dependent they are compared to regular stop signs. Some examples, drawn from the rules in my state (Washington), are below. These may be different in other states.
With a regular stop sign you stop, then you go when it is safe to do so.
When you stop for a school bus sign you are supposed to stay stopped as long as the sign is there. You don't go until the school bus retracts the sign. It is essentially like a red light rather than a stop sign.
Whether or not a school bus stop sign applies to you depends on the direction you are traveling and on the structure of the roadway.
It applies if either of the following is true: (1) you are traveling in the same direction as the bus, or (2) it is a two-lane roadway and the lanes are not separated by a physical barrier or median. With three or more lanes, or a barrier or median, cars traveling opposite the bus don't have to stop. A rough sketch of that logic is below.
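To make the context-dependence concrete, here's a minimal sketch of that rule in Python. All names and parameters are illustrative (nothing here is from any real FSD codebase), and the logic is only my reading of the Washington-state description above; other states differ.

    def must_stop_for_school_bus(
        same_direction_as_bus: bool,
        lane_count: int,
        separated_by_barrier_or_median: bool,
    ) -> bool:
        # Traffic traveling the same direction as the bus always stops.
        if same_direction_as_bus:
            return True
        # Oncoming traffic stops only on a two-lane road with no
        # physical barrier or median between the directions.
        if lane_count <= 2 and not separated_by_barrier_or_median:
            return True
        # Three or more lanes, or a barrier/median: oncoming traffic
        # is not required to stop.
        return False

    # Oncoming car on a two-lane road with no median must stop.
    assert must_stop_for_school_bus(False, 2, False) is True
    # Oncoming car across a median does not have to stop.
    assert must_stop_for_school_bus(False, 4, True) is False

And unlike a regular stop sign, the "stop" state persists until the bus retracts the sign, so the real logic would have to be stateful rather than a single predicate like this.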
It's another Dan O'Dowd test. I'm now convinced that all of his tests are elaborate fakes, since he has a constant stream of super dangerous failures, but none of them are reproducible by anyone else.
I don't know why this is flagged unless it's just the Tesla/Musk association. I thought that self-driving vehicles are a popular topic on HN.
Why on earth can't this be done by normal testing agencies? Why do things like "Tesla Takedown" have to participate in it? Even if the test was 100% legitimate, that connection to a protest movement taints it immediately. It's like when oil companies publish critical research on climate change, or Apple publishing research that AI is not that good and that its own fumblings should not be seen as a bad omen. This kind of stuff could be factually completely correct and most rational people would still immediately dismiss it due to the conflict of interest. All this will do is fire up fanboys who were already behind it and get ignored by people who weren't. If the real goal is to divide society, this is how you do it.
This is a Dan O'Dowd production. He's spent significant capital trying to take down Autopilot/FSD for years and has played dirty in the past.
It seems crazy to me that the U.S. allows beta software to be used to control potentially lethal vehicles on public roads. (Not that I have much faith in human-controlled lethal vehicles)
> The tests were setup by anti-Tesla organizations
:)
Any good reason to believe these tests?
Hopefully the dummies were from the board of directors
Isn’t FSD modeled after real drivers?
In that case it could’ve learnt that almost nobody ever fully stops at a stop sign, or in this case for a bus stopped with its stop sign extended.
Why are Tesla-related posts still being flagged? Mr Musk stepped out of his governmental role, so criticism of his assets is no longer unavoidably political. My understanding was that, for the past few months, criticism of Tesla was seen as a political action, and that many here don't want any inflammatory political discussions about the current US administration, but what's the current reason for flagging? This is surely tech/business news through and through.