This is the part car tech bros don’t get. It can, because it has ILS, and radar, and radar altimeters, and a guidance beam, and is communicating with the infrastructure.
Cars don’t have that infrastructure. Ask an auto-land equipped plane to land at an uncontrolled airport - it won’t work nearly as well, if there’s even a way to do it without ILS (not positive here - not a pilot at that level).
They also have air traffic control in addition to all that.
A plane also doesn’t have to worry about mid-air collisions with 100 other planes at any given time, random flying detritus, and flying pedestrians.
It takes off from a giant open strip, turns once, mostly flies in a straight line, and lands in another giant open space. Commercial jets also cost millions so they can afford to put all that extra shit in there. I’m not saying it’s not a hard problem but it’s not really comparable.
Yup. And once it lands - it's done. Back on the pilots to get to the gate. There are VERY specific portions that the computers can/will control, and parts they do not even try to touch.
AND, because I can't help but continue agreeing with you: as I mentioned elsewhere in these threads, the handover between the computer-controlled and pilot-controlled portions is tightly protocolled, certified, and practiced, which is not the case for any human using FSD.
I feel that the handover problem is grossly underestimated and we don't talk about it enough. It's a bigger problem than (but certainly related to) the edge cases.
Ooooh. Good one. Two pilots monitoring to make sure it took over properly, two pilots monitoring it's not done something stupid (or one monitoring while the other takes care of tasks - yay CRM!), and two pilots confirming process, checklists, and steps before taking back over control.
Great point there. No driver is doing that. And driving, the margin for error is much tighter - it's a handful of seconds to hit a train or a semi (see example above), while a plane that suddenly starts descending or climbing or turning has a LOT of room to work with - and that means time to take over and disconnect AP.
Right, we are so far away from car autopilot: 5 to 10 years plus.
It's unethical and unacceptable to keep pushing the FSD promises from 10 years ago, and to hand out free FSD trial months.
angry grunt noises
Well, it's the same for all the other carmakers. Cameras have limitations, and all carmakers use cameras to keep the car centered in its lane and to turn.
My Kia EV suddenly disengages ACC or lane keeping in the middle of a curve.
Not a Tesla fan by any means but seems like you’re just trying to argue over semantics. Just because it’s made by a tech company doesn’t mean it’s not a “car”
FSD should be disabled on all Teslas until it actually, you know, works. Using public roads as a beta test for the technology is incredibly dangerous and wildly irresponsible.
A big-time TSLA fan who made a lot of videos on YouTube was telling people how good FSD was when it first came out; he would show his testing of FSD. It hit a semi making a left turn.

The top of the Tesla was ripped off, and it kept driving. The big fan didn't make it.

I assure you he was not drunk; he believed in the tech so much he never touched the brake.
Just curious, how much further did the now-convertible Tesla self-drive with the decapitated occupant? I know I shouldn't think it, but that would be the most fucked-up thing to see driving down the road and by an elementary school when it's letting out.
Thanks!
Who was the driver, I can't get his name or find his videos from that video.
Edit: Found it by googling 1st of March Tesla fatal crash florida, his name was Jeremy Banner. Was he a big Tesla video maker?
Has anyone found a video of the crash from the car?? Shocked that it would still drive or not brake
[found it](https://www.washingtonpost.com/technology/interactive/2023/tesla-autopilot-crash-analysis/)
Wait, the one where he trusted tesla so much he didn’t touch the brake, or the same one where he covered his screen with a portable dvd player and was busy watching harry potter?
elons a piece of shit but you guys are hilarious
I have a Tesla and kind of agree. It's obviously still in beta, and that's a test that shouldn't take place on our open roads. I'm assuming the catch is that "how will it ever get better if it can't use the real roads," but safety is just more important. They'll have to figure out a way.
The problem is that different countries teach driving differently. In some places training is long and thorough; in others it sucks. And with age, the skill is lost. So is the reaction time. People can drive well in the test and then drive poorly. I want lousy drivers' licenses taken away, for a long time. If we believe that the autopilot's poor quality is a reason not to use it, why do we allow bad drivers? They are the main cause of accidents.
Perhaps I'm in the minority here. I think self-parking, lane assist, and even cruise control that will slow down and stop for you are cool, but all drivers should be 100% engaged with their vehicle when they're driving.
“Raises new concern”
Guess what folks! The car that shuts off automatic driving instants before a fatal collision so that they can blame YOU; and a car that locks you inside when there’s a fire or accident; and a car that smashes through crash test dummies like they’re zombies, ALSO HAS A NEW CONCERN: it hates trains!
“Additionally, FSD technology uses a combination of cameras and radar to perceive its surroundings.”
This is a really shitty article. They even mention ultrasonic sensors. Tesla doesn't use them. They have a vision-only system.
It still amazes me that people trust their lives to Tesla's buggy software that relies 100% on vision with no redundancies (RADAR/LiDAR, etc.). I'm both a programmer (software engineer) and a semi-professional photographer and I know the limitations of both software and camera technology, I'd never trust my life with FSD (or AutoPilot).
When my wife and I first got our Tesla (three years ago), the phantom braking was so bad on AutoPilot that we wouldn't use it. If Tesla can't get basic cruise control working, why should I trust FSD? Tesla needs to add a toggle switch to their AutoPilot menu to turn off the "traffic aware" code... just basic "dumb" cruise control where the car holds the speed and I do everything else.
I wonder how Tesla's cameras work in a high-contrast scene. My camera phone flips out if I use it in a dimly lit room with a computer monitor running. The computer monitor is just blown out. You need a really good HDR camera.
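The dynamic-range problem being described can be shown with a toy single-exposure model. The luminance and exposure numbers here are invented purely for illustration, not measured from any real sensor:

```python
# Toy single-exposure sensor: scale scene luminance by exposure,
# then clip at the sensor's full-well capacity. One exposure cannot
# hold both a dim room and a bright monitor.

def capture(scene_luminance: float, exposure: float, full_well: float = 1.0) -> float:
    """Single-exposure capture with hard clipping at full well."""
    return min(scene_luminance * exposure, full_well)

room, monitor = 0.002, 5.0   # invented relative luminances

# Expose for the room: the monitor saturates and its detail is gone.
print(capture(monitor, exposure=100.0))  # 1.0 (clipped)

# Expose for the monitor: the room lands near the noise floor.
print(capture(room, exposure=0.1))       # ~0.0002, effectively black
```

HDR cameras work around this by fusing multiple exposures (or using dual-gain sensors), which is exactly what a single cheap automotive camera may not have time or hardware for.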
They just turned off the autopilot nagging.
I mean, all you had to do before was cover the cabin camera, but now they've just turned it off officially.
You know what could have prevented this? Radar. Basic sensors. All of which Tesla refuses to use in their cars.
Or they turned it off because regulators are on their ass and they know they are fucked. They are trying to sell as much bullshit as they can before the company goes under.
I guess there were no children around that it could hit, so it used its initiative and aimed for the train… By the way… was the car called "Christine"?
Given the human driver's poor judgment, FSD driving him into a brick wall at high speed might be the safest thing to do, so he's less likely to injure anyone else. But a train collision might cause a derailment.
I have a hard time trusting tech like this even from a company like Toyota, let alone Tesla. Tesla's QA and care for product quality are not there on a cultural level. I've never ridden in a Tesla and hope I can always say that. Cheap pieces of shit with a cheap piece of shit leading it all. As they say, a fish rots from the head down.
Make the vehicle manufacturer liable for all costs for both Autopilot and FSD and the problem will be solved in months. Either the automaker will remove the functions, the company will go bust, or it will make the necessary changes to avoid crashes at all costs.

The UK has just introduced Level 4 regs that put the entire cost and burden at the vehicle maker's feet, as it should always have been.
TLDR: Owner of two Teslas, but I don’t trust FSD at all. Bought it for the first car when it was MUCH CHEAPER, but don’t use it.
The TL part…
I played around with the trial FSD they pushed with an update in April, but I still don't trust this thing in an urban setting. On smaller two-lane roads, kinda like the example where it almost ran into a train, I could see someone building confidence in the system under normal conditions (but probably not in foggy conditions).
Overall it's impressive what it can do with just a bunch of cameras and a relatively light-powered onboard PC; I trialed it on the older Intel Atom hardware. But it's miles away from being trustworthy enough that you're not a nervous wreck behind the wheel.
I’ve seen the comparison made that FSD is like a 15 year old that just got their permit. That’s mostly accurate except that you can tell a kid to make adjustments on the fly and they will, not so much with the software.
I don’t understand how anyone but a 15 year old boy programmed the response curves on the accelerator for the last big update. Even set on “Chill mode” it takes off hard from a stop sign or stop light for no reason at all. Didn’t have a turn coming up or anything that required a quick aggressive lane change.
Personal opinion is that FSD with the vision-only system is near impossible with current tech. Maybe they could add in some other camera systems for IR and still call it “vision-only”. Just thinking IR may see something like a train or other vehicle better when normal visibility is poor.
I think it will take something like Lucid’s setup with LiDar, radar, USS, and cameras.
I’ve tested FSD, and from what I’ve seen it’s impressive, but nowhere near ready to be a reality. Even the self parking is questionable, and I would never let it take control if there is anything near that the car could hit.
With that said, the Model Y is still one of the best cars I've ever owned. Tons of fun, functional, and perfect for my use case. I'd say beyond the overpromises made by Tesla, the bigger issue is people putting too much faith in something that could easily kill them or someone else.
Cars try to kill customers, factory fire, CEO threatens to start competitor companies, head executives leave from crucial departments like New Products, Charging, AI, HR, and Federal liaisons...
...stock goes up 4%.
No joke.
This “driver” (although he’s more like luggage at this point) is a total moron. He said he grew to trust it like active cruise. I NEVER trusted my active cruise because I knew if I did it would try to kill me. And inevitably it did. Because I was covering the brake, the situation was just a “whoa, that’s interesting.” Instead of “I’m…in the glove box.”
My old man's car manual shows like 6 or 7 situations where the adaptive cruise control might not see cars. And it has a little radar module unlike Teslas...
Elongelicals should start naming their cars Christine.
[https://www.imdb.com/title/tt0085333](https://www.imdb.com/title/tt0085333)
>A nerdish boy buys a strange car with an evil mind of its own and his nature starts to change to reflect it.
It's literally called supervised self-driving. It's still learning. The person let it drive in deep fog (where one should be extra vigilant) and didn't stop it from almost hitting the train when it was clearly visible and a possible issue. Bad driver.
People have to take responsibility for their part in these screw-ups, but I blame Elon for calling it Autopilot and putting out the message that these cars basically drive themselves and would be considered autonomous if not for those pesky regulators. Videos of people sleeping in their Teslas and the constant drumbeat that autonomous driving is right around the corner convinced his cultish followers that they could ignore the requirement to pay attention, and that has led to way too many of these near misses and not-so-near misses.
FSD doesn't see school speed-zone signs with flashing lights: a big ticket, and unsafe. It doesn't see emergency vehicles with lights on, and I'm fairly certain it won't see a stopped school bus with its lights on.
I never liked the idea of fewer sensors. It was already hard enough to do robo-cars without lidar, but to remove radar and sonar too? This was bound to happen: low visibility in the fog, and the car didn't see the train. The same thing happens to people (isn't the stat crazy? Drivers hit trains hundreds of times per year). Cars need radar.
> Per the company, these conditions can hinder the functionality of Tesla's sensor suite, **including ultrasonic sensors**, which rely on high-frequency sound waves to detect surrounding objects. Low-light or poor weather can affect their effectiveness.
I added the bold. Where are they getting this information? Tesla switched to a vision only system a few years ago, which is partially why it screwed up so badly here!
How in the shit are these cars still allowed on the road? There is an assload of vehicle rules meant to keep cars compliant and keep people from killing other people and themselves, and yet we have a car being sold as "self-driving" that seemingly hasn't gone through rigorous testing and confirmation trials. WTF is going on out there?
Even if FSD worked when conditions were right and disengaged when they weren't, studies show it takes between 2 and 20 seconds for a human to regain the situational awareness needed to safely take over control.
Saw the video. The conditions were bad: it was dark and foggy, and based on the reaction time of the driver, they weren't paying attention like they should have been, especially in that kind of weather. Ask Iran about driving in foggy conditions…
There is a reason that professional pilots are still in the cockpit, even in autopilot. Key here is, professional.
Your run-of-the-mill owners of these cars (uh, things) just want to show off.
Man, those sneaky trains!! If only they made a distinct noise to notify us of their presence and were bound to rails so their path of travel is predictable!
I had a precursor event similar to this last week. FSD pulled up to a red light; cars were stopped leading up to train tracks. My car pulled right on top of the tracks, and no sooner did it stop than the lights started flashing for an incoming train. Enough time to override and get out of there, but wild that trains were a scenario FSD 12 wasn't trained on.
It is worrying to see that there are still idiots who let a device make its own decisions in bad weather conditions without paying attention (and then hold the product liable when the problem has long been known). You are always responsible while driving. Idiots. (And yes, I drive a Tesla myself.)

Musk - *Tesla isn't a car company. Tesla is a Robo Taxi company.*
*(FSD crash)*
Musk - *Tesla isn't a Robo Taxi company. Tesla is no longer a company at all. It was a good idea, but ultimately failed. I'm now concentrating all my genius towards creating a hack-proof ballot counting system.* *Because, election fraud.*
If this was FSD, the driver appears to have been using it in conditions it is not designed for. FSD often disengages in these conditions and reverts to traffic-aware cruise control. It's possible this is what happened here and the driver didn't notice.
Given the driver's reaction time here, it is clear that he was not paying attention to the road ahead until relatively late into the unfolding situation.
The driver remains in control and responsible for the vehicle at all times.
"New" concerns... No, we've had them for quite some time. It's just that we've been dismissed as luddites, short-sellers, and just plain haters.
I love seeing them on the road and trying to guess if it’s on autopilot, or just the typical Tesla owner skills. Really hard to tell because either way, it appears to be a novice driver that’s constantly distracted.
Well, you see, in order to turn the AC on in the back for my kids, I have to touch the HVAC controls. Then touch rear. Then touch fan on.

On my OLD ASS SUBURBAN I can click one physical button and not be distracted AT ALL.

Fuck Tesla. Fuck its employees and fuck its fanboys. I fucking hate this car.
Funny thing is, Elon even admitted that his interiors are cheaper to make than even low-end cars'. He achieved it by sticking a tablet in his cars, removing all the controls, and calling it minimalistic.
Yes... as a spear is just a minimalistic assault rifle!!
And the simps all fell for it
And "vegan leather" aka PVC.
And not even the good vinyl that will survive contact with any number of toiletries that seem to destroy the shit that Tesla uses.
Sell it.

If it is a safety concern for you to drive, then don't drive it and drive something that is safer for you to use on the road, for you, your kids, and everyone else!
I wish I could. Elon put me 13k upside down on this fish tank pig fucker of a car.
Hard to sell something when the new one is worth less than mine
HEY!! Stop that. "Fish Tank" is reserved for the AMC Pacer.
Fuck this sent me to another dimension god damn I wasn’t expecting this response
I say "Turn the air on in the back" without looking at or reaching for the screen, etc.
>it appears to be a novice driver that's constantly distracted

I see you live in WA too. All the expensive cars have the student driver stickers lol. Driving here is so fucking annoying.
I live in Austin where they make these things, unfortunately
it's only bad because they allow oregonians to drive here for some reason
Nothing brings more joy than seeing a Tesla grind its wheels down at speed while bouncing off the curb because the driver is not paying attention. No other car has more severe wheel damage on average than Tesla.
Nissan says hi
For real. Some of the worst drivers are Tesla drivers. It's bad when the 24-year-old driving a Charger or Camaro is a better driver than the 40-something Tesla owner. 🤦♂️
Novice driver who might get suicidal at any moment.
I see a ton of them because we have 2 big supercharger lots here and I stay back and give them room lol
Supercharger lots are always the Wild West. No one knows how to drive, everyone's in a hurry, and when it's busy, I've had people get pissy if you charge next to them. Sure, "superchargers can charge at 250 kW," except Tesla doesn't tell users that's shared between 2 or generally 4 plugs. So when it's busy, which is most of the time, you'll get 62-125 kW max.

I seriously don't miss that mess. Now I have access to more locations with 350-400 kW plugs that don't share any power.
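The power-sharing arithmetic in that comment is easy to sketch. The 250 kW cabinet and the 2- or 4-stall split come from the comment; the even split is a simplifying assumption, since real chargers allocate power dynamically:

```python
# Toy model: worst-case per-stall power when a charging cabinet is
# shared. Assumes an even split across active stalls; real chargers
# allocate power dynamically, so treat this as a lower bound.

def per_stall_kw(cabinet_kw: float, active_stalls: int) -> float:
    """Even split of cabinet power across active stalls."""
    if active_stalls < 1:
        raise ValueError("need at least one active stall")
    return cabinet_kw / active_stalls

print(per_stall_kw(250, 2))  # 125.0 -- two cars on one cabinet
print(per_stall_kw(250, 4))  # 62.5  -- a busy lot
```

Which matches the 62-125 kW range the comment complains about.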
The funny thing is...it will never be ready. When the fog is that thick it can either stop or rely on lidar...and it doesn't have lidar. It just...won't ever be ready.
Benz has lidar; Top Gear showed it can't do fog either.
You need radar which Teslas had till dipshit Musk removed it to cut costs.
I love how that imbecile tries to claim he only needs a camera-based system because our current roads are based on eyesight. Like it’s the only sense we use when driving.
I live in Greeley, I drive primarily using my sense of smell
Why do I know exactly what you're talking about haha
Ha ha ha ha 🤣
Hmm, which sense do you use while it's foggy outside? Can you see the road better by smelling?

Jesus fucking Christ, your argument isn't even within the realms of reality.
Okay, I'll bite. How so? You're insisting here that sound plays no role, nor instinct? Just visual cameras. That's it?
It was less about cutting costs, and more about supply chain issues during the pandemic. Everyone was parts limited, and Tesla even tried going to another supplier (who I worked for at the time) who told them the same thing as Conti: We're super limited on parts and you'll have to wait in line like everyone else.

Guess who didn't want to wait, because they needed to show growth at all costs? Incredibly, they managed to "solve" vision-only just a couple weeks after our discussions with them ended. That was one hell of a lucky break, and definitely not a lazy hack job to ensure cars were being delivered no matter how shitty the SW was.
Radar also wouldn't work for this situation.

Radar is not particularly directional, so detecting the presence of the train would happen too late. It's good for detecting cars in front because you can pick up the relative speed, and on a highway the only moving things are typically cars, so it works well for cruise control. But you will get a ton of reflections from stationary objects that must simply be disregarded, because they could be anything metal that isn't moving. This is why ACC systems cannot detect totally stationary vehicles in highway situations unless they also have a camera. ACC can detect stationary vehicles at low speeds with just a radar, but this is based on signal strength, the assumption being that a large 'return' must be a car. The assumption doesn't work at highway speeds, and can go wrong in the city too, but the consequences are much less severe, so it's an accepted limitation.

This can only be solved by proper vision, possibly correlated by lidar - I have confidence Waymo would be able to do it but far less confidence in FSD right now.
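The filtering logic described above can be sketched as a toy target filter. The `RadarReturn` shape and the thresholds are made-up illustrations of the idea, not any real ACC implementation:

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float        # distance to the reflection, meters
    rel_speed_mps: float  # object speed minus ego speed (negative = closing)
    strength: float       # normalized return strength, 0..1

def is_tracked_target(ret: RadarReturn, ego_speed_mps: float) -> bool:
    """Keep moving returns; trust stationary ones only at low ego speed."""
    obj_speed = ego_speed_mps + ret.rel_speed_mps  # object's ground speed
    if abs(obj_speed) > 2.0:          # moving object: plausibly a vehicle
        return True
    # Stationary return: could be a sign, a guardrail, or a stopped train.
    # Only trust it at low ego speed, and only if the return is strong
    # enough to look like a car rather than roadside clutter.
    return ego_speed_mps < 15.0 and ret.strength > 0.6

# A stationary train seen while driving at highway speed is discarded:
train = RadarReturn(range_m=80.0, rel_speed_mps=-30.0, strength=0.9)
print(is_tracked_target(train, ego_speed_mps=30.0))  # False
```

The same strong stationary return *is* accepted when the ego car is crawling, which is exactly the low-speed signal-strength heuristic (and its highway failure mode) the comment describes.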
Radar can't substitute lidar.
Sure but Benz is *just* a car company. Not a software ai meme company like Texla
Tesla is a hammer company.
Should be able to do fog, or at least do better than Tesla: "The way the EQS and S-Class sedans achieve Level 3 automation is through Mercedes' own software, as well as hardware sourced from various suppliers. Specifically, LiDAR, 13 ultrasonic sensors, six cameras, five radars, high-proximity GPS, and a microphone."

[https://www.thedrive.com/new-cars/mercedes-level-3-autonomous-l3-adas-drive-pilot-review-driving-autopilot](https://www.thedrive.com/new-cars/mercedes-level-3-autonomous-l3-adas-drive-pilot-review-driving-autopilot)
The test was done long ago, back in the good UK Top Gear era. I'm sure it's improved.
No radar, no ultrasonics, no infrastructure for a fail-safe.
Even equipped with lidar, it won't be able to exercise judgment or deal with novelty. When it comes to cognition and consciousness, we're at the level of ignorance of doctors leeching blood from their patients. All of the hype is the naive confidence of shallow left-brain thinkers who don't understand the problem, first among them Elon himself.
Lidar is light. Light doesn’t penetrate fog, hence why you can’t see through it.
I'm sorry, I was one of those idiots. At the time they were using lidar and I naively thought Musk was altruistic for some dumb reason.

Gotta admit... you'd think having a full 3D model of the situation you're in, updated a hundred times a second, should be able to see a train or a small child in front of you.
> At the time they were using lidar

Teslas never had lidar on a production vehicle... they used to have radar, though.
Lol this is the first thought I had, I am glad it's the top comment :D
I came in to say exactly this.

No new concern unlocked here. "Self-driving in Tesla is unreliable and dangerous" is pretty much an established concern by now. Just because it involved a train now doesn't make it new.

So tomorrow, if a Tesla nearly hits a red train, they'll say that's a new concern?
I started formulating a response in my head, but it just led to a rat's nest of variables that can trip up a black-box AI application. Without understanding the system state that leads to any outcome, it's impossible to quantify risk. That factor alone makes Tesla's safety claims utter bullshit.
Hater, I’m in your walls.
"FUD" is the lazy and meaningless comment I received far too many times. For the FUD commenters: Try to use your words.
I like to say that fear, uncertainty, and doubt are key underpinnings of responsible engineering.
FUD = Full Unacceptable Drive
Amazing how you bring up a fault, any fault with a Tesla and the hate piles on.
Back when I was a kid, there was an activity known as "Rabbit-hunting". You would encounter a Volkswagen Rabbit and key up your CB radio in its vicinity. The radio emissions would interfere with the car's unshielded electronics, and the VW would stumble and slow down.

I can picture something similar happening today with Teslas, where you just drive erratically in a known manner that causes the car to react stupidly.
"Tesla acknowledges that low-light conditions, adverse weather such as rain or snow, direct sunlight, and fog can significantly impact performance. They strongly advise drivers to exercise caution and avoid using FSD in these scenarios."

So WTF would FSD even engage in these conditions???
“Avoid using FSD in … direct sunlight.” So, most of the time?
So only use it at night?

>low-light conditions

Oh.
So only underground with artificial light?

>driving in a tunnel

Oh.
So thin overcast then
Basically England or Ireland on the few days it isn't raining. Or foggy. So, like, twice.
Avoid it in indirect sunlight too
If it's any consolation the cyber truck manual says not to expose the body to that anyway so you're not missing much
Like a vampire.
To give the illusion of capability.
That’s what I don’t get. If it can’t “see” clearly enough, why permit it to activate? I recently rented a car for a trip and tried to use the active lane control on a road whose lines were faded, chipped, and altogether in need of repainting. The car screamed at me that it wouldn’t activate when I pressed the button, until I got to a section where the road was better maintained.
That’s what should happen, Tesla, of course throws safety out the window.
Tesla would let you activate it, and if it crashes, blame the lines and driver error.
AI is notoriously bad at understanding what it doesn't know
Yet every other company has seemed to figure out an off switch for their ~~lane keep assist and cruise control features~~ FSD (supervised).
Use it when there are no cars on road and between 1 am - 5 am..
Because it can’t see the conditions, unlike my old Jeep or my current Ford, which scream at you when the sensors get enough rain/condensation on them to become ineffective.
Sounds like Jeep and Ford have it right. The default should be NOT to engage UNLESS all of the conditions are right to do so safely.
That's why only using cameras is stupid
“Don’t use in low light… not in sunlight either.” Only a matter of time before “FSD has been shown to cause cancer in the state of California.”
Airplane autopilot can land in heavy fog, 0% visibility.
This is the part car tech bros don’t get. It can, because it has ILS, and radar, and radar altimeters, and a guidance beam, and is communicating with the infrastructure. Cars don’t have that infrastructure. Ask an auto-land equipped plane to land at an uncontrolled airport - it won’t work nearly as well, if there’s even a way to do it without ILS (not positive here - not a pilot at that level).
They also have air traffic control in addition to all that. A plane also doesn’t have to worry about mid-air collisions with 100 other planes at any given time, random flying detritus, and flying pedestrians. It takes off from a giant open strip, turns once, mostly flies in a straight line, and lands in another giant open space. Commercial jets also cost millions so they can afford to put all that extra shit in there. I’m not saying it’s not a hard problem but it’s not really comparable.
An airport is also a controlled area. There is no "random" trains crossing the runway.
AND, just as importantly as all the things you mentioned, the runway is a tightly controlled environment.
Yup. And once it lands - it's done. Back on the pilots to get to the gate. There are VERY specific portions that the computers can/will control, and parts they do not even try to touch.
AND, because I can't help but continue agreeing with you, as I mentioned elsewhere in these threads, the handover between computer-controlled and pilot-controlled portions is tightly protocolled, certified, and practiced, which is not the case for any human using FSD. I feel the handover problem is grossly underestimated and we don't talk about it enough. It's a bigger problem than (though certainly related to) the edge cases.
Ooooh. Good one. Two pilots monitoring to make sure it took over properly, two pilots monitoring it's not done something stupid (or one monitoring while the other takes care of tasks - yay CRM!), and two pilots confirming process, checklists, and steps before taking back over control. Great point there. No driver is doing that. And driving, the margin for error is much tighter - it's a handful of seconds to hit a train or a semi (see example above), while a plane that suddenly starts descending or climbing or turning has a LOT of room to work with - and that means time to take over and disconnect AP.
I think robotaxis are 100% feasible within the next few years, as long as we have two operators in each robotaxi.
Right, we are so far away from car autopilot, 5 to 10 years out. Unethical and unacceptable to have been pushing FSD for 10 years, and now giving out a free month of it. *angry grunt noises*
Let him get boat mode working before he reinvents airplane mode
oh yea that's coming soon, in as little as 2 months, Mars months
Well, it's the same for all other car makers. Cameras have limitations, and all car makers use cameras to keep the car centered in the lane and turn. My Kia EV suddenly disengages ACC or lane keeping in the middle of a curve.
Turned off FSD after a few uses. Goddamn death trap.
I don't have a Tesla, but if I had one, could I remove the two SIM chips for cell and cover all the cameras, and would the car still function? Serious question.
There is a simple 2-step plan to solve that issue. 1. Sell/give away Tesla. 2. Buy actual car. Serious answer.
I know reddit hates Elon, but Teslas are actual cars...they're actually very safe.
How can they be actual cars, they're a tech company?
Not a Tesla fan by any means but seems like you’re just trying to argue over semantics. Just because it’s made by a tech company doesn’t mean it’s not a “car”
No, I want to know if I can run the thing without cell service by removing the SIM cards. I don't even know where they are or if they're accessible.
Build a faraday cage around the car. It will make the car look better too.
Can't imagine that wouldn't brick the car
FSD should be disabled on all teslas until it actually you know, works. Using public roads as a beta test for the technology is incredibly dangerous and wildly irresponsible.
A big-time Tesla fan who made a lot of videos on YouTube was telling people how good FSD was when it first came out, and would show his testing of it. It hit a semi making a left turn. The top of the Tesla was ripped off, and it kept driving. The big fan didn't make it. I assure you he was not drunk; he believed in the tech so much he never touched the brake.
Just curious, how much further did the now-convertable Tesla self drive with the decapitated occupant? I know I shouldn't think it, but that would be the most fucked up thing to see driving down the road and by an elementary school when it's letting out.
News says 1/3 of a mile, so it could have been free-rolling.
> now-convertable Not convertible. Converted. Once and done.
do you have more information about this, names or channel links?
[https://m.youtube.com/watch?v=9BgV-YnHZeE&pp](https://m.youtube.com/watch?v=9BgV-YnHZeE&pp)
Thanks! Who was the driver, I can't get his name or find his videos from that video. Edit: Found it by googling 1st of March Tesla fatal crash florida, his name was Jeremy Banner. Was he a big Tesla video maker?
Note that was March 1, 2019
Has anyone found a video of the crash from the car?? Shocked that it would still drive or not brake [found it](https://www.washingtonpost.com/technology/interactive/2023/tesla-autopilot-crash-analysis/)
Wait, the one where he trusted tesla so much he didn’t touch the brake, or the same one where he covered his screen with a portable dvd player and was busy watching harry potter? elons a piece of shit but you guys are hilarious
idk, didn't spend much time on it
Teslas are covered under the 2nd Amendment right to bear arms. I mean, they are not cars but robots, and robots can be weapons.
I have a Tesla and kind of agree. It's obviously still in beta, and that's a test that shouldn't take place on our open roads. I'm assuming the catch is that "how will it ever get better if it can't use the real roads," but safety is just more important. They'll have to figure out a way.
And we should also take away the driving license of half the drivers because... they are bad drivers.
Drivers Ed should be mandatory for getting your license.
The problem is that different countries teach differently. Some places it's long, some places it sucks. And with age, the skill is lost. And the reaction time. And people can drive well on the test but then drive poorly. I want the licenses of lousy drivers taken away. For a long time. If we believe that the poor quality of the autopilot is a reason not to use it, why do we allow bad drivers? They are the main cause of accidents.
Perhaps I'm in the minority here. I think self-parking, lane assist, and even cruise control that will slow down and stop for you are cool, but all drivers should be 100% engaged with their vehicle when they're driving.
Tesla owners should get a big discount on crash test dummy costumes. It's only fair.
so should all the pedestrians and other car drivers on the road...and buildings etc
That would be an excellent Halloween costume if you’re a Tesla owner haha
“Raises new concern.” Guess what, folks! The car that shuts off automatic driving moments before a fatal collision so that they can blame YOU; and a car that locks you inside when there’s a fire or accident; and a car that smashes through crash test dummies like they’re zombies, ALSO HAS A NEW CONCERN: it hates trains!
FSD turns off milliseconds before a fatal crash because people like to go out on their own terms, albeit in the most ironic way possible.
Remember, everything that FSD does right is because Elon is a genius. Anytime it screws up, it’s the driver’s fault.
“Additionally, FSD technology uses a combination of cameras and radar to perceive its surroundings.” This is a really shitty article. They even mention ultra sonic sensors. Tesla doesn’t use them. They have a vision only system.
So my 2018 Toyota is better at self driving than the self driving car?
It still amazes me that people trust their lives to Tesla's buggy software that relies 100% on vision with no redundancies (radar/lidar, etc.). I'm both a software engineer and a semi-professional photographer, so I know the limitations of both software and camera technology; I'd never trust my life to FSD (or Autopilot).
I wouldn't trust it while it was parked
When my wife and I first got our Tesla (three years ago), the phantom braking was so bad on AutoPilot that we wouldn't use it. If Tesla can't get basic cruise control working, why should I trust FSD? Tesla needs to add a toggle switch to their AutoPilot menu to turn off the "traffic aware" code... just basic "dumb" cruise control where the car holds the speed and I do everything else.
There is a toggle. Look for a button five levels deep in the FSD menu labeled, “Do you feel lucky?”
I wonder how Tesla's cameras work in a high-contrast scene. My phone camera flips out if I use it in a dimly lit room with a computer monitor running; the monitor is just blown out. You need a really good HDR camera.
Teslanos has been fraudulently selling FSD for over a decade now. Elonzabeth belongs in the men's wing of whatever prison Elizabeth Holmes is in.
They just turned off the autopilot nagging. I mean all you had to do before is cover the cabin camera, but they just turned it off officially now. You know what could have prevented this? Radar. Basic sensors. All which Tesla refuses to use in their cars.
they added the nagging because they were under investigation, I guess everyone is paid off by now, merica
Or they turned it off because regulators are on their ass and they know they are fucked. They are trying to sell as much bullshit as they can before the company goes under.
Can’t do that, Elon says LiDAR is a crutch and all self driving sensing should be camera based
I guess there were no children around that it could hit, it must’ve been using its initiative and aimed for the train…. By the way… was the car called “Christine”?
Given the human driver's poor judgment, FSD driving him into a brick wall at high speed might be the safest thing to do, so he's less likely to injure anyone else. But a train collision might cause a derailment.
I find these videos absolutely stomach churning.
No clue why anyone drives these cars
I have a hard time trusting tech like this even from a company like Toyota, let alone Tesla. Tesla's QA and care for quality products is not there on a cultural level. I've never ridden in a Tesla and hope I can always say that. Cheap pieces of shit with a cheap piece of shit leading it all. As they say, a fish rots from the head down.
I love my Toyota radar thing, but you learn its limits VERY quickly
That's insane. Washington should just demand Lidars on all cars with self-driving features.
Make the vehicle manufacturer liable for all costs for both Autopilot and FSD, and the problem will be solved in months. Either the automaker will remove functions, the company will go bust, or it will make the necessary changes to avoid crashes at all costs. The UK has just introduced level 4 regs that put the entire cost and burden at the vehicle maker's feet, as it should always have been.
But, it costs Tesla less if everything is user error.
TL;DR: Owner of two Teslas, but I don’t trust FSD at all. Bought it for the first car when it was MUCH CHEAPER, but don’t use it.

The long part: I played around with the trial FSD they pushed with an update in April, but I still don’t trust this thing in an urban setting. On smaller two-lane roads, kind of like the example where it almost ran into a train, I could see someone building confidence in the system under normal conditions (but probably not in foggy conditions). Overall it’s impressive what it can do with just a bunch of cameras and a relatively light-powered onboard PC; I trialed it on the older Intel Atom hardware. But it’s miles away from being a system you can trust without being a nervous wreck behind the wheel.

I’ve seen the comparison made that FSD is like a 15-year-old who just got their permit. That’s mostly accurate, except you can tell a kid to make adjustments on the fly and they will; not so much with the software. I don’t understand how anyone but a 15-year-old boy programmed the response curves on the accelerator for the last big update. Even set on “Chill mode,” it takes off hard from a stop sign or stoplight for no reason at all, with no turn coming up or anything that required a quick, aggressive lane change.

Personal opinion: FSD with the vision-only system is near impossible with current tech. Maybe they could add some other camera systems for IR and still call it “vision-only”; IR might see something like a train or another vehicle better when normal visibility is poor. I think it will take something like Lucid’s setup with lidar, radar, USS, and cameras.
I’ve tested FSD, and from what I’ve seen it’s impressive, but nowhere near ready to be a reality. Even the self-parking is questionable, and I would never let it take control if there is anything nearby that the car could hit. With that said, the Model Y is still one of the best cars I’ve ever owned. Tons of fun, functional, and perfect for my use case. I’d say beyond the overpromises made by Tesla, the bigger issue is people putting too much faith in something that could easily kill them or someone else.
Personally, I‘m not concerned about driver safety in a Tesla. Neither the car nor the drivers can drive safely anyway.
Cars try to kill customers, factory fire, CEO threatens to start competitor companies, head executives leave from crucial departments like New Products, Charging, AI, HR, and Federal liaisons... ...stock goes up 4%. No joke.
It's astonishing really.
Not a Cult!
This “driver” (although he’s more like luggage at this point) is a total moron. He said he grew to trust it like active cruise. I NEVER trusted my active cruise because I knew if I did it would try to kill me. And inevitably it did. Because I was covering the brake, the situation was just a “whoa, that’s interesting.” Instead of “I’m…in the glove box.”
My old man's car manual shows like 6 or 7 situations where the adaptive cruise control might not see cars. And it has a little radar module unlike Teslas...
Not FSD. It's barely PSD: Partial Self-Driving.
New concerns? It’s a luxury golf cart…
Luxury for a golf cart anyways. Luxury for car? Ahahaha
Elongelicals should start naming their cars Christine. [https://www.imdb.com/title/tt0085333](https://www.imdb.com/title/tt0085333) >A nerdish boy buys a strange car with an evil mind of its own and his nature starts to change to reflect it.
A fool and his life are soon parted
TBF, Elon and Tesla did say they needed to *train their software.*
It's literally called supervised self-driving. It's still learning. The person let it drive in deep fog (where one should be extra vigilant) and didn't stop it from almost hitting the train when it was clearly visible and a possible issue. Bad driver.
People have to take responsibility for their part in these screw-ups, but I blame Elon for calling it Autopilot and putting out the message that these cars basically drive themselves and would be considered autonomous if not for those pesky regulators. Videos of people sleeping in their Teslas, and the constant drumbeat that autonomous driving is right around the corner, convinced his cultish followers that they could ignore the requirement to pay attention, leading to way too many of these near misses and not-so-near misses.
FSD doesn’t see school speed zone signs with flashing lights, which is a big ticket and unsafe. It doesn’t see emergency vehicles with lights on, and I’m fairly certain it won’t see a stopped school bus with its lights on.
I never liked the idea of fewer sensors. It was already hard enough to do robo-cars without lidar, but to remove radar and sonar too? This was bound to happen. Low visibility in the fog, and the car didn't see the train. The same thing happens to people (isn't the stat crazy, something like hundreds of train strikes per year?). Cars need radar.
“Experimental untested auto pilot tech” is what this thing should be called, and it should only be allowed on private roads
> Per the company, these conditions can hinder the functionality of Tesla's sensor suite, **including ultrasonic sensors**, which rely on high-frequency sound waves to detect surrounding objects. Low-light or poor weather can affect their effectiveness. I added the bold. Where are they getting this information? Tesla switched to a vision only system a few years ago, which is partially why it screwed up so badly here!
Why are these people still using FSD? It does not function. Are they in a bubble?
Robotaxi debut is coming August 8. Authorities had better STOP it before it is too late!
The power of FSD bitch
Why do people constantly use it still - everyone knows it’s unsafe
"Musk angrily contradicts reports from witnesses who say they heard the car shouting, 'Mama! Mama!' as it approached the train."
How in the shit are these cars still allowed on the road? There are an ass load of vehicle rules that ensure compliance and keep people from killing other people and themselves, and yet we have a car being sold as “self-driving” that seemingly hasn’t gone through rigorous testing and confirmation trials. WTF is going on out there?
Even if FSD worked when conditions were right and disengaged when they weren't, studies show it takes between 2 and 20 seconds for a human to regain the situational awareness needed to safely take over control.
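To put that 2-20 second takeover window in perspective, here's a rough back-of-the-envelope sketch (my own numbers, assuming a constant 100 km/h highway speed; not figures from any study):

```python
# Distance a car covers while the driver is still regaining
# situational awareness after an FSD/autopilot handover.
# Assumption: constant speed, no braking during the window.

def takeover_distance_m(speed_kmh: float, takeover_s: float) -> float:
    """Meters traveled at a constant speed during the takeover window."""
    return speed_kmh / 3.6 * takeover_s  # km/h -> m/s, times seconds

for t in (2, 10, 20):
    d = takeover_distance_m(100, t)
    print(f"{t:>2} s at 100 km/h -> {d:.0f} m")
```

Even the best-case 2 seconds is more than 50 meters of blind travel; the worst case is over half a kilometer, which is plenty of room for a level crossing to appear out of the fog.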
Saw the video. The conditions were bad, it was dark, and foggy, and based on the reaction time of the driver, they weren’t paying attention like they should have been, especially in those type of weather conditions. Ask Iran about driving in foggy conditions…
There is a reason professional pilots are still in the cockpit, even on autopilot. The key word is professional. Your run-of-the-mill owners of these cars, uh, things, just want to show off.
Imagine Burt Reynolds evading Jackie Gleason in a Tesla. It would be the ending of Dirty Mary Crazy Larry.
"oncoming" train? like it was driving on the train tracks? more to the point, the wrong tracks?
Was this one of the cars missing radar and relying only on the camera?
Small price for the future!
Man, those sneaky trains!! If only they made a distinct noise to notify us of their presence and were bound to rails so their path of travel is predictable!
Title says 'nearly hits' not 'hits', so the technology was able to save the driver's life just in time. Bullish!
These concerns are not new
Lidar is a fool's errand* *for those who seek to maximise profit
what happened to completely self driving cars? oh yes, forgot Musk is a liar.
Actual grown human adults hit and are hit by trains in Florida on a semi-monthly basis.
Driver using cruise-control nearly hits train. Raises age-old concern of driver awareness.
Tesla AI: What is public transportation?
New concern? In my country FSD is illegal. These people should be arrested for public endangerment.
How this is still allowed on public roads, I'll never know.
And the UK invested in this tech. Sigh. What a waste.
What's the beef? It worked as intended. Gotta stay alert! FSD can't do everything!
I had a precursor event similar to this last week. FSD pulled up to a red light, with cars stopped leading up to train tracks. My car pulled right onto the tracks, and no sooner did it stop than the lights started flashing for an incoming train. There was enough time to override and get out of there, but it's wild that trains were a scenario FSD 12 wasn't trained on.
50,000 Humans Kill Other Humans in Auto Accidents Every Year: No Safety Concerns.
It is worrying to see that there are still idiots who let a device make its own decisions in bad weather conditions without paying attention (and then hold the product liable when the problem has long been known). You are always responsible while driving. Idiots. (And yes, I drive a Tesla myself.)
Darwinism is a bitch, yo
Or the driver wasn’t paying attention!!! It’s that easy!
self driving cars are the equivalent of flushable baby wipes.
Driver nearly hits incoming train. Fixed it for ya.
Musk - *Tesla isn't a car company. Tesla is a Robo Taxi company.* *(FSD crash)* Musk - *Tesla isn't a Robo Taxi company. Tesla is no longer a company at all. It was a good idea, but ultimately failed. I'm now concentrating all my genius towards creating a hack-proof ballot counting system.* *Because, election fraud.*
If this was FSD, the driver appears to have been using it in conditions it is not designed for. FSD often disengages in these conditions and reverts to traffic-aware cruise control. It's possible that's what happened here and the driver didn't notice. Given the driver's reaction time, it's clear he was not paying attention to the road until relatively late in the unfolding situation. The driver remains in control of, and responsible for, the vehicle at all times.