Tesla Master Plan: Part Deux

  • Thread starter CodeRedR51
  • 1,666 comments
  • 147,194 views
Neither do Teslas. That functionality is not legal in the UK.
God, really? What a drag. Forgive my ignorance on that. In that case, and given your charge ports over there, the EV6 makes perfect sense. The Kia dealer experience and the lack of FSD are massive negatives here, stateside.
 
I actually don't understand how anyone can have faith in Tesla's FSD tech. It seems fairly obvious that the company (mostly Elon) has been massively overpromising on its capabilities for a decade, and I think it's fairly likely he's done a lot of lying about how safe it actually is.
 
VXR
I see a lot of twerps on Facebook who are staunchly anti-EV for political reasons, but they're the ones missing out. These things are brilliant.
People who are anti-Tesla have many reasons, and 99% of them are tied to Musk's personality issues. I could condone Musk's arrogance and narcissism while he helped the planet course-correct, but when he attached his fate and the fate of Tesla to a band of right-wing lunatics who understand zip about humanity, economy or decency, I realized that neither Tesla nor Musk deserves our support any longer. Whatever he thinks he is doing, he is hurting millions in the process and setting the worst possible example of how to be a leader. Shameful.
 
I'm not going to get into politics or Elon's cult of personality in this thread. We have a thread for that.
I am glad they are not legal.

In all honesty, if ships driving in a straight line have no self-preserving functions, how can you even believe it will work in a car?
Uhh... because I use it every time I get behind the wheel, and its self-preserving function has intervened at least once for both my dad and me before we could have reacted. To be clear, either of us would have intervened so as not to have an accident, though we didn't have to. You might as well say, "In all honesty, if people in Miami driving in a straight line have no self-preserving functions, how can you even believe driving among them will work?" I know it's hard to imagine, because I thought it was gimmicky at first too. But now it's so good that I wouldn't want a commuter/errand car without it. We've done thick urban Miami to Palm Beach several times with zero interventions.

I actually don't understand how anyone can have faith in Tesla's FSD tech. It seems fairly obvious that the company (mostly Elon) has been massively overpromising on its capabilities for a decade, and I think it's fairly likely he's done a lot of lying about how safe it actually is.
A machine paying attention for a human is better than nobody paying attention. At some point you're going to have to take your head out of the sand.
 
I'm not going to get into politics or Elon's cult of personality in this thread. We have a thread for that.

Uhh... because I use it every time I get behind the wheel, and its self-preserving function has intervened at least once for both my dad and me before we could have reacted. To be clear, either of us would have intervened so as not to have an accident, though we didn't have to. You might as well say, "In all honesty, if people in Miami driving in a straight line have no self-preserving functions, how can you even believe driving among them will work?" I know it's hard to imagine, because I thought it was gimmicky at first too. But now it's so good that I wouldn't want a commuter/errand car without it. We've done thick urban Miami to Palm Beach several times with zero interventions.


A machine paying attention for a human is better than nobody paying attention. At some point you're going to have to take your head out of the sand.
There is a fundamental compromise with Tesla's system in that it's vision-only. It may work well in most situations, but I don't think it will ever be as safe as a system with radar and/or lidar redundancy. I've heard* that the plan was to use a combined radar/vision system, but the radar development wasn't keeping up with the vision system and was delaying the rollout of FSD, so Elon just fired the entire team and went vision-only.

*Not gonna get into specifics, but I believe it.
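
For what it's worth, here's a rough sketch of what that redundancy buys you. It's purely illustrative (made-up thresholds and a toy stopping-distance formula, not anyone's real AEB code), but the point is that with two independent sensing paths, the braking decision can still fire when one of them is blinded by spray, fog or glare:

```python
# Purely illustrative sketch of redundant emergency braking: made-up
# thresholds and a toy stopping-distance formula, not anyone's real AEB
# logic. The point is that with two independent sensing paths, the
# decision can fire when either modality sees an obstacle, so losing the
# camera to spray or fog doesn't lose the whole safety net.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    distance_m: Optional[float]  # None = nothing detected / sensor blinded
    confidence: float            # 0..1

def should_brake(camera: Detection, radar: Detection,
                 speed_mps: float, decel_mps2: float = 6.0) -> bool:
    """Brake if any sufficiently confident sensor reports an obstacle
    closer than our stopping distance plus a small margin."""
    stopping_dist_m = speed_mps ** 2 / (2 * decel_mps2) + 2.0
    for det in (camera, radar):
        if det.distance_m is not None and det.confidence > 0.5:
            if det.distance_m < stopping_dist_m:
                return True
    return False

# Camera blinded by spray, radar still sees the obstacle: brakes.
print(should_brake(Detection(None, 0.0), Detection(18.0, 0.9), speed_mps=15.0))  # True
# A vision-only system has nothing to act on in the same situation.
print(should_brake(Detection(None, 0.0), Detection(None, 0.0), speed_mps=15.0))  # False
```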
 
There is a fundamental compromise with Tesla's system in that it's vision-only. It may work well in most situations, but I don't think it will ever be as safe as a system with radar and/or lidar redundancy. I've heard* that the plan was to use a combined radar/vision system, but the radar development wasn't keeping up with the vision system and was delaying the rollout of FSD, so Elon just fired the entire team and went vision-only.

*Not gonna get into specifics, but I believe it.
I watched a vid on social media just the other day of someone testing this out. They did five tests with two cars, one being a Tesla and the other one with lidar (I can't recall which, and can't now find the vid to check). The tests were basically a cardboard cutout kid that suddenly darts out into the road in front of the cars, to see whether Tesla's camera-only sensors were up to the job under different conditions. The lidar-equipped system passed all the tests.

The 'headline' test was the final one, where they'd done a Wile E. Coyote/Road Runner-style painting (actually a photograph) of the road ahead on a solid surface - a large polystyrene wall in this case rather than a rock face. The Tesla sailed straight through the wall without even attempting to stop. Obviously not a very real-world test, and one that, given the right lighting conditions, some human-driven cars would possibly fail too. The troubling issue was the two previous tests that the camera-only system also failed. One was a 'blanket' of heavy rain/burst-hydrant spray concealing the mock child, and the other was a 'blanket' of smoke/mist. Both times the Tesla failed to react, or didn't react until too late.

I think this is the video:

 
I watched a vid on social media just the other day of someone testing this out. They did five tests with two cars, one being a Tesla and the other one with lidar (I can't recall which, and can't now find the vid to check). The tests were basically a cardboard cutout kid that suddenly darts out into the road in front of the cars, to see whether Tesla's camera-only sensors were up to the job under different conditions. The lidar-equipped system passed all the tests.

The 'headline' test was the final one, where they'd done a Wile E. Coyote/Road Runner-style painting (actually a photograph) of the road ahead on a solid surface - a large polystyrene wall in this case rather than a rock face. The Tesla sailed straight through the wall without even attempting to stop. Obviously not a very real-world test, and one that, given the right lighting conditions, some human-driven cars would possibly fail too. The troubling issue was the two previous tests that the camera-only system also failed. One was a 'blanket' of heavy rain/burst-hydrant spray concealing the mock child, and the other was a 'blanket' of smoke/mist. Both times the Tesla failed to react, or didn't react until too late.
Who knew that trompe-l'œil could take down the world's most sophisticated car manufacturer? :lol:
 
I watched a vid on social media just the other day of someone testing this out. They did five tests with two cars, one being a Tesla and the other one with lidar (I can't recall which, and can't now find the vid to check). The tests were basically a cardboard cutout kid that suddenly darts out into the road in front of the cars, to see whether Tesla's camera-only sensors were up to the job under different conditions. The lidar-equipped system passed all the tests.

The 'headline' test was the final one, where they'd done a Wile E. Coyote/Road Runner-style painting (actually a photograph) of the road ahead on a solid surface - a large polystyrene wall in this case rather than a rock face. The Tesla sailed straight through the wall without even attempting to stop. Obviously not a very real-world test, and one that, given the right lighting conditions, some human-driven cars would possibly fail too. The troubling issue was the two previous tests that the camera-only system also failed. One was a 'blanket' of heavy rain/burst-hydrant spray concealing the mock child, and the other was a 'blanket' of smoke/mist. Both times the Tesla failed to react, or didn't react until too late.

I think this is the video:


And the real-life, non-produced repeat of the experiment with up-to-date software:



This guy has balls for real though lol
 
And the real-life, non-produced repeat of the experiment with up-to-date software:
There are plenty of clear differences in the set-up of the stunt that must surely create different outcomes... one example, aside from the obvious differences in lighting, size, quality and consistency of the image between the two walls, is this...

View attachment 1438722


The parallax effect going on between the image of the lamp post and the top of the real one must surely be a giveaway for the system, as it must be looking for things moving at differing speeds across its field of view.

That's just one example; I'm sure people familiar with the system could argue it either way.

I don't think it changes either of the points being made, though. The fact that the guy failed to 'fool' the car in the second experiment doesn't mean it can't be done; it just means he couldn't do it, where the first guy could. Who knows, with the new software perhaps it would be even harder to 'fool' the system. But the other point still stands: even when the experiments aren't intentionally designed to 'fool' the system and just present some difficult but more likely real-world scenarios, it is still deficient compared to a system that uses an entire other 'sense' alongside its vision.
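
For anyone curious what that parallax giveaway would actually look like in numbers, here's a toy sketch using basic pinhole-camera maths and made-up figures (nothing to do with Tesla's actual perception code). Every pixel of the painted wall sits at the same depth, so its 'lamp post' flows across the image exactly like the rest of the wall, while the real lamp post, being further away, flows far more slowly:

```python
# Toy sketch of the parallax argument. Made-up numbers and a bare pinhole
# model; this is not anything from Tesla's actual stack. For a camera
# moving straight ahead, the sideways image motion of a static point
# scales with its lateral offset and with 1/depth^2, so a painted lamp
# post (part of the wall) and a real lamp post behind it move at very
# different speeds across the frame.

def image_velocity_px_per_s(lateral_offset_m: float, depth_m: float,
                            car_speed_mps: float,
                            focal_px: float = 1000.0) -> float:
    """Horizontal image velocity of a static point seen by a forward-moving
    pinhole camera: u = f * X * Vz / Z^2."""
    return focal_px * lateral_offset_m * car_speed_mps / depth_m ** 2

speed = 20.0  # m/s, a guess at the approach speed in the videos

# A patch of the painted wall 10 m away, 2 m off-centre (the painted post).
print(image_velocity_px_per_s(2.0, 10.0, speed))  # ~400 px/s, same as the whole wall
# The real lamp post at the same lateral offset but 30 m away.
print(image_velocity_px_per_s(2.0, 30.0, speed))  # ~44 px/s, clearly different
```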
 
I'm not going to get into politics or Elon's cult of personality in this thread. We have a thread for that.

Uhh... because I use it every time I get behind the wheel, and its self-preserving function has intervened at least once for both my dad and me before we could have reacted. To be clear, either of us would have intervened so as not to have an accident, though we didn't have to. You might as well say, "In all honesty, if people in Miami driving in a straight line have no self-preserving functions, how can you even believe driving among them will work?" I know it's hard to imagine, because I thought it was gimmicky at first too. But now it's so good that I wouldn't want a commuter/errand car without it. We've done thick urban Miami to Palm Beach several times with zero interventions.


A machine paying attention for a human is better than nobody paying attention. At some point you're going to have to take your head out of the sand.
It's obvious to me that you, sir, have your head in the sand. :lol:

I specifically said "Teslas" for a reason.
 
There are plenty of clear differences in the set-up of the stunt that must surely create different outcomes... one example, aside from the obvious differences in lighting, size, quality and consistency of the image between the two walls, is this...

View attachment 1438722

The parallax effect going on between the image of the lamp post and the top of the real one must surely be a giveaway for the system, as it must be looking for things moving at differing speeds across its field of view.

That's just one example; I'm sure people familiar with the system could argue it either way.

I don't think it changes either of the points being made, though. The fact that the guy failed to 'fool' the car in the second experiment doesn't mean it can't be done; it just means he couldn't do it, where the first guy could. Who knows, with the new software perhaps it would be even harder to 'fool' the system. But the other point still stands: even when the experiments aren't intentionally designed to 'fool' the system and just present some difficult but more likely real-world scenarios, it is still deficient compared to a system that uses an entire other 'sense' alongside its vision.
The most obvious thing to me is that the horizon line is too high.
 
The most obvious thing to me is that the horizon line is too high.
Yeah, though that might be more dependent on the relative height of the car's cameras and the cameras they're filming it with. It might be better for the car even if it looks worse for the viewer.

The sun, too, is bouncing basically right back at the car off the picture in the second video... in the first it's far more dependent on ambient light.

It's just not comparison-worthy. If the software makes the difference, they need to repeat the exact same test at the same time.
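
Quick back-of-the-envelope on the camera-height point, with made-up numbers (the heights and distances below are guesses, not measured from either video): for a level camera the real horizon projects to the same image row no matter how high the camera sits, but the painted horizon is just a stripe at a fixed height on the wall, so where it appears depends on the camera's height and distance from the wall.

```python
# Rough geometry sketch with illustrative numbers only. For a level pinhole
# camera the real horizon projects to the image centre row regardless of
# mounting height; the painted horizon is a stripe at a fixed height on
# the wall, so its apparent elevation depends on camera height and the
# distance to the wall. The heights and distances below are guesses.

import math

def painted_horizon_offset_deg(paint_height_m: float, camera_height_m: float,
                               distance_m: float) -> float:
    """Angle (degrees) of the painted horizon above (+) or below (-) the
    true horizon, as seen from a level camera."""
    return math.degrees(math.atan2(paint_height_m - camera_height_m, distance_m))

wall_distance = 30.0   # metres from camera to wall (assumed)
painted_horizon = 1.8  # metres up the wall where the horizon was painted (assumed)

# A chase camera mounted around 1.8 m sees the painted horizon sitting on the real one.
print(painted_horizon_offset_deg(painted_horizon, 1.8, wall_distance))  # ~0.0 deg
# A windscreen-height camera around 1.4 m sees it noticeably high.
print(painted_horizon_offset_deg(painted_horizon, 1.4, wall_distance))  # ~0.8 deg above
```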
 
There are plenty of clear differences in the set-up of the stunt that must surely create different outcomes... one example, aside from the obvious differences in lighting, size, quality and consistency of the image between the two walls, is this...

View attachment 1438722

The parallax effect going on between the image of the lamp post and the top of the real one must surely be a giveaway for the system, as it must be looking for things moving at differing speeds across its field of view.

That's just one example; I'm sure people familiar with the system could argue it either way.

I don't think it changes either of the points being made, though. The fact that the guy failed to 'fool' the car in the second experiment doesn't mean it can't be done; it just means he couldn't do it, where the first guy could. Who knows, with the new software perhaps it would be even harder to 'fool' the system. But the other point still stands: even when the experiments aren't intentionally designed to 'fool' the system and just present some difficult but more likely real-world scenarios, it is still deficient compared to a system that uses an entire other 'sense' alongside its vision.
I'm just saying, that's the same experiment redone. It's interesting that the new hardware and v13 software could detect the wall, and showed on the screen that it was a wall. I think the "painted" scene was too blue compared to the overcast background in the Cybertruck test, though. That made it a lot easier than the v12 Model Y test, which was almost dead-on from what we can see through his lens.
 
And the real-life, non-produced repeat of the experiment with up-to-date software:



This guy has balls for real though lol


The experiments aren't comparable.

The lidar system isn't a "full" self-driving system. It's a safety/intervention system intended to identify potential dangers and apply the brakes to prevent an impact. That is why, at the beginning of Rober's video, he makes a point of stating he's using Tesla Autopilot and not FSD: Autopilot is intended to do the same thing, and the intention is to test camera vs lidar.
 