Tesla Master Plan: Part Deux

Right, but flying takes hundreds of hours of training. You have to fly for thousands of hours to be able to fly commercially. There are so many different regulations in place to protect everyone involved. And yet, fatal accidents still happen, even with autopilot on. The FAA is trying to revise training to reduce reliance on autopilot.


(The last bit here, specifically)


I think this in-between level of automation, where people are still in the loop, is incredibly dangerous and should not be available for the general public to use.
It's all quite a bit more nuanced than this.

In the US, the soonest a pilot can earn their private pilot license (for funsies license) is 16 but training can begin before that. There are more people than you think who earn private licenses before they can drive or before they graduate high school. A commercial license (get your first job to get paid) can be had at 21 and there are people lined up out the door to pull this off. An ATP (airline pilot) license can be had at 24 and again there are people who achieve this milestone all the time.

The accident and incident rate for private pilots and general aviation aircraft is vastly lower, hilariously lower, than for cars. It's even lower than for commercial vehicles like semi trucks. The average time to earn a private certificate is about 50 hours of flight training and about 50 hours of classroom training, and like I mentioned earlier it doesn't merely teach people things, it instills a mindset and influences character changes. It teaches responsibility and makes the pilot into a better person, not just a pilot. They have to, otherwise they're going to kill themselves.

This aspect of training simply doesn't exist for cars. There are asshole kids lined up out the door to rip their first burnout in dad's Corvette or crash it into a parked car.

None of this training aspect has anything to do with automation. None of the planes I trained in years ago had any level of automation; in fact they were considerably more manual than the most basic cars sold on the market today. You couldn't even start one unless you knew how fuel mixture worked, and that's more manual than a lawnmower. To be even the most basic level of pilot you have to understand your machine, how it works, its capabilities, etc. Driver training does not teach that either.

Automation is being introduced earlier and earlier into pilot training these days but it's not some epidemic like that article suggests. Human factors has been at the forefront of aviation education across the board since the 90s and has trickled all the way down to private pilot training. By the time you're in airline training human factors becomes a primary aspect specifically because of the human-machine interface and levels of automation. The jets I've flown so far have pretty dumb automation and we pilots were constantly engaged with the plane. I'm familiar with some planes these days that are conspicuously automated, and in my latest round of training some issues with this have been highlighted because the risk is known and thus taught.

With respect to airlines and Boeing's most recent crisis, the issue isn't necessarily the training itself, it's the FAA. There was a failure in the system where the FAA delegated certification of systems and training to the OEMs, and the OEMs abused that responsibility. In the past Airbus has also been on the hot seat for blurry marketing and human factors mistakes. Just a couple days ago we had a discussion in class about this, and while the instructor mentioned this new company's emphasis on hand flying, myself and a classmate who flew for the same company were already used to that and thought it was normal. Apparently it's not universal - my new company has had a rash of interview failures because one airline in particular has a policy of relying on automation rather than promoting hand flying. And of course that falls right back on the FAA, because the FAA are the ones who approved that company's training procedures and SOP.

Basically what that article is saying is that, without admitting fault, the FAA is trying to cover its own ass for the massive mistakes it's been making by allowing companies like Boeing to make up the rules. There is not actually an automation crisis of any sort going on. And the autonomous discussion is sort of a joke in the aviation community - no pilot will accept it, no passenger will accept it. OEMs obviously push this crap to try and save a buck but it's nonsense.

Meanwhile, we're not even teaching people to drive cars properly on a sunny day, much less how to deal with automation mishaps. Like I said, it takes about 100 hours of total instruction to become a private pilot at 16 years old. There is no excuse for the failure of organizations like the SAE, AAA, NHTSA, DOT, etc. to set meaningful driver training standards.
 
I think it's reckless that Tesla is using completely untrained drivers to beta test their incomplete software on public roads, where no bystander has agreed to the terms of being beta tested against.

Absolutely fair.

This whole desire to create autonomous vehicles seems pointless. If you want driving to be safer, you need stricter driving tests and better driver training. If you want better transit, you need to develop better public transit, like trains and subways.

Not really fair.

Currently with Level 2 or 3 autonomy, the driver needs to be attentive at all times, but when the system works "good enough" people will start to trust it too much and stop paying enough attention to take over when necessary - which has already happened multiple times.

This argument is a rehash of a very, very wrong argument that gets made constantly: seat belts, knee pads, helmets, ABS, crumple zones, airbags.

[attached comic]


Everyone has a different threshold for risk, and they operate at that threshold fairly consistently. You argue that human beings are not good at staying alert in order to intervene when conditions call for it. It's easy enough to see this in action at every traffic light ever. Watch the first car waiting for the green light; the driver will often not be paying attention. Why? Because they know they'll make it through the intersection. Worst case scenario, they'll get honked at. But regardless, they're guaranteed to get through. It's true that people are not great at remaining vigilant; the mistake is to think that this is not already a problem on the roads.

Driving safety is exactly this - remaining vigilant even when it appears that everything is OK. Not just driving safety, all safety. It's something people are not good at, and fail at right this second on the road and in other aspects of safety. Automated driving does not fundamentally change the fact that a lack of vigilance leads to car crashes.

Given every new advancement in safety, some people will let their guard down and adjust to take a similar risk posture. That does not mean that safety efforts don't work. The comic above pretends that people need to learn without safety, but this is wrong-headed in many respects. First, never underestimate someone's ability to repeatedly do things that cause them harm. Second, people just adjust to a similar level of perceived risk regardless. Generally, the safety has worked better than the adjustment.





I think this in-between level of automation, where people are still in the loop, is incredibly dangerous and should not be available for the general public to use.

You mean like Cruise Control? Automatic transmissions? ABS? Stability Control?

What are you basing this opinion on? That crashes occur? That doesn't seem like enough information. Automation is excellent for people who adopt a higher risk profile than most - because automation limits their ability to put others at risk. The difference between @Keef's example of aviation and automotive transportation is that aviation is rarely forced on someone who has no interest. All of the hours of driver training in the world may not change the mind of someone who has no inclination to be a good driver. I'd imagine that the number of people who become pilots who aren't interested in piloting is vanishingly small. Meanwhile we have a huge number of drivers who aren't interested in driving - and we need automation as quickly as possible for those people. In fact, much of the automation we already have is primarily for those people.

Let's not assume that inattentive or risky drivers make good drivers when they're not the backup to an automated vehicle.
 
Worst case scenario, they'll get honked at.
The worst case scenario should be that they know they've made a mistake, however minor it may be, and they dwell on it for days and even discuss it with fellow drivers in an attempt to learn from and resolve the mistake and make sure it never happens again as they seek perfection as a driver.

Excellence should be the standard expectation. Anything less should cause a self-induced feeling of shame and inadequacy. Mistakes should not necessarily be punished, however. Instead, very unlike the rules of the road today but very similar to the rules of aviation, mistakes should be nurtured and trained through, because as previously stated all drivers should have a passion for excellence instilled in them through training and camaraderie.
All of the hours of driver training in the world may not change the mind of someone who has no inclination to be a good driver. I'd imagine that the number of people who become pilots who aren't interested in piloting is vanishingly small. Meanwhile we have a huge number of drivers who aren't interested in driving - and we need automation as quickly as possible for those people. In fact, much of the automation we already have is primarily for those people.
I'm actually astonished that I've never thought of this before. And I'm glad you mentioned it because aviation has a really good track record of weeding out people who aren't dedicated.

I'm aware of a few people throughout my career who attempted aviation and simply didn't make the cut. For some it wasn't for a lack of trying; they just weren't mentally built for it and have been successful in other endeavors. But a couple of those people truly didn't care. They just thought it was neat and had no intention of getting gud. Neither of those people earned their private certificate - they were prevented from continuing because their instructors sniffed out that they were not dedicated and therefore a safety and security risk. Sometimes that lazy mindset doesn't take hold until well into a pilot's career, but mechanisms exist to deal with it at all levels, especially the airline level. Those people become stories told during training.

Bottom line is that people who aren't interested in doing something well should either be nurtured until they believe in the cause or should not be allowed to participate. Simple as that. It's unfortunate that our ancestors designed a car-centric system with a lack of transit options that punishes people who can't drive, but the more I fly the less I'm willing to take the risk of dealing with people who aren't effective and responsible. The training and vetting process simply isn't good enough. And when nurturing options have been exhausted, the punishments for being a repeat offender are not even remotely severe enough.

This is all my opinion. Another opinion of mine is that if a person really can't be bothered to learn and improve then I've got no problem telling them to their face that they're not good enough and they need to sit down and get out of my way. At some point, shaming people into submission becomes an effective training method.
Let's not assume that inattentive or risky drivers make good drivers when they're not the backup to an automated vehicle.
I can't tell if you're talking about lazy drivers or Mesa pilots haha. But hey, if all you know how to do is not stall the plane then you've accomplished more than 99% of these sickly landlubbers.
 
The worst case scenario should be that they know they've made a mistake, however minor it may be, and they dwell on it for days and even discuss it with fellow drivers in an attempt to learn from and resolve the mistake and make sure it never happens again as they seek perfection as a driver.

Excellence should be the standard expectation. Anything less should cause a self-induced feeling of shame and inadequacy. Mistakes should not necessarily be punished, however. Instead, very unlike the rules of the road today but very similar to the rules of aviation, mistakes should be nurtured and trained through, because as previously stated all drivers should have a passion for excellence instilled in them through training and camaraderie.

Yea that's not going to work for a lot of people. It presumes a fair amount about personality and priorities.


I'm actually astonished that I've never thought of this before. And I'm glad you mentioned it because aviation has a really good track record of weeding out people who aren't dedicated.

I'm aware of a few people throughout my career who attempted aviation and simply didn't make the cut. For some it wasn't for a lack of trying; they just weren't mentally built for it and have been successful in other endeavors. But a couple of those people truly didn't care. They just thought it was neat and had no intention of getting gud. Neither of those people earned their private certificate - they were prevented from continuing because their instructors sniffed out that they were not dedicated and therefore a safety and security risk. Sometimes that lazy mindset doesn't take hold until well into a pilot's career, but mechanisms exist to deal with it at all levels, especially the airline level. Those people become stories told during training.

Bottom line is that people who aren't interested in doing something well should either be nurtured until they believe in the cause or should not be allowed to participate. Simple as that. It's unfortunate that our ancestors designed a car-centric system with a lack of transit options that punishes people who can't drive, but the more I fly the less I'm willing to take the risk of dealing with people who aren't effective and responsible. The training and vetting process simply isn't good enough. And when nurturing options have been exhausted, the punishments for being a repeat offender are not even remotely severe enough.

This is all my opinion. Another opinion of mine is that if a person really can't be bothered to learn and improve then I've got no problem telling them to their face that they're not good enough and they need to sit down and get out of my way. At some point, shaming people into submission becomes an effective training method.

There's not a great way to test this from a legal perspective. And regardless of how you feel about our infrastructure, the reality is that it is built on cars as a means of transport, so operating without one, while possible, is a great sacrifice in many cases and should require a lot of evidence for the need. I'd wager a bunch of people end up dead before the evidence is clear. The amount of time and resources it would take to truly evaluate all of the drivers in the nation would be staggering.

Automated driving by extension could usher in a relative utopia for driving enthusiasts. Imagine getting all of the uninterested, non-dedicated people into something that drives for them perfectly. And if that's an option, cracking down on driving ability, training, diligence, and commitment, becomes easier because a trivial alternative exists. Taking away a license from a drunk driver permanently would be way more feasible.
 
I'm all but convinced that Tesla has an oversupply problem that they are trying to keep quiet. The center near me has become so overfilled that they have started filling various other parking lots in the neighborhood. Is their delivery process really so slow that they are filling up multiple parking lots with sold cars? I suspect there is some degree of creative accounting going on.
 
I'm all but convinced that Tesla has an oversupply problem that they are trying to keep quiet. The center near me has become so overfilled that they have started filling various other parking lots in the neighborhood. Is their delivery process really so slow that they are filling up multiple parking lots with sold cars? I suspect there is some degree of creative accounting going on.
Elon managed to piss off the demographic that typically buys electric vehicles by sucking up to the demographic that typically buys clapped-out pick-up trucks. He's sinking his own company because he doesn't know when to shut up.
 
Elon managed to piss off the demographic that typically buys electric vehicles by sucking up to the demographic that typically buys clapped-out pick-up trucks. He's sinking his own company because he doesn't know when to shut up.
You mean the ones he thinks will buy the Cybertruck?
 
Automated driving by extension could usher in a relative utopia for driving enthusiasts. Imagine getting all of the uninterested, non-dedicated people into something that drives for them perfectly. And if that's an option, cracking down on driving ability, training, diligence, and commitment, becomes easier because a trivial alternative exists. Taking away a license from a drunk driver permanently would be way more feasible.
As someone who enjoys driving, I see tremendous value in a self-driving car that is effectively door-to-door capable in any reasonable scenario. But what we're talking about for now, in the short to medium term, and likely the long term, is a system that isn't perfect, nor close enough to perfect that it would be approved for the kind of use that would justify the risk, or perhaps the cost. And of course, at the point the system is better than a dedicated, attentive driver, who's the bigger nuisance on the road?

Personally I think the approach to autonomous driving is wrong. It needs to start with an agreed standard framework of physical and digital infrastructure, and a minimum standard of sensory technology. Car models that don't meet the standards shouldn't be allowed, and autonomy should be geo-fenced to areas where the physical infrastructure meets the standard. I also think that standardised car-to-car and car-to-road communication would be a tremendous benefit. It needs to be thought of as a new type of national infrastructure first - not a problem that needs ever more complex bandaids applied to an algorithm. Make the roads able to be 'understood' by a standard sensor set, mandate a set of sensors to look for things that might contradict that information, and make cars communicate that information to each other.
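To make that concrete, here's a very rough sketch of what I mean by a standard message set plus geofencing - every name, field and number below is invented purely for illustration, not any real V2X standard or product:

```python
# Hypothetical sketch only: a standardized car-to-car / car-to-road status
# message, and a geofence check for where autonomy is allowed. All field
# names, zones and numbers are made up for illustration.
from dataclasses import dataclass

@dataclass
class V2XStatusMessage:
    vehicle_id: str      # anonymized token broadcast by the car, not a plate
    lat: float           # position (WGS84)
    lon: float
    speed_mps: float     # current speed in metres per second
    heading_deg: float   # 0-360, clockwise from north
    hazard_flags: int    # bit field: hard braking, stopped vehicle, debris...

# Zones where the physical infrastructure is certified for autonomy,
# each as (min_lat, min_lon, max_lat, max_lon).
CERTIFIED_ZONES = [
    (37.35, -122.15, 37.45, -122.05),
]

def autonomy_permitted(lat: float, lon: float) -> bool:
    """Only allow the autonomous mode inside a certified, geo-fenced zone."""
    return any(lo_lat <= lat <= hi_lat and lo_lon <= lon <= hi_lon
               for lo_lat, lo_lon, hi_lat, hi_lon in CERTIFIED_ZONES)
```

The point being that every car broadcasts the same minimal message and reads the road the same way, rather than each OEM inventing its own.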

I'm certainly no expert on the technology, but the question that keeps coming up in my head is: how does the tech know what it doesn't know? How does it fail safely when it might not realise it's failed, and whilst its failsafe is the driver, how is it going to be better than what we have now? Where does the majority of the intelligence in AI fall - in understanding or in probability?

As a frequent Geoguessr player, the idea of a car that could drive me from Fairbanks, Alaska to Punta Arenas, Chile (Darien Gap aside), or from Lisbon to Vladivostok, or from Cape Town to the Marina Bay Sands Hotel, Singapore, AND visit one bar in each of the provincial capitals en route, would be absolute nirvana. Ideally this would be a capability of every new car on the market (terrain and conditions permitting), and therefore of the majority of cars on the market within the following decade or two. Right now, Teslas are iffy about slotting themselves into a parking space - the autonomous future we all want is something I'd bet no registered forum member will see in their lifetime.
 
It's all quite a bit more nuanced than this.

In the US, the soonest a pilot can earn their private pilot license (for funsies license) is 16 but training can begin before that. There are more people than you think who earn private licenses before they can drive or before they graduate high school. A commercial license (get your first job to get paid) can be had at 21 and there are people lined up out the door to pull this off. An ATP (airline pilot) license can be had at 24 and again there are people who achieve this milestone all the time.

The accident and incident rate for private pilots and general aviation aircraft is vastly lower, hilariously lower, than for cars. It's even lower than for commercial vehicles like semi trucks. The average time to earn a private certificate is about 50 hours of flight training and about 50 hours of classroom training, and like I mentioned earlier it doesn't merely teach people things, it instills a mindset and influences character changes. It teaches responsibility and makes the pilot into a better person, not just a pilot. They have to, otherwise they're going to kill themselves.

This aspect of training simply doesn't exist for cars. There are asshole kids lined up out the door to rip their first burnout in dad's Corvette or crash it into a parked car.

None of this training aspect has anything to do with automation. None of the planes I trained in years ago had any level of automation; in fact they were considerably more manual than the most basic cars sold on the market today. You couldn't even start one unless you knew how fuel mixture worked, and that's more manual than a lawnmower. To be even the most basic level of pilot you have to understand your machine, how it works, its capabilities, etc. Driver training does not teach that either.

Automation is being introduced earlier and earlier into pilot training these days but it's not some epidemic like that article suggests. Human factors has been at the forefront of aviation education across the board since the 90s and has trickled all the way down to private pilot training. By the time you're in airline training human factors becomes a primary aspect specifically because of the human-machine interface and levels of automation. The jets I've flown so far have pretty dumb automation and we pilots were constantly engaged with the plane. I'm familiar with some planes these days that are conspicuously automated, and in my latest round of training some issues with this have been highlighted because the risk is known and thus taught.

With respect to airlines and Boeing's most recent crisis, the issue isn't necessarily the training itself, it's the FAA. There was a failure in the system where the FAA delegated certification of systems and training to the OEMs, and the OEMs abused that responsibility. In the past Airbus has also been on the hot seat for blurry marketing and human factors mistakes. Just a couple days ago we had a discussion in class about this, and while the instructor mentioned this new company's emphasis on hand flying, myself and a classmate who flew for the same company were already used to that and thought it was normal. Apparently it's not universal - my new company has had a rash of interview failures because one airline in particular has a policy of relying on automation rather than promoting hand flying. And of course that falls right back on the FAA, because the FAA are the ones who approved that company's training procedures and SOP.

Basically what that article is saying is that, without admitting fault, the FAA is trying to cover its own ass for the massive mistakes it's been making by allowing companies like Boeing to make up the rules. There is not actually an automation crisis of any sort going on. And the autonomous discussion is sort of a joke in the aviation community - no pilot will accept it, no passenger will accept it. OEMs obviously push this crap to try and save a buck but it's nonsense.

Meanwhile, we're not even teaching people to drive cars properly on a sunny day, much less how to deal with automation mishaps. Like I said, it takes about 100 hours of total instruction to become a private pilot at 16 years old. There is no excuse for the failure of organizations like the SAE, AAA, NHTSA, DOT, etc. to set meaningful driver training standards.
Maybe what I meant before wasn't clear, but unless I'm misunderstanding, I think we're in agreement: there should be more regulation from bodies like the FAA, NHTSA, DOT, etc., there should be proper training in the use of automation systems, and we shouldn't be leaving it to the OEMs, such as Tesla, to do the training (or lack thereof).

Lately, I've been watching a lot of videos by Mentour Pilot where he discusses accidents in detail, and from the handful of videos I've seen, many of these accidents happen when an edge case occurs and the pilot is unable to react properly due to the high mental stress of the sudden situation.

To be able to use these automation systems, I believe you need proper training to understand the system, the situations where it can fail, and what is needed to safely recover. However, as the system improves, less intervention is needed and the driver/pilot may become less attentive, making it harder to recover safely, especially in a high-stress, time-sensitive situation.


Driving safety is exactly this - remaining vigilant even when it appears that everything is OK. Not just driving safety, all safety. It's something people are not good at, and fail at right this second on the road and in other aspects of safety. Automated driving does not fundamentally change the fact that a lack of vigilance leads to car crashes.
I agree with this completely, but I would argue that due to a lack of understanding, lack of training, etc., automation makes it much easier for drivers to be less vigilant.

When airbags, seatbelts, helmets, etc. were introduced, I assume no one advertised these amazing new inventions as meaning you no longer have to pay attention or that you can take more risks. These safety advancements were made to keep you safer in case of an accident. The person is still in control of the situation at all times. The problem with automation is that you are slowly taking the person out of the equation, which is the dangerous part.

What I'm trying to argue is that if we are going the automation route, there should be proper training to be able to use these systems, similar in rigor to those in aviation. Driving a car is a dangerous activity and if we are able to completely automate it, then I am all for it, it would save many lives. However, at the current level of automation, it is dangerous since drivers are still in control while not having to actively participate.


What are you basing this opinion on? That crashes occur? That doesn't seem like enough information. Automation is excellent for people who adopt a higher risk profile than most - because automation limits their ability to put others at risk. The difference between @Keef's example of aviation and automotive transportation is that aviation is rarely forced on someone who has no interest. All of the hours of driver training in the world may not change the mind of someone who has no inclination to be a good driver. I'd imagine that the number of people who become pilots who aren't interested in piloting is vanishingly small. Meanwhile we have a huge number of drivers who aren't interested in driving - and we need automation as quickly as possible for those people. In fact, much of the automation we already have is primarily for those people.

Let's not assume that inattentive or risky drivers make good drivers when they're not the backup to an automated vehicle.
I agree that automation is good, especially for those who do not want to drive.

Although I have no hard facts to back this up, I would argue that it may be better for these inattentive drivers to be fully in control at all times while the current level of automation has a lot of edge cases where the driver needs to intervene. If they are fully in control, that limits how inattentive they can be. This is why we have driver assist systems like automatic braking and lane keeping to catch them if they fail. But, again, these systems are there to assist rather than take over the job.

Automation, on the other hand, pulls the driver out of the loop. Their task is to watch, but because of how monotonous that is, people will inherently lower their guard, especially when the system is good enough for 90% of the situations. I understand this is a "what about"ism type of argument, but unless there is concrete data saying these automation technologies are safer, I think it is better not to risk lives.

Another solution would be to require these vehicles to have driver attention monitoring, which some automakers already have; I think that's a good middle ground in addition to proper training. But again, it is not the same as a fully attentive driver who is in control at all times.
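Just to illustrate the kind of middle ground I mean, a camera/torque-based monitor basically boils down to escalation logic like the sketch below - the thresholds and names here are invented, not any automaker's actual system:

```python
# Illustrative sketch of driver-monitoring escalation logic. Thresholds,
# names and actions are made up; real systems differ per manufacturer.
def monitor_attention(eyes_off_road_s: float, hands_on_wheel: bool) -> str:
    """Pick an action based on how long the driver has looked away."""
    if eyes_off_road_s < 2.0 and hands_on_wheel:
        return "ok"
    if eyes_off_road_s < 5.0:
        return "visual_warning"       # flash an icon on the instrument cluster
    if eyes_off_road_s < 8.0:
        return "audible_warning"      # chime and demand a steering input
    # Driver unresponsive: slow the car and disengage rather than carry on.
    return "slow_down_and_disengage"

print(monitor_attention(6.0, True))   # -> "audible_warning"
```

Even then, it only tells you the driver is looking forward, not that they're actually engaged.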


I'd imagine that the number of people who become pilots who aren't interested in piloting is vanishingly small. Meanwhile we have a huge number of drivers who aren't interested in driving - and we need automation as quickly as possible for those people. In fact, much of the automation we already have is primarily for those people.
My "radical" opinion is that I think there should be stricter regulations and training, like aviation, for driving. If you are not interested in driving safely, then you should not be driving at all.

I understand completely that this is not feasible for a lot of people, especially those who live in suburban sprawl hell or more rural areas. This is why I think America needs a better public transit system. Then these people would not have to buy an expensive new luxury vehicle just so they can be inattentive (assuming automation technology becomes widespread and affordable enough for this to even be possible).

People in general should be taking public transit more. I firmly believe that would greatly increase people's general wellbeing (source). I understand this is difficult at the moment, especially when there are a lot of factors at play: the influences on policymakers, the development of suburban sprawl, the general stigma against public transit, and the ingrained American attitude of independence, consumerism, "keeping up with the Joneses", etc. But this would be a completely different discussion than the one at hand.

I think the development of autonomous vehicles is neither necessary nor inevitable, but if we must go this route, there needs to be more training and regulation for those involved. The general public should not be able to beta test these technologies without proper training and understanding of the systems. Only when the driver is completely out of the loop and accidents are less likely than with normal driving do I think there should be widespread adoption. If people are interested in the technology and want to use it, they should fully understand the risk and train to use it. A simple disclosure agreeing to the risks on a screen is not enough.


Excellence should be the standard expectation. Anything less should cause a self-induced feeling of shame and inadequacy. Mistakes should not necessarily be punished, however. Instead, very unlike the rules of the road today but very similar to the rules of aviation, mistakes should be nurtured and trained through, because as previously stated all drivers should have a passion for excellence instilled in them through training and camaraderie.
I agree with this completely. Rigorous training and high expectations should be expected of all drivers, with or without the current level of automation. Only once cars are like elevators do I think training may no longer be necessary.

There's not a great way to test this from a legal perspective. And regardless of how you feel about our infrastructure, the reality is that it is built on cars as a means of transport, so operating without one, while possible, is a great sacrifice in many cases and should require a lot of evidence for the need. I'd wager a bunch of people end up dead before the evidence is clear. The amount of time and resources it would take to truly evaluate all of the drivers in the nation would be staggering.
The reality is our infrastructure is car-centric, but it truly does not have to be that way. With enough effort and investments, we can change our reality for the better.

Personally I think the approach to autonomous driving is wrong. It needs to start with an agreed standard framework of physical and digital infrastructure, and a minimum standard of sensory technology. Car models that don't meet the standards shouldn't be allowed, and autonomy should be geo-fenced to areas where the physical infrastructure meets the standard. I also think that standardised car-to-car and car-to-road communication would be a tremendous benefit. It needs to be thought of as a new type of national infrastructure first - not a problem that needs ever more complex bandaids applied to an algorithm. Make the roads able to be 'understood' by a standard sensor set, mandate a set of sensors to look for things that might contradict that information, and make cars communicate that information to each other.
One of my biggest issues with this is that investing so much effort into developing the infrastructure and technology (which may or may not be possible in our lifetimes), and then asking people to replace their cars with new ones, is laughable. Trains, subways, trams, and buses are already proven technologies. It would be smarter to invest in infrastructure for those instead.
 
Maybe what I meant before wasn't clear, but unless I'm misunderstanding, I think we're in agreement: there should be more regulation from bodies like the FAA, NHTSA, DOT, etc., there should be proper training in the use of automation systems, and we shouldn't be leaving it to the OEMs, such as Tesla, to do the training (or lack thereof).

Lately, I've been watching a lot of videos by Mentour Pilot where he discusses accidents in detail, and from the handful of videos I've seen, many of these accidents happen when an edge case occurs and the pilot is unable to react properly due to the high mental stress of the sudden situation.

To be able to use these automation systems, I believe you need proper training to understand the system, the situations where it can fail, and what is needed to safely recover. However, as the system improves, less intervention is needed and the driver/pilot may become less attentive, making it harder to recover safely, especially in a high-stress, time-sensitive situation.



I agree with this completely, but I would argue that due to a lack of understanding, lack of training, etc., automation makes it much easier for drivers to be less vigilant.

When airbags, seatbelts, helmets, etc. were introduced, I assume no one advertised these amazing new inventions as meaning you no longer have to pay attention or that you can take more risks. These safety advancements were made to keep you safer in case of an accident. The person is still in control of the situation at all times. The problem with automation is that you are slowly taking the person out of the equation, which is the dangerous part.

What I'm trying to argue is that if we are going the automation route, there should be proper training to be able to use these systems, similar in rigor to those in aviation. Driving a car is a dangerous activity and if we are able to completely automate it, then I am all for it, it would save many lives. However, at the current level of automation, it is dangerous since drivers are still in control while not having to actively participate.



I agree that automation is good, especially for those who do not want to drive.

Although I have no hard facts to back this up, I would argue that it may be better for these inattentive drivers to be fully in control at all times while the current level of automation has a lot of edge cases where the driver needs to intervene. If they are fully in control, that limits how inattentive they can be. This is why we have driver assist systems like automatic braking and lane keeping to catch them if they fail. But, again, these systems are there to assist rather than take over the job.

Automation, on the other hand, pulls the driver out of the loop. Their task is to watch, but because of how monotonous that is, people will inherently lower their guard, especially when the system is good enough for 90% of the situations. I understand this is a "what about"ism type of argument, but unless there is concrete data saying these automation technologies are safer, I think it is better not to risk lives.

Another solution would be to require these vehicles to have driver attention monitoring, which some automakers already have; I think that's a good middle ground in addition to proper training. But again, it is not the same as a fully attentive driver who is in control at all times.



My "radical" opinion is that I think there should be stricter regulations and training, like aviation, for driving. If you are not interested in driving safely, then you should not be driving at all.

I understand completely that this is not feasible for a lot of people, especially those who live in suburban sprawl hell or more rural areas. This is why I think America needs a better public transit system. Then these people would not have to buy an expensive new luxury vehicle just so they can be inattentive (assuming automation technology becomes widespread and affordable enough for this to even be possible).

People in general should be taking public transit more. I firmly believe that would greatly increase people's general wellbeing (source). I understand this is difficult at the moment, especially when there are a lot of factors at play: the influences on policymakers, the development of suburban sprawl, the general stigma against public transit, and the ingrained American attitude of independence, consumerism, "keeping up with the Joneses", etc. But this would be a completely different discussion than the one at hand.

I think the development of autonomous vehicles is neither necessary nor inevitable, but if we must go this route, there needs to be more training and regulation for those involved. The general public should not be able to beta test these technologies without proper training and understanding of the systems. Only when the driver is completely out of the loop and accidents are less likely than with normal driving do I think there should be widespread adoption. If people are interested in the technology and want to use it, they should fully understand the risk and train to use it. A simple disclosure agreeing to the risks on a screen is not enough.



I agree with this completely. Rigorous training and high expectations should be expected of all drivers, with or without the current level of automation. Only once cars are like elevators do I think training may no longer be necessary.


The reality is our infrastructure is car-centric, but it truly does not have to be that way. With enough effort and investments, we can change our reality for the better.


One of my biggest issues with this is that investing so much effort into developing the infrastructure and technology (which may or may not be possible in our lifetimes), and then asking people to replace their cars with new ones, is laughable. Trains, subways, trams, and buses are already proven technologies. It would be smarter to invest in infrastructure for those instead.
Driverless cars that don't really work is what you get when you have a nation of people predisposed to reject public works projects out of hand. An aspirational vision of no compromise and no expense promised by people with more confidence than capability. There's zero chance you could get a project like the interstate highway system done in contemporary America. Every single congressional district would be a road block (look at high speed rail in CA) because every single representative would object and/or require some kind of special favor. I think I'm getting into campaign finance reform in a thread that's already in a tangent from its original purpose. :lol:
 
Automated driving by extension could usher in a relative utopia for driving enthusiasts. Imagine getting all of the uninterested, non-dedicated people into something that drives for them perfectly. And if that's an option, cracking down on driving ability, training, diligence, and commitment, becomes easier because a trivial alternative exists. Taking away a license from a drunk driver permanently would be way more feasible.
I think this is a great idea, but the middle ground we're stuck with at the moment is causing more problems than it solves. Besides driver unpredictability, the second biggest obstacle seems to be a lack of standardization in road design and marking. That needs to be resolved promptly. Go browse Maps and see how much variation you find in the design of airport taxiways, markings, etc. You'll find some, but not much, and that level of standardization is what's required to be able to operate safely and efficiently anywhere you go without thinking twice about it. Roads don't seem to work like this and I can't see a good reason for that.
 
Taken to the automated vehicle thread.

 
Probably a good opportunity now to post some proper Tesla news. US prices have recently dropped drastically in order to meet IRA pricing brackets. This means that all Model 3 variants (including Performance!), and 5-seat Model Y long range are now eligible to receive the full $7,500 rebate. I wish our Australian government was this generous.
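For a rough sense of what hitting the bracket means (assuming the credit's $55,000 MSRP cap for cars and $80,000 for SUVs/trucks/vans, a flat $7,500 credit, and ignoring the buyer income limits - the prices below are placeholders, not Tesla's actual list):

```python
# Back-of-the-envelope eligibility check under assumed IRA clean-vehicle
# credit caps. Caps, classification and prices here are illustrative only.
MSRP_CAP = {"car": 55_000, "suv": 80_000}
CREDIT = 7_500

def effective_price(msrp: int, body_class: str) -> int:
    """MSRP less the federal credit, if the vehicle squeezes under its cap."""
    eligible = msrp <= MSRP_CAP[body_class]
    return msrp - CREDIT if eligible else msrp

# A hypothetical $53,990 car lands at $46,490 after the credit,
# while a $57,990 car misses the cap and stays at full price.
print(effective_price(53_990, "car"), effective_price(57_990, "car"))
```

So a price cut of a few thousand dollars that drops a trim under the cap is effectively worth far more than the sticker change itself.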




 
Probably a good opportunity now to post some proper Tesla news. US prices have recently dropped drastically in order to meet IRA pricing brackets. This means that all Model 3 variants (including Performance!), and 5-seat Model Y long range are now eligible to receive the full $7,500 rebate. I wish our Australian government was this generous.





Inflation! Thanks Biden. [/s]
 
Imagine being the guy that bought a Model Y for $69k a few weeks ago...

The M3 with the tax credit is into the genuinely affordable range... but I still hate it, so I would take a Prius Prime when they get their assembly stateside, for something ridiculous like $22k after incentives.
 
Looking at the price history, it is a decent drop, but not the lowest it has been.

Someone has been keeping track of pricing; it's great to compare.



[attached price-history charts]




Since this is Tesla-specific news: during a deposition, Tesla's director of Autopilot revealed that a 2016 video was staged using predetermined GPS data rather than any real Autopilot software.


A 2016 video that Tesla used to promote its self-driving technology was staged to show capabilities like stopping at a red light and accelerating at a green light that the system did not have, according to testimony by a senior engineer.

But the Model X was not driving itself with technology Tesla had deployed, Ashok Elluswamy, director of Autopilot software at Tesla, said in the transcript of a July deposition taken as evidence in a lawsuit against Tesla for a 2018 fatal crash involving a former Apple engineer.

The previously unreported testimony by Elluswamy represents the first time a Tesla employee has confirmed and detailed how the video was produced.

The video carries a tagline saying: “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”

Elluswamy said Tesla’s Autopilot team set out to engineer and record a “demonstration of the system’s capabilities” at the request of Musk.

Elluswamy, Musk and Tesla did not respond to a request for comment. However, the company has warned drivers that they must keep their hands on the wheel and maintain control of their vehicles while using Autopilot.

The Tesla technology is designed to assist with steering, braking, speed and lane changes but its features “do not make the vehicle autonomous,” the company says on its website.

To create the video, the Tesla used 3D mapping on a predetermined route from a house in Menlo Park, California, to Tesla’s then-headquarters in Palo Alto, he said.

Drivers intervened to take control in test runs, he said. When trying to show the Model X could park itself with no driver, a test car crashed into a fence in Tesla’s parking lot, he said.


“The intent of the video was not to accurately portray what was available for customers in 2016. It was to portray what was possible to build into the system,” Elluswamy said, according to a transcript of his testimony seen by Reuters.

When Tesla released the video, Musk tweeted, “Tesla drives itself (no human input at all) thru urban streets to highway to streets, then finds a parking spot.”

When asked if the 2016 video showed the performance of the Tesla Autopilot system available in a production car at the time, Elluswamy said, "It does not."

Elluswamy was deposed in a lawsuit against Tesla over a 2018 crash in Mountain View, California, that killed Apple engineer Walter Huang.

Andrew McDevitt, the lawyer who represents Huang’s wife and who questioned Elluswamy in July, told Reuters it was “obviously misleading to feature that video without any disclaimer or asterisk.”

The National Transportation Safety Board concluded in 2020 that Huang’s fatal crash was likely caused by his distraction and the limitations of Autopilot. It said Tesla’s “ineffective monitoring of driver engagement” had contributed to the crash.

Elluswamy said drivers could “fool the system,” making a Tesla system believe that they were paying attention based on feedback from the steering wheel when they were not. But he said he saw no safety issue with Autopilot if drivers were paying attention.

Similarly, the director of Autopilot software says he is unaware of what Operational Design Domain or perception-reaction time are.





 
Looking at the price history, it is a decent drop, but not the lowest it has been.

Someone has been keeping track of pricing; it's great to compare.






Since this is Tesla-specific news: during a deposition, Tesla's director of Autopilot revealed that a 2016 video was staged using predetermined GPS data rather than any real Autopilot software.




Similarly, the director of Autopilot software says he is unaware of what Operational Design Domain or perception-reaction time are.






Looks like typical Silicon Valley software engineer ******** hubris to me.

I'm starting to suspect there is something fishy with the Tesla Semi Truck video if they were willing to stage autopilot like they did. The only reliable footage in the video is the journey itself, as the beginning footage showing the contents of the trailer is not continuous. Adding to that, the information given was anything but comprehensive...it's like we were given just enough information to make the trip seem plausible, but not enough to take it at face value conclusively. If I were to guess, I would say the truck was a 'ringer' as it were, loaded up with significantly more batteries than production spec. Maybe we'll find out in a lawsuit one day?
 
Looks like typical Silicon Valley software engineer ******** hubris to me.

I'm starting to suspect there is something fishy with the Tesla Semi Truck video if they were willing to stage autopilot like they did. The only reliable footage in the video is the journey itself, as the beginning footage showing the contents of the trailer is not continuous. Adding to that, the information given was anything but comprehensive...it's like we were given just enough information to make the trip seem plausible, but not enough to take it at face value conclusively. If I were to guess, I would say the truck was a 'ringer' as it were, loaded up with significantly more batteries than production spec. Maybe we'll find out in a lawsuit one day?
Definitely. Seeing how much fraud is happening lately (e.g. Forbes 30 Under 30), I wouldn't be surprised if more recent tricks get brought to light.
 
https://www.bloomberg.com/news/arti...-autopilot-video-saying-car-drove-itself-tsla

It has been revealed that Musk was directly involved in staging the video. He asked the team to remake it multiple times (at least four) to make it look more like one continuous take, and during filming the car hit a fence.

Elon Musk oversaw the creation of a 2016 video that exaggerated the abilities of Tesla Inc.’s driver-assistance system Autopilot, even dictating the opening text that claimed the company’s car drove itself, according to internal emails viewed by Bloomberg.

Musk wrote to Tesla’s Autopilot team after 2 a.m. California time in October 2016 to emphasize the importance of a demonstration drive to promote the system, which the chief executive officer made a splashy announcement about a week later. In an Oct. 19 call with reporters and blog post, Tesla said that all its cars from that day forward would ship with the hardware necessary for full self-driving capability.

“Just want to be absolutely clear that everyone’s top priority is achieving an amazing Autopilot demo drive,” Musk said in the email. “Since this is a demo, it is fine to hardcode some of it, since we will backfill with production code later in an OTA update,” he wrote, referring to using temporary code and updating it later using an over-the-air software update.

“I will be telling the world that this is what the car will be able to do,” Musk continued, “not that it can do this upon receipt.”

[...]

Under the subject line “The Absolute Priority,” Musk wrote in his Oct. 11, 2016, email that he had canceled his obligations for the upcoming weekend to work with the Autopilot team on both Saturday and Sunday. He said everyone would be required to write a daily log of what they did to contribute to the success of the demo, and that he would read them personally.

Nine days later, after Tesla staffers shared a fourth version of the video, Musk replied that there were still too many jump cuts, and that the demo footage “needs to feel like one continuous take.”

While Musk had written in the earlier email that he would be clear Tesla was demonstrating what its cars would be able to do in the future, he then instructed staffers to open the video with a black screen and three sentences referring to the present.

[...]

Tesla and Musk didn’t disclose when releasing the video that engineers had created a three-dimensional digital map for the route the Model X took, Elluswamy said during his deposition. Musk said years after the demo that the company doesn’t rely on high-definition maps for automated driving systems, and argued systems that do are less able to adapt to their surroundings.

The mapping detail — along with Elluswamy’s acknowledgment that the car was involved in an accident during the demo — broadly confirm a December 2021 New York Times report that said Tesla’s video didn’t provide a full picture of how the vehicle operated during the filming.

When asked if the Tesla drove up over a curb, through bushes and hit a fence, Elluswamy replied: “I’m not so sure about the curb or the bush. I do know about the fence.”
 
Why exactly is the media obsessing over the fact that the route of the 2016 FSD video was predetermined? Tesla merely selected a route which would look best on video to show the system's capabilities. They do not use pre-mapped geofencing to improve the data available on one route over another.

A fun video from Munro & Associates on the reality of Tesla's price cuts:



I guess it will be the easiest truck to clear snow off and wash mud from.
It’s re-appeared on the Australian Tesla page, too.
 
Why exactly is the media obsessing over the fact that the route of the 2016 FSD video was predetermined? Tesla merely selected a route which would look best on video to show the system's capabilities.
The route is only a small part of it; the issue is that the video was presented - by Musk himself - as a demonstration of what Tesla's driver assistance systems could do. However, it was not only unrepresentative by virtue of being entirely mapped out beforehand (rather than plotted and executed using the Autopilot software) and edited together from several separate drives to appear like one, but the systems couldn't actually do that... yet.

The video was supposed to be a concept demonstration of what Tesla's driver assistance systems could do, eventually - which Musk confirmed in the emails also released - but was presented as "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself", which was not true at all, and promoted by Musk, on Twitter, as "Tesla drives itself". Ashok Elluswamy, Tesla’s director of Autopilot software, confirmed that the route was predetermined (because Autopilot could not dynamically plan a route), and that version of the software had no traffic light recognition capability - despite being shown in the video as having it.


I'm not sure about the "obsessing" part, but the fact it's coming up is because Tesla is defending a wrongful death suit brought by the family of Walter Huang, who died in a 2018 collision in his Tesla with the "Autopilot" function engaged. I imagine it hinges on people being led to believe the car was capable of things that, at that time, it was not, and on whether Huang believed it was and died as a consequence.
 
The video was supposed to be a concept demonstration of what Tesla's driver assistance systems could do, eventually - which Musk confirmed in the emails also released - but was presented as "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself", which was not true at all, and promoted by Musk, on Twitter, as "Tesla drives itself". Ashok Elluswamy, Tesla’s director of Autopilot software, confirmed that the route was predetermined (because Autopilot could not dynamically plan a route), and that version of the software had no traffic light recognition capability - despite being shown in the video as having it.
Understood, but why are virtual waypoints in the software an arguing point against this FSD beta video? They're free to program anything they wish if the software is not available to the public, and hands-free FSD was clearly not yet available. It's the equivalent of trying to finger-point at Polyphony because the AI drivers in GT7 do not have the same capabilities as the AI drivers shown in Project Sophy demonstration videos. They are two separate software packages at different development stages.

I'm not sure about the "obsessing" part, but the fact it's coming up is because Tesla is defending a wrongful death suit brought by the family of Walter Huang, who died in a 2018 collision in his Tesla with the "Autopilot" function engaged. I imagine it hinges on people being led to believe the car was capable of things that, at that time, it was not, and on whether Huang believed it was and died as a consequence.
'Autopilot' is not the same software package as 'Full Self Driving' - Tesla makes this clear when you purchase AP. In addition, in-car reminders that the driver must keep their eyes on the road and hands on the wheel appear when engaging Autopilot, and there are continued prompts during usage. The fact that Walter chose to ignore each of these attention warnings and treated Autopilot as Level 5 FSD is unfortunate.
If people are led to believe they can be inattentive at the wheel, then Tesla needs to make it more clear, but they make it extremely clear that the driver must be alert at all times, even with current FSD Beta.
 
Not only is Tesla being investigated by the SEC, it's also being investigated by the Justice Department over its "self-driving" feature.

This is the same information request from the DOJ that we knew about from ~27th Oct; it's only been recycled by the media again after Tesla's comments in the latest 10-K form. The specific new text they added in the Q4 form, over Q3, on this matter is:

"The company has received requests from the DOJ for documents related to Tesla’s Autopilot and FSD features. To our knowledge no government agency in any ongoing investigation has concluded that any wrongdoing occurred."

The DOJ had even previously requested information on Elon's 'going private' tweet and early Model 3 production rate claims made by Tesla, but there have been no further requests since May 2019. In addition, the paragraph mentioning those investigations was removed from this latest Q4 form. A DOJ or SEC investigation into a company is not the horrendously negative criminal investigation the media might have you believe it is, and this latest information request is no exception.

Got to keep the FUD train moving to get those clicks, I suppose.
 
"The company has received requests from the DOJ for documents related to Tesla’s Autopilot and FSD features. To our knowledge no government agency in any ongoing investigation has concluded that any wrongdoing occurred."

[reaction gif: "Allegedly"]


Yea, they haven't concluded yet. But asking for documents still remains interesting.
 

Tesla recalls 362,758 cars (2016-2023 Tesla Model S, 2016-2023 Tesla Model X, 2017-2023 Tesla Model 3, 2020-2023 Tesla Model Y) due to how dangerous FSD is

U.S. safety regulators have pressured Tesla into recalling nearly 363,000 vehicles with its “Full Self-Driving” system because it misbehaves around intersections and doesn’t always follow speed limits.

The recall, part of a larger investigation by the National Highway Traffic Safety Administration into Tesla’s automated driving systems, is the most serious action taken yet against the electric vehicle maker.

The safety agency says in documents posted on its website Thursday that Tesla will fix the concerns with an online software update in the coming weeks. The documents say Tesla is doing the recall but does not agree with an agency analysis of the problem.

The system, which is being tested on public roads by as many as 400,000 Tesla owners, makes unsafe actions such as traveling straight through an intersection while in a turn-only lane, failing to come to a complete stop at stop signs, or going through an intersection during a yellow traffic light without proper caution, NHTSA said.

In addition, the system may not adequately respond to changes in posted speed limits, or it may not account for the driver’s adjustments in speed, the documents said.

“FSD beta software that allows a vehicle to exceed speed limits or travel through intersections in an unlawful or unpredictable manner increases the risk of a crash,” the agency said in documents.

In a statement, NHTSA said it found the problems during tests performed as part of an investigation into Tesla’s “Full Self-Driving” and “Autopilot” software that take on some driving tasks. The investigation remains open, and the recall doesn’t address the full scope of what NHTSA is scrutinizing, the agency said.

NHTSA’s testing found that Tesla’s FSD beta testing, “led to an unreasonable risk to motor vehicle safety based on insufficient adherence to traffic safety laws.”



Recall details


Description of the Defect:
In certain rare circumstances and within the operating limitations of FSD Beta, when the feature is engaged, the feature could potentially infringe upon local traffic laws or customs while executing certain driving maneuvers in the following conditions before some drivers may intervene: 1) traveling or turning through certain intersections during a stale yellow traffic light; 2) the perceived duration of the vehicle’s static position at certain intersections with a stop sign, particularly when the intersection is clear of any other road users; 3) adjusting vehicle speed while traveling through certain variable speed zones, based on detected speed limit signage and/or the vehicle's speed offset setting that is adjusted by the driver; and 4) negotiating a lane change out of certain turn-only lanes to continue traveling straight.

Description of the Safety Risk:
In the specific and rare circumstances described above, when a Tesla vehicle is operating with a software version of FSD Beta as described below and with FSD Beta engaged, certain driving maneuvers could potentially infringe upon local traffic laws or customs, which could increase the risk of a collision if the driver does not intervene.

Description of Remedy Program:
Tesla will deploy an over-the-air (“OTA”) software update at no cost to the customer. The OTA update, which we expect to deploy in the coming weeks, will improve how FSD Beta negotiates certain driving maneuvers during the conditions described above.
 
Nothing more than an over-the-air software update for vehicles subscribed to the FSD software package.
NHTSA definitely needs to update its 'recall' terminology, as there's nothing to differentiate this from traditional physical recalls.
 