Tesla Thread



It's a really common problem with automation.

1) Clearly Tesla autopilot is not 100% safe. But is it safer than no autopilot? Human-driven cars also sometimes hit stationary objects, including emergency vehicles. It should be studied.

1a) That doesn't mean it shouldn't be improved.

2) As airplanes have gotten safer and safer, more and more the cause of crashes has been pilots who were relying on automated systems instead of remembering how to fly manually. For instance, both of the 737 MAX accidents should have been easily avoided if the pilots had just correctly followed the procedures for a malfunctioning trim system. That's true of many other aviation crashes too. But as the automation becomes more and more sophisticated, it becomes a harder and harder problem to know when to override it. And even, sometimes, to know HOW to override it. The main problem with self-driving cars is that relying on human backup is a bad idea. Even trained, certified, practiced pilots have problems with this. Untrained randos who probably aren't very skilled drivers anyway are not going to react correctly in the very short time available between recognizing an autopilot mistake and the last moment when it would be possible to recover from that mistake.

Edited by mikegarrison

Yeah, and in this case it was not the fault of autopilot*; it was the fault of the Tesla driver, 100%.

There's a reason they have people pay attention, and remain ready to take over at any instant. There would be no headline, "Car on cruise control hits police cruiser." Um, step on the brake, moron. Autopilot is just cruise-control +.

The fact that it is astoundingly good 99.x% of the time might make people less vigilant, but they need to remain vigilant anyway or not use it.

(*autopilot clearly might have failed in a technical sense, but "fault" in the ethical/legal sense is the actual driver not paying attention and driving)
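
To put that "99.x%" in perspective, here's a quick back-of-the-envelope sketch; every number in it is an illustrative assumption, not real Tesla data:

```python
# Back-of-the-envelope: how often a "99.x% reliable" system still fails.
# All numbers here are illustrative assumptions, not Tesla data.

annual_miles = 12_000        # typical miles driven per year (assumed)
decisions_per_mile = 10      # meaningful driving decisions per mile (assumed)
reliability = 0.999          # system handles 99.9% of decisions correctly

decisions_per_year = annual_miles * decisions_per_mile
expected_failures = decisions_per_year * (1 - reliability)

print(f"Decisions per year: {decisions_per_year:,}")
print(f"Expected failures per year: {expected_failures:.0f}")
# ~120 per year: even "99.9% good" leaves plenty of moments where
# an inattentive driver has no backup.
```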

Edited by tater

16 minutes ago, tater said:

Yeah, and in this case it was not the fault of autopilot*; it was the fault of the Tesla driver, 100%.

It's not that simple. It's an "uncanny valley" kind of problem. The repeated reliability of the system trains humans to trust it ... until it fails.


43 minutes ago, mikegarrison said:

2) As airplanes have gotten safer and safer, more and more the cause of crashes has been pilots who were relying on automated systems instead of remembering how to fly manually. For instance, both of the 737 MAX accidents should have been easily avoided if the pilots had just correctly followed the procedures for a malfunctioning trim system. That's true of many other aviation crashes too. But as the automation becomes more and more sophisticated, it becomes a harder and harder problem to know when to override it. And even, sometimes, to know HOW to override it. The main problem with self-driving cars is that relying on human backup is a bad idea. Even trained, certified, practiced pilots have problems with this. Untrained randos who probably aren't very skilled drivers anyway are not going to react correctly in the very short time available between recognizing an autopilot mistake and the last moment when it would be possible to recover from that mistake.

Actually, the problem wasn't that the pilots didn't follow the procedures; the problem was that the MCAS system introduced on the 737 MAX (because of larger engines positioned farther forward and higher than on previous 737 models, which changed the plane's flight characteristics) wasn't mentioned in any of the manuals. Boeing tried to hide it so that they and the airlines wouldn't need to spend more time and money re-certifying pilots to fly the 737 MAX.


48 minutes ago, Cuky said:

Actually, the problem wasn't that the pilots didn't follow the procedures; the problem was that the MCAS system introduced on the 737 MAX (because of larger engines positioned farther forward and higher than on previous 737 models, which changed the plane's flight characteristics) wasn't mentioned in any of the manuals. Boeing tried to hide it so that they and the airlines wouldn't need to spend more time and money re-certifying pilots to fly the 737 MAX.

It's obvious that wasn't "the problem". I mean, the Ethiopian crew sure AF knew about MCAS, right?

And the Lion Air crew on the same airplane the day before had the same issue and safely landed the airplane, by following the procedure for malfunctioning trim, just like they were supposed to.

But was it *A* problem that the difference training hadn't included MCAS? Yup. It was *A* problem, even though it wasn't *THE* problem. The thing is, complicated systems usually have complicated problems. And when a system has many redundant lines of safety, but they all fail, which one was *THE* problem?

When you leave the puppy alone, and you leave the room door open, and you leave the house door open, and you leave the front gate open, which one of those mistakes was THE ONE that allowed the puppy to run off and get lost?

With these Tesla crashes, if the system worked a little better, the crash wouldn't happen. But if the drivers trusted the system a little less, the crash wouldn't happen either. (Unless, you know, the drivers just screwed up. As drivers -- and sometimes pilots -- sometimes do.)

The problem here is that when automated systems are untrustworthy, people watch for them to fail. When they are perfect, they never fail. But the closer they get to perfect, the more people trust them and stop watching. And that's how these "driver inattention" crashes happen. It's often how modern plane crashes happen too.

Edited by mikegarrison

An "autopilot" that requires the driver's hands on the wheel and the driver's attention is by definition worse than no autopilot at all.
Because humans aren't robots: they can't stay attentive without doing the work themselves, and a sleepy driver = no driver.

The autopilot must either be as reliable as a human driver on its own, or not exist at all.


1 hour ago, mikegarrison said:

It's not that simple. It's an "uncanny valley" kind of problem. The repeated reliability of the system trains humans to trust it ... until it fails.

Right below what you responded to, I said the same thing:

"The fact that it is astoundingly good 99.x% of the time might make people less vigilant, but they need to remain vigilant anyway or not use it."

I realize that the reliability creates a situation where people let their guard down, but that doesn't change the fact that it is still the driver's responsibility. If a driver cannot be responsible in using it, they should not use it. Course I feel the same way about using phones in the car. If you can't keep your phone in your pocket, you should not be driving at all; looking at your phone and responsible driving are mutually exclusive—yet people are texting, etc. all the *&^%*&^$ time.


13 minutes ago, tater said:

Right below what you responded to, I said the same thing:

"The fact that it is astoundingly good 99.x% of the time might make people less vigilant, but they need to remain vigilant anyway or not use it."

I realize that the reliability creates a situation where people let their guard down, but that doesn't change the fact that it is still the driver's responsibility. If a driver cannot be responsible in using it, they should not use it. Course I feel the same way about using phones in the car. If you can't keep your phone in your pocket, you should not be driving at all; looking at your phone and responsible driving are mutually exclusive—yet people are texting, etc. all the *&^%*&^$ time.

You can fight human nature, in which case you will always lose (at least occasionally), or you can accept human nature and try to design systems that work with it.

With pilots, for instance, they have checklists. Why? Because PEOPLE FORGET STUFF. So, they have a checklist for *everything*.

26 minutes ago, kerbiloid said:

An "autopilot" that requires the driver's hands on the wheel and the driver's attention is by definition worse than no autopilot at all.
Because humans aren't robots: they can't stay attentive without doing the work themselves, and a sleepy driver = no driver.

The autopilot must either be as reliable as a human driver on its own, or not exist at all.

It has to be MORE reliable than a human driver. Because:

1) it will be scrutinized more if it fails

2) someone can be blamed (and sued) if it fails

3) if it's no better than a human driver, why have it?


24 minutes ago, mikegarrison said:

You can fight human nature, in which case you will always lose (at least occasionally), or you can accept human nature and try to design systems that work with it.

With pilots, for instance, they have checklists. Why? Because PEOPLE FORGET STUFF. So, they have a checklist for *everything*.

We clearly don’t do that for driving; look at distracted driving (phones). People have their phones on stalks so they can see them while driving, lol.
 

Checklist item 1, put phone away, and don’t touch it until driving is done.

The trouble with driving automation is that to get it better they need real driving data, and that requires letting the car drive while supervised—but good automation then decreases driver attentiveness. At a certain point I guess we have to assume the drivers will be responsible (unlike the 27 yo in question).

That said, if the automation crashes less than people it’s still a win, and the sort of crappy driver like this guy is also likely to be the sort with his phone out in a regular car.


I think comma.ai is right about using a driver-facing camera to assess driver attention vs the steering wheel thing.

Tesla has such a camera, and they need to use it. Look away too long, and the car warns you that automation is turning off, or it might even pull over and stop. If a cell phone enters the frame? It instantly does the above: pulls over and stops.
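
As a rough sketch of what that escalation logic might look like (the thresholds, polling rate, and detection functions are all my own assumptions, not comma.ai's or Tesla's actual implementation):

```python
# Hypothetical camera-based attention monitor with escalating responses:
# warn first, then pull over; a phone in frame escalates immediately.
import time

WARN_AFTER_S = 2.0   # eyes off road before a warning (assumed)
STOP_AFTER_S = 5.0   # eyes off road before pulling over (assumed)

def monitor(capture_frame, detect_phone, gaze_on_road, warn, pull_over):
    """Poll the cabin camera; escalate when the driver stops watching."""
    eyes_off_since = None
    while True:
        frame = capture_frame()
        if detect_phone(frame):       # phone in frame: escalate at once
            pull_over()
            return
        if gaze_on_road(frame):
            eyes_off_since = None     # attention restored, reset the timer
        else:
            if eyes_off_since is None:
                eyes_off_since = time.monotonic()
            off = time.monotonic() - eyes_off_since
            if off > STOP_AFTER_S:
                pull_over()           # driver never responded
                return
            if off > WARN_AFTER_S:
                warn("Eyes on the road, or automation turns off!")
        time.sleep(0.1)               # ~10 Hz polling (assumed)
```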


1 hour ago, tater said:

We clearly don’t do that for driving; look at distracted driving (phones). People have their phones on stalks so they can see them while driving, lol.

Broke: phones while driving

Woke: watching TV while driving

IIRC someone even tried a gaming console once...


A driver forced to follow the driving without actually driving = a zombie driver on a leash.
Able to roll its eyes and clack its teeth, but about as smart and attentive as a zombie.
(Donkey) on the seat, thoughts in zombieland.

So, all those measures to attract the driver's attention are just an attempt to shift the blame from the manufacturer onto the customer.

Edited by kerbiloid

4 hours ago, mikegarrison said:

It's a really common problem with automation.

1) Clearly Tesla autopilot is not 100% safe. But is it safer than no autopilot? Human-driven cars also sometimes hit stationary objects, including emergency vehicles. It should be studied.

1a) That doesn't mean it shouldn't be improved.

2) As airplanes have gotten safer and safer, more and more the cause of crashes has been pilots who were relying on automated systems instead of remembering how to fly manually. For instance, both of the 737 MAX accidents should have been easily avoided if the pilots had just correctly followed the procedures for a malfunctioning trim system. That's true of many other aviation crashes too. But as the automation becomes more and more sophisticated, it becomes a harder and harder problem to know when to override it. And even, sometimes, to know HOW to override it. The main problem with self-driving cars is that relying on human backup is a bad idea. Even trained, certified, practiced pilots have problems with this. Untrained randos who probably aren't very skilled drivers anyway are not going to react correctly in the very short time available between recognizing an autopilot mistake and the last moment when it would be possible to recover from that mistake.

Agreed. Events in cars tend to happen much faster than in planes, where any sudden action you have to make tends to be an incident. In cars sudden events are everyday stuff, like the truck you are behind stopping for reasons you cannot see.
During takeoff and landing the pilots are on high alert even if everything is done automatically. During cruise it's another story; then they are probably more like Tesla drivers. But there it's unlikely that sudden events require split-second reactions, and primitive autopilots have been used since WW2. I guess modern systems are better at handling stuff like an engine out on a twin-engine plane.
The tail on modern passenger planes is so large because it is designed to handle the worst-case event, an engine out during takeoff: it has to compensate for running on one engine at below takeoff speed.


27 minutes ago, DDE said:

You couldn't pay me enough to stay alert for hours while supervising an autopilot.

That's what the Metro is for.

I use cruise control all the time, but I'm not sure I'd ever use autopilot the way it is currently available. If I'm paying attention anyway, I'd just steer. I'd see autopilot (if I had it) as a way to stretch, or change positions briefly on a long drive, maybe.

In the current situation, it seems more like crowdsourcing training vs true labor saving for the driver. Convince yourself it is making your travel better while you train their system for them. I imagine autopilot would be about as mentally taxing as sitting in the right seat with one of my kids driving (they are both learning now). In some ways I think I pay far more attention when supervising the kids than when actually driving.


2 hours ago, tater said:

We clearly don’t do that for driving; look at distracted driving (phones). People have their phones on stalks so they can see them while driving, lol.
 

Checklist item 1, put phone away, and don’t touch it until driving is done.

The trouble with driving automation is that to get it better they need real driving data, and that requires letting the car drive while supervised—but good automation then decreases driver attentiveness. At a certain point I guess we have to assume the drivers will be responsible (unlike the 27 yo in question).

That said, if the automation crashes less than people it’s still a win, and the sort of crappy driver like this guy is also likely to be the sort with his phone out in a regular car.

This. You could also gate it: autopilot runs by itself on highways and easy country roads.
Driver attention is needed in cities and other more complex places; however, it's enabled at very low speeds, like in a queue.
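
A minimal sketch of that gating rule, with made-up road classes and a hypothetical queue-speed threshold:

```python
# Hypothetical gating of autopilot by road class and speed.
# Road classes and the 15 km/h queue threshold are assumptions,
# not any manufacturer's actual policy.

QUEUE_SPEED_KMH = 15  # below this, treat it as stop-and-go traffic (assumed)

def autopilot_allowed(road_class: str, speed_kmh: float) -> bool:
    if road_class in ("highway", "easy_country"):
        return True                           # runs by itself
    if road_class in ("city", "complex"):
        return speed_kmh <= QUEUE_SPEED_KMH   # only in queues
    return False                              # unknown road: driver drives

print(autopilot_allowed("highway", 110))  # True
print(autopilot_allowed("city", 50))      # False
print(autopilot_allowed("city", 8))       # True (queue)
```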


4 minutes ago, magnemoe said:

This. You could also gate it: autopilot runs by itself on highways and easy country roads.
Driver attention is needed in cities and other more complex places; however, it's enabled at very low speeds, like in a queue.

The place where something between autopilot and FSD would be most desirable would, to me, be awful traffic jams. I have only driven some small number of hours (10-20?) in LA traffic, but that's an example. It moves so slowly, I could see letting the car just do it—the worst problem would be a walking-speed fender bender (and that on the highway, lol). Country roads I'm less sure about, animals, passing into oncoming lanes (common on undivided highways here), etc. On twisting roads I'd probably end up hyper aware prepping to take over, lol.


37 minutes ago, tater said:

I use cruise control all the time, but I'm not sure I'd ever use autopilot the way it is currently available. If I'm paying attention anyway, I'd just steer. I'd see autopilot (if I had it) as a way to stretch, or change positions briefly on a long drive, maybe.

I imagine autopilot would be about as mentally taxing as sitting in the right seat with one of my kids driving (they are both learning now). In some ways I think I pay far more attention when supervising the kids than when actually driving.

I use Autopilot near constantly, and almost entirely on winding country roads. I’ve had a while now to get used to its quirks and features, and figure out where it works well and where it doesn’t. It’s an absolute Godsend when I get off work after a long day of, ironically, driving, and now have to drive home when I’m a virtual zombie. Autopilot handles the basic tasks of keeping speed and staying in the lane, so I can focus my admittedly compromised attention entirely on watching the road, not the minutiae of driving itself. And even on long road trips where I’m not compromised, just like airplane AP it greatly reduces my mental workload, so at the end of the day I don’t have that drained feeling nearly as much, and you’ll hear this same sentiment echoed by other frequent Autopilot users.

Yes, it’s extremely surreal the first time you engage it, and that wheel starts moving on its own. It does take a while to learn how to trust it, but for the vast majority of users, that trust that leads to responsible, proper use comes fairly quickly. 

28 minutes ago, tater said:

Country roads I'm less sure about, animals, passing into oncoming lanes (common on undivided highways here), etc. On twisting roads I'd probably end up hyper aware prepping to take over, lol.

It does extremely well on winding highway-speed (50-60mph) two-lanes; on some tighter 30-40mph suburban windies it’s struggled a bit in the past, but I’ve seen it improve remarkably over the last year or two. It even slows down for cyclists or walkers on the side of the road. I trust Autopilot far more than the rando coming the other way and inching closer to the centerline.
 

1 hour ago, kerbiloid said:

So, all those measures to attract the driver's attention are just an attempt to shift the blame from the manufacturer onto the customer.

The blame for blatant misuse of a product remains, as it always has, squarely on the shoulders of the user, not the manufacturer. Ever hear that gag about the guy with the wrecked RV who set the cruise control then went in back to make a sandwich?

If you take a hair dryer into the shower, despite being warned not to, and get electrocuted, that’s on you, not the hair dryer. 
If you spill hot coffee on yourself, after being warned it’s hot, and get burned, that’s on you, not the coffeemaker. 
 If you stick your hand in the garbage disposal… well, you get the idea. 
 

This is the warning you have to acknowledge before you can ever engage Autopilot:

[Screenshot: M04qrz4.jpg]

This is the warning you get every single time you do engage it:

[Screenshot: ThohYQM.jpg]

If you choose to ignore all these warnings, and fail to pay attention to the point that you don’t even see the flashing lights of an emergency vehicle ahead, that is YOUR fault, not the machine’s.

Every single Autopilot-involved accident up to this point has been a result of user error, pure and simple, full stop.

And they remain extremely rare, because the vast majority of people are not, in fact, blithering idiots, and can responsibly operate a piece of equipment after sufficient “education.” But, “another Autopilot crash!” is sensational, and sensational gets those sweet sweet ad clicks. Double bonus points for Tesla, too, since anything negative involving Tesla is also sensational (it’s almost like someone(s) somewhere has a vested financial interest in all this sensationalism, hmm…)

Pretty much every automaker has their own driver-assist system available now that kinda-sorta approaches what Autopilot can do, yet you never hear about those crashes, because those crashes are so unsensational that no one is even keeping data on them. 


56 minutes ago, CatastrophicFailure said:

It does extremely well on winding highway-speed (50-60mph) two-lanes; on some tighter 30-40mph suburban windies it’s struggled a bit in the past, but I’ve seen it improve remarkably over the last year or two. It even slows down for cyclists or walkers on the side of the road. I trust Autopilot far more than the rando coming the other way and inching closer to the centerline.

Really interesting to hear from someone who uses it. I tried it a little the few times I was trying friends' Teslas, but only briefly.

The difference vs actively driving if impaired is interesting. I end up staying awake most of the time when my wife gets called in to the OR in the middle of the night, particularly if she's already been on call a couple days in a row—concerned she might be so tired that driving is less than ideal.

59 minutes ago, CatastrophicFailure said:

Pretty much every automaker has their own driver-assist system available now that kinda-sorta approaches what Autopilot can do, yet you never hear about those crashes, because those crashes are so unsensational that no one is even keeping data on them. 

As for this, it reminds me of a while ago when the news always reported people "killed by an SUV" vs "killed by an impaired driver" as if the SUV was driving itself.


7 hours ago, CatastrophicFailure said:

The blame for blatant misuse of a product remains, as it always has, squarely on the shoulders of the user, not the manufacturer. Ever hear that gag about the guy with the wrecked RV who set the cruise control then went in back to make a sandwich?

If you take a hair dryer into the shower, despite being warned not to, and get electrocuted, that’s on you, not the hair dryer. 
If you spill hot coffee on yourself, after being warned it’s hot, and get burned, that’s on you, not the coffeemaker. 
 If you stick your hand in the garbage disposal… well, you get the idea. 

The sandwich, the hair dryer, etc. differ from an "autopilot" that requires the driver's constant attention without any action from the driver.

Using them properly doesn't exceed the natural abilities of the human organism.

Why do people watch TV or surf the net while waiting in a queue, or sitting on a plane or a bus? Why do pupils play in class?
Because any monotonous activity is torture; it tires a human out even when he is just sitting and doing nothing.
It's physically impossible for a human to stay alert and concentrated for hours when he isn't performing any actions of his own.

So, unlike the hair dryers, hot coffee, and sandwiches, "but the driver should keep his eyes on the road and hands on the wheel" amounts to "we tried to make an autopilot which can replace a human in the driver's seat, but it's so unreliable that you have to keep driving yourself, but pay us for the autopilot anyway".

The only purpose of an autopilot is a long boring trip (for parking there are parking sensors), which is exactly when a driver has to stay alert for hours while sitting still.

Edited by kerbiloid

6 hours ago, tater said:

Really interesting to hear from someone who uses it. I tried it a little the few times I was trying friends' Teslas, but only briefly.

It's telling that so much criticism levied against Autopilot comes from people who've never used it at all, based on hearsay about what it's "supposed" to do. Like anything, it takes a bit of learning, for the moment. My wife loves the traffic-aware cruise control but pretty much never uses auto-steer; if she does, she ends up unintentionally disengaging it by tugging the wheel a bit too hard. I've found you have to trust the system about half a second farther than you want to. Where it's meant to work, it works really, really well. Even where it's not meant to work, it still works pretty well. But you've got to know and accept its limitations as an incomplete system.

6 hours ago, tater said:

The difference vs actively driving if impaired is interesting. I end up staying awake most of the time when my wife gets called in to the OR in the middle of the night, particularly if she's already been on call a couple days in a row—concerned she might be so tired that driving is less than ideal.

I do a lot of long driving on nearly-deserted roads in the middle of the night; even as it is now, it's dang near perfect for that.

45 minutes ago, kerbiloid said:

The listed sandwich, dryer, etc, differ from the "autopilot" requiring the driver's  constant attention without driver's actions.

Not really, no. They are all things which have a specific set of circumstances where they can be used safely. If you deliberately (deliberately!) ignore warnings and use them outside of those circumstances, that is on you, not on the item. You have made that choice and need to own that mistake.

46 minutes ago, kerbiloid said:

an autopilot which can replace a human in the driver's seat

Except, it's NOT that. And it's not marketed or sold as that, either. Those warnings I posted above demonstrate that. Autopilot is a driver-assist system, not full autonomy, and no one familiar with it actually thinks otherwise. Full Self Driving is coming, but it's not here yet, save for a handful of carefully chosen beta test volunteers.

49 minutes ago, kerbiloid said:

The only purpose of an autopilot is a long boring trip

And this remains its primary use and selling point.


Has the autopilot liability question been tested in a US court yet? It's hard to imagine that no enterprising lawyer has yet tried the argument that a product this easy to use irresponsibly, even with good intentions, must confer some liability on the manufacturer.

If I were one of Tesla's lawyers, I would be up at night worrying.

