Tesla Thread


GearsNSuch


https://www.nbcnews.com/news/us-news/tesla-autopilot-crashes-deputy-s-vehicle-washington-state-n1267716

Tesla on "autopilot" smashes into a police car parked on the side of the road as the officer was responding to a different crash.

Also:

https://www.marketwatch.com/story/tesla-driver-killed-in-crash-posted-videos-of-himself-driving-hands-free-11621220917

Tesla driver who made many posts about how much he liked his self-driving car is killed when the car (apparently on autopilot) crashes into an overturned semi truck. (He also seriously injured another person who had been trying to help the driver of the truck.) This happened at 2:30 am, so there may have been other factors involved. But some of his posts indicated that this was a normal time for him to be driving.

==========

I believe I've said this here before, but the most dangerous condition is an unreliably self-driving car. It's good enough to make people stop concentrating on driving, but not good enough to actually take over for them.

This weekend I had a rental car with a bunch of new electronic driving aids that I'd never used before. It was pretty amazing how quickly I got used to cruise control that automatically responded to the speed of the traffic ahead, but I could also easily see myself losing concentration on what was happening in front of me.

Edited by mikegarrison

27 minutes ago, mikegarrison said:

I believe I've said this here before, but the most dangerous condition is an unreliably self-driving car. It's good enough to make people stop concentrating on driving, but not good enough to actually take over for them.

Every human should have a 5G chip to let the self-driving car know that somebody is there, so it can take evasive action. That's all.

***

Actually, at least a smartphone.

***

A gem implanted in forehead like a third eye.

(Just to feel VIP.)


50 minutes ago, mikegarrison said:

It's good enough to make people stop concentrating on driving, but not good enough to actually take over for them.

Sadly, nearly everything works this way in the transition phase... which only worsens the underlying problems that already exist with the whole idea, if there are any (like how often we let drivers renew their licenses without any test, lax enforcement and prosecution, or the fact that we often use cars where we shouldn't).

Don't get me wrong, technology is a great thing to have, but sometimes what works, works.


How many car crashes per total distance are there for crewed and uncrewed cars?

***
Upd.
Found it.
https://carsurance.net/blog/self-driving-car-statistics/

Quote

(Government Technology) At the moment, self-driving cars have a higher rate of accidents compared to human-driven cars, but the injuries are less serious. On average, there are 9.1 self-driving car accidents per million miles driven, while the same rate is 4.1 crashes per million miles for regular vehicles.
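Taking the quoted figures at face value, the gap can be made concrete with a little arithmetic. A minimal sketch (the 12,000-mile annual mileage is an assumption for illustration, not from the article):

```python
# Crash rates quoted above, per million miles driven.
self_driving_rate = 9.1
human_rate = 4.1

# Relative rate: how many times more often the self-driving fleet crashes.
ratio = self_driving_rate / human_rate
print(f"Self-driving crash rate is {ratio:.1f}x the human rate")

# Expected crashes over an assumed 12,000-mile driving year.
miles_per_year = 12_000
for label, rate in [("human", human_rate), ("self-driving", self_driving_rate)]:
    expected = rate / 1_000_000 * miles_per_year
    print(f"{label:12s}: {expected:.3f} expected crashes per year")
```

Of course, the quoted figures lump together very different fleets and driving conditions, so the ratio is at best a rough signal.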

 


6 hours ago, mikegarrison said:

I believe I've said this here before, but the most dangerous condition is an unreliably self-driving car. It's good enough to make people stop concentrating on driving, but not good enough to actually take over for them.

I remain unconvinced that a driver willing to trust an autopilot is going to be much safer driving the car himself.

For those wondering if regulators might clamp down on this thing, all they have to do is drive a loop around the Washington DC beltway to realize that humans (especially Marylanders, Virginians, and Washingtonians) should never be put in charge of a multi-ton vehicle at highway speeds with minimal training and oversight.

Insurance companies may come down hard on these things. But I'd expect enough insurance companies with a long enough view to eventually get the humans (or at least the worst of the lot) out from behind the wheel.


I think full self driving is a ways off, but anecdotes like the above crash don't really tell us anything. We literally hear about every such autopilot crash in the country, and don't hear about every non-autopilot crash.

If the autopilot crash rate is lower than the human driver crash rate, autopilot is still ahead. If someone gets killed because of it, obviously people will blame the tech, but I'm not sure how you can if it kills fewer people per mile than humans do; it's just that different people will die. If you google "police car hit during traffic stop" there are loads of videos, so it's not like having the lights on always helps.

The Tesla driver should have taken over, obviously; that's how the training works. The driver should not assume the car has it handled: if they don't like the way the car is heading, they are supposed to take over. I use cruise control often. If I am on I-25 with it set to 80 and there is a car in the right lane going 74, I need to turn off cruise control, change lanes, or both. Autopilot is no different.

I'm not a gung-ho self-driving advocate, I'm just pointing out what needs to be looked at. Driving is actually incredibly hard; it's sort of amazing how few people get killed (it's been on my mind a lot in the last two weeks, since our friend missed a stop sign and was killed).

 


1 minute ago, tater said:

I think full self driving is a ways off, but anecdotes like the above crash don't really tell us anything. We literally hear about every such autopilot crash in the country, and don't hear about every non-autopilot crash.

If the autopilot crash rate is lower than the human driver crash rate, autopilot is still ahead. If someone gets killed because of it, obviously people will blame the tech, but I'm not sure how you can if it kills fewer people per mile than humans do; it's just that different people will die. If you google "police car hit during traffic stop" there are loads of videos, so it's not like having the lights on always helps. The Tesla driver should have taken over, obviously; that's how the training works. The driver should not assume the car has it handled: if they don't like the way the car is heading, they are supposed to take over. I use cruise control often. If I am on I-25 with it set to 80 and there is a car in the right lane going 74, I need to turn off cruise control, change lanes, or both. Autopilot is no different.

I'm not a gung-ho self-driving advocate, I'm just pointing out what needs to be looked at. Driving is actually incredibly hard; it's sort of amazing how few people get killed (it's been on my mind a lot in the last two weeks, since our friend missed a stop sign and was killed).

 

First off, I am so sorry about your friend. I think a large issue with self-driving is that it promotes a lack of attention. When driving on the road, it is the driver's job to watch where they are going or they will die. It's virtually impossible to keep a driver attentive when they don't need to be.


52 minutes ago, wumpus said:

I remain unconvinced that a driver willing to trust an autopilot is going to be much safer driving the car himself.

This fall I’ll have been driving commercially for 20 years; that’s over a million and a half miles (supposedly). I’ve also had well over a year now to get familiar with Tesla Autopilot and all its foibles and shortcomings, where it works and where it doesn’t.

That being said, without hesitation I would rather share the road with a hundred Teslas on Autopilot, as it currently exists, and unmanned, than one idiot on his cell phone. At least I know the Teslas are actively trying NOT to hit me. -_-

8 minutes ago, tater said:

We literally hear about every such autopilot crash in the country, and don't hear about every non-autopilot crash.

Mercedes, Audi, Ford, Kia, Hyundai, and probably every other manufacturer now have cars on the road with TACC and “lane centering” similar to basic Autopilot. Funny how you never hear about those crashes... <_<


Autopilot is a driver assist system, the driver is still in control of the vehicle in the same way that a driver with regular, non-adaptive, cruise control is still in control of their vehicle. AP offloads some of the tedium of managing the speed, distance, and steering adjustments, but it is never in control of the vehicle.

People misusing the system is not an indictment of it. Normal, non-AP, drivers are remarkably dangerous as a baseline.


5 minutes ago, SpaceFace545 said:

First off, I am so sorry about your friend. I think a large issue with self-driving is that it promotes a lack of attention. When driving on the road, it is the driver's job to watch where they are going or they will die. It's virtually impossible to keep a driver attentive when they don't need to be.

I agree that it certainly has that possibility (inattention). I agree with George Hotz of comma AI (openpilot) that one thing Tesla should certainly do is use the internal camera to monitor driver attention. Until full self driving is a thing, eyes on the road or it shuts down.

That said...

5 minutes ago, CatastrophicFailure said:

than one idiot on his cell phone.

This.

Literally anyone who takes their phone out of their pocket/purse and operates it at any level while driving can't talk smack about self-driving. Anyone actually looking at their phone, EVER while driving (not using it as a phone, but texting/browsing/etc)? I'd suggest something but it's a family forum.

A few years ago I had to drive over to a large six-lane road to pick up my wife and take her to the hospital (she had a surgery to perform). She needed a ride because a guy looking at his phone had looked up, slammed on the brakes, skidded over 100 m (I measured the skid marks), and rear-ended my wife's small BMW sedan, which was stopped at the red light. The impact pushed her car into the intersection, and his car ended up on the far side of the intersection in the oncoming lane, maybe 75 m from the point of impact. Happily my wife was fine (both cars were totaled), but what if he had looked up a fraction of a second later?

6 minutes ago, southernplain said:

Normal, non-AP, drivers are remarkably dangerous as a baseline.

And yet also remarkably safe. It really is amazing how few crashes there are, all things considered, especially given the huge percentage of people I see looking at their phones.


4 minutes ago, tater said:

I agree with George Hotz of comma AI (openpilot) that one thing Tesla should certainly do is use the internal camera to monitor driver attention. Until full self driving is a thing, eyes on the road or it shuts down.

Tesla already has the ability to use the internal camera on Model 3/Y for driver monitoring: https://twitter.com/greentheonly/status/1379928419136339969

It can be fooled of course, but it exists. Implementation may come at some point only Tesla knows.

5 minutes ago, tater said:

And yet also remarkably safe. It really is amazing how few crashes there are, all things considered, especially given the huge percentage of people I see looking at their phones.

Yes, overall safety has improved dramatically in the past ~50 years. Yet, relative to the number of miles traveled, driving is still quite dangerous compared to airlines, rail, or buses.
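The modal comparison can be sketched the same way. The rates below are order-of-magnitude placeholder numbers chosen only to illustrate the shape of the comparison, not sourced statistics:

```python
# Placeholder fatality rates, in deaths per billion passenger-miles.
# These are illustrative round numbers, NOT sourced statistics.
assumed_rates = {
    "car": 7.0,
    "rail": 0.4,
    "bus": 0.1,
    "airline": 0.07,
}

car_rate = assumed_rates["car"]
for mode, rate in assumed_rates.items():
    note = "" if mode == "car" else f" (~{car_rate / rate:.1f}x safer than driving)"
    print(f"{mode:8s}: {rate:5.2f} deaths per billion passenger-miles{note}")
```

The point is only that the safety gap between driving and the other modes is measured in multiples of ten, which is the headroom ADAS and autonomy would have to close.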

Active safety features, ADAS, and eventually fully autonomous driving have the potential to improve safety to well in excess of what humans are capable of. 


29 minutes ago, southernplain said:

People misusing the system is not an indictment of it. Normal, non-AP, drivers are remarkably dangerous as a baseline.

Yep.

55 minutes ago, tater said:

If the autopilot crash rate is lower than the human driver crash rate, autopilot is still ahead.

11 minutes ago, southernplain said:

Active safety features, ADAS, and eventually fully autonomous driving have the potential to improve safety to well in excess of what humans are capable of. 

What already works, works very well though.

 

Edited by YNM


As someone would say, “unimpressive, Musk is just copying what was already done before him! Also, it isn’t economically viable!”


Spoiler

Just kidding, I’m sure the Tesla bot is going to be smarter than Rogozin.

 


Interesting facts.

1. The Teslobot can drive a normal human car, making the Teslacars you all have bought useless junk for hipsters.

2. The Teslobot needs no car to be a rickshaw. Just kick its lazy (rear part) and make it run faster.
You see? You don't need a Teslacar at all; they had you.

3. The Teslobot allows you to feel like a sir and have your own anthropomorphic slaves instead of those soulless boxes with wires.

4. You can kick their lazy (rear parts), too, and they never rebel.

4a. A downside. They aren't able to feel humiliated and thus your power won't be fully appreciated.

5. You don't even have to feed them; you can electrocute them instead, and they will be happy.
Actually, that's exactly what they need; just keep the voltage under 12 V.

6. Caution. Do not allow your Teslabots to get into conversation with Fedorobots.
They will learn bad things, and think (and they will be right) they are new proletarians, and that you are robbing and exploiting them.
Or at least they will learn how to creatively screw your orders and use your property properly when you are not watching.
Remember: as it's said, "a proletarian has nothing to lose except his chains", and in the case of Teslobots the chains are electric.

7. The Teslobot's carrying capacity is 45 lb (20 kg), which is much less than most humans can carry themselves.
So, if you take your electroslave to a shop and buy a 60 kg thing, it will say "Screw off, you human. I can lift only 20," so you will carry the cargo mostly yourself.

7a. If you weigh 100 kg, you need to own 6 Teslobots to carry your palanquin. Do you still want it?

8. They didn't show the temperature range. It's very possible that you will be working under the sun and clearing the snow yourself, while your Teslobot looks at you through the window of an air-conditioned room.


1 hour ago, sh1pman said:

Just kidding, I’m sure the Tesla bot is going to be smarter than Rogozin.

Well, for one thing, I don't think it's going to have Twitter.

28 minutes ago, kerbiloid said:

6. Caution. Do not allow your Teslabots to get into conversation with Fedorobots.
They will learn bad things, and think (and they will be right) they are new proletarians, and that you are robbing and exploiting them.
Or at least they will learn how to creatively screw your orders and use your property properly when you are not watching.
Remember: as it's said, "a proletarian has nothing to lose except his chains", and in the case of Teslobots the chains are electric.

But will it always accomplish its mission?


