
Idiot of the year contender


King Kevin

10 minutes ago, King Kevin said:

https://news.sky.com/story/autopilot-driver-who-sat-in-passenger-seat-while-tesla-travelled-on-m1-banned-11350736

Sometimes the public do need protecting from themselves. Should have banned him for life.

Sounds like he performed a successful road test to me. The car didn't crash, did it?

The Tesla engineer will always toe the line and say that the features are only there to assist a fully attentive driver, who can best react to the road ahead. But that's ********; a Tesla will never switch off, so it will always be more attentive than any driver and will always be able to react quicker to anything on the road ahead.

The fact that the driver didn't crash shows that the car did its job.

Teslas are fully hardware-ready to self-drive. The software has only been purposefully limited because legislation hasn't caught up.

But how many Teslas have crashed? The answer is, not many.

And how many Tesla drivers have tried this stunt? Probably a lot. And if they haven't actually sat in the passenger seat, they'll certainly have sat in the driver's seat and completely switched off and left the driving to the car. It's impossible to stay completely attentive when you're not actually doing anything.

Point is, Teslas probably have more opportunities than average to crash, but they actually crash less than average.

The law needs to let them off the leash a little bit and let them light the way. Roads would be a lot safer places. 


Sith Happens
5 minutes ago, eddie said:

Do I get a nomination?

When you have won it 3 years on the trot you are no longer allowed to enter.


If I am really lucky I've got twenty years left in this realm; can't see 'em forcing these monstrosities on us all within that period.

As for you young ones, you may as well adjust to not having control of much in the future, so marry early.


By the way, I’m not denying that he did a very stupid thing. He broke the law, and was very obvious about it. Like playing with your phone in full view of everyone. 

All I'm saying is, the first reaction would be that he did a stupid thing by sitting in the passenger side. But apart from being an obvious way of getting a driving ban, doing that in a Tesla is actually far safer than using your phone, speeding or drunk driving in any other car. For at least two of those, you don't even get a ban.

It just seems outlandish, because not sitting behind the wheel isn't really dangerous, and yet it's what earns the ban. If I saw a Tesla driving by itself on the motorway, cruising at 40, with no occupants, I'd feel a lot safer than if I saw a banger full of 17-year-olds driving erratically.

But it will take a long time for legislation to catch up with common sense.


2 minutes ago, TigerTedd said:

By the way, I’m not denying that he did a very stupid thing. He broke the law, and was very obvious about it. Like playing with your phone in full view of everyone. 

All I'm saying is, the first reaction would be that he did a stupid thing by sitting in the passenger side. But apart from being an obvious way of getting a driving ban, doing that in a Tesla is actually far safer than using your phone, speeding or drunk driving in any other car. For at least two of those, you don't even get a ban.

It just seems outlandish, because not sitting behind the wheel isn't really dangerous, and yet it's what earns the ban. If I saw a Tesla driving by itself on the motorway, cruising at 40, with no occupants, I'd feel a lot safer than if I saw a banger full of 17-year-olds driving erratically.

But it will take a long time for legislation to catch up with common sense.

Autopilot is like cruise control - you can't just switch off and go to sleep. It is a driver assistant, not a driver replacement - not yet anyway. There have been a couple of fatalities due to Tesla's Autopilot so far, so not sitting behind the wheel is, at the moment, dangerous. In the next 5 years or so I expect self-driving cars to be pushing to be the majority of cars on the road, but they aren't perfect - not yet at least.


26 minutes ago, GenBr said:

Autopilot is like cruise control - you can't just switch off and go to sleep. It is a driver assistant, not a driver replacement - not yet anyway. There have been a couple of fatalities due to Tesla's Autopilot so far, so not sitting behind the wheel is, at the moment, dangerous. In the next 5 years or so I expect self-driving cars to be pushing to be the majority of cars on the road, but they aren't perfect - not yet at least.

They're not perfect, but the hardware is actually there. The software has been purposefully restrained. A Tesla could quite happily drive itself. And it's cleverer than cruise control: it'll slow down, speed up, steer round obstacles, overtake.

There've been a couple of fatalities. Of course these make the news, cos it's an (almost) autonomous car, so it's newsworthy.

But in the same period of time I bet there have been dozens of fatalities in Vauxhall Corsas and the like.

You could argue there are more Corsas on the road, of course, but I'd love to see the fatality rate per car on the road for Teslas compared with other brands.

Especially when you take into account that this isn't the only guy who will have tried to illegally test it to its limits. I'm not surprised that there have been a couple of fatalities. I'm more surprised that there haven't been loads.
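The comparison being asked for here is just a per-vehicle rate calculation. A minimal sketch in Python, using entirely invented fatality and fleet figures rather than real Tesla or Corsa statistics:

```python
# Illustrative sketch only: the fatality and fleet figures below are invented,
# not real statistics for any brand.

def fatalities_per_100k_vehicles(fatalities: int, vehicles_on_road: int) -> float:
    """Fatal incidents per 100,000 vehicles on the road."""
    return 100_000 * fatalities / vehicles_on_road

# Hypothetical numbers, purely to show the shape of the comparison
tesla_rate = fatalities_per_100k_vehicles(fatalities=2, vehicles_on_road=30_000)
corsa_rate = fatalities_per_100k_vehicles(fatalities=60, vehicles_on_road=700_000)

print(f"Hypothetical Tesla rate: {tesla_rate:.1f} per 100,000 vehicles")
print(f"Hypothetical Corsa rate: {corsa_rate:.1f} per 100,000 vehicles")
```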


37 minutes ago, TigerTedd said:

They're not perfect, but the hardware is actually there. The software has been purposefully restrained. A Tesla could quite happily drive itself. And it's cleverer than cruise control: it'll slow down, speed up, steer round obstacles, overtake.

There've been a couple of fatalities. Of course these make the news, cos it's an (almost) autonomous car, so it's newsworthy.

But in the same period of time I bet there have been dozens of fatalities in Vauxhall Corsas and the like.

You could argue there are more Corsas on the road, of course, but I'd love to see the fatality rate per car on the road for Teslas compared with other brands.

Especially when you take into account that this isn't the only guy who will have tried to illegally test it to its limits. I'm not surprised that there have been a couple of fatalities. I'm more surprised that there haven't been loads.

Just to be clear, I'm not arguing against self-driving cars here - the safety benefits and almost total elimination of traffic queues are too great to ignore. However, the Tesla certainly isn't fully there at the moment. The recent fatalities have nothing to do with the software being "restrained" - they were due to issues with the base code of the software.

Cruise control can speed up and slow down and keep its distance from the car in front as well. The recent fatalities happened because the Tesla software drove the cars straight into an obstacle. There are millions of lines of code in a Tesla's software and, quite frankly, they are the last company I would trust from a QA perspective if their recent production issues are anything to go by.

"Almost self-driving" cars (i.e. Teslas) make up something like 0.0001% of cars on UK roads. Road fatalities aren't exactly high as it is, considering the number of cars on the road, but they are going to need to start rolling out more cars from all companies before anyone can start categorically saying these cars are currently safer than fully human-controlled cars. Give it 5 years and I very much doubt there will be many, if any, cars sold that still have a human control aspect.

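The "too few cars on the road to judge yet" point can be made concrete: with only a couple of recorded fatal incidents, the uncertainty around any estimated rate is enormous. A rough sketch, assuming invented figures (2 fatalities over 100 million fleet miles, not real Tesla numbers) and using the exact Garwood interval for a Poisson count via scipy:

```python
# A minimal sketch (invented numbers, not real Tesla data) of how wide the
# uncertainty is when only a couple of fatal incidents have been observed.
# Uses the exact (Garwood) Poisson confidence interval; requires scipy.
from scipy.stats import chi2

def poisson_ci(observed: int, alpha: float = 0.05) -> tuple[float, float]:
    """Exact two-sided confidence interval for a Poisson count."""
    lower = 0.0 if observed == 0 else 0.5 * chi2.ppf(alpha / 2, 2 * observed)
    upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * observed + 2)
    return lower, upper

# Hypothetical: 2 fatalities over 100 million miles driven by the whole fleet.
fatalities = 2
miles = 100e6
lo, hi = poisson_ci(fatalities)

print(f"Point estimate: {fatalities / miles * 1e8:.1f} fatalities per 100M miles")
print(f"95% interval:   {lo / miles * 1e8:.2f} to {hi / miles * 1e8:.2f} per 100M miles")
```

With these made-up figures the interval spans roughly a thirty-fold range, which is the sense in which a small fleet cannot yet support categorical safety claims either way.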

4 hours ago, TigerTedd said:

Sounds like he performed a successful road test to me. The car didn't crash, did it?

The Tesla engineer will always toe the line and say that the features are only there to assist a fully attentive driver, who can best react to the road ahead. But that's ********; a Tesla will never switch off, so it will always be more attentive than any driver and will always be able to react quicker to anything on the road ahead.

The fact that the driver didn't crash shows that the car did its job.

Teslas are fully hardware-ready to self-drive. The software has only been purposefully limited because legislation hasn't caught up.

But how many Teslas have crashed? The answer is, not many.

And how many Tesla drivers have tried this stunt? Probably a lot. And if they haven't actually sat in the passenger seat, they'll certainly have sat in the driver's seat and completely switched off and left the driving to the car. It's impossible to stay completely attentive when you're not actually doing anything.

Point is, Teslas probably have more opportunities than average to crash, but they actually crash less than average.

The law needs to let them off the leash a little bit and let them light the way. Roads would be a lot safer places.

Sorry TigerTedd, time to step in; this is my field of expertise. Reliable and competent software is miles off, and I mean miles off. For starters, until ALL cars on the road have the same type of lights, and all motorway signs and traffic lights in the world are changed to a certain type of LED, the sensors cannot pick them up. Sensors at the moment cannot detect brake lights on cars over a certain age, simply because of the type of bulb. That sounds ludicrous, I know, but it's true. Similarly, on country lanes and the like, where it's pitch black, the adaptive sensors cannot detect cyclists, hedges, sheep, people, walls, etc.

I spent a few days testing an autonomous car with the latest technology the other week, and this was at the stage where it required numerous lines and spots painted on the road, and with no adaptive surroundings. In certain areas of the States, where every car is <2 years old and top of the range, with linked sensors, they can programme cars to deliver a pizza, but it's all staged.


1 minute ago, Moist One said:

Sorry TigerTedd, time to step in; this is my field of expertise. Reliable and competent software is miles off, and I mean miles off. For starters, until ALL cars on the road have the same type of lights, and all motorway signs and traffic lights in the world are changed to a certain type of LED, the sensors cannot pick them up. Sensors at the moment cannot detect brake lights on cars over a certain age, simply because of the type of bulb. That sounds ludicrous, I know, but it's true. Similarly, on country lanes and the like, where it's pitch black, the adaptive sensors cannot detect cyclists, hedges, sheep, people, walls, etc.

I spent a few days testing an autonomous car with the latest technology the other week, and this was at the stage where it required numerous lines and spots painted on the road, and with no adaptive surroundings. In certain areas of the States, where every car is <2 years old and top of the range, with linked sensors, they can programme cars to deliver a pizza, but it's all staged.

Fair enough. I’m no expert, just an advocate. 

But surely if we can see a cyclist or whatever, the technology must exist to enable a super HD camera to see the same cyclist, especially with night-vision capabilities that a human doesn't have. A self-driving car should be able to drive with its lights off quite happily.

My merely human, myopic eyes can't possibly be a match for cutting-edge camera and sensor technology.

My theory being, if a human can see a cyclist, or a traffic signal, then a machine can see it twice as easily, and react twice as quickly. 

And if self-driving cars can't do that yet, then they need to buck their ideas up, cos I can't see any reason why it shouldn't be possible.


34 minutes ago, TigerTedd said:

Fair enough. I’m no expert, just an advocate. 

But surely if we can see a cyclist or whatever, the technology must exist to enable a super HD camera to see the same cyclist, especially with night-vision capabilities that a human doesn't have. A self-driving car should be able to drive with its lights off quite happily.

My merely human, myopic eyes can't possibly be a match for cutting-edge camera and sensor technology.

My theory being, if a human can see a cyclist, or a traffic signal, then a machine can see it twice as easily, and react twice as quickly.

And if self-driving cars can't do that yet, then they need to buck their ideas up, cos I can't see any reason why it shouldn't be possible.

The problem you have is, you would need <insert HUGE number> sensors and <insert huge number> processors to match what the human eye and brain can see and interpret. You would probably also need a few miles of fibre optic cable, a variety of different voltage cables, and probably more code than @eddie has seen in his entire career, just to complete and read the circuit of signals required to move the car.

If you look at Tesla, and read all their bad news, they cannot launch cars on time simply because the software isn't ready.


Back to the original story too, @TigerTedd: I would suggest that the reason the car didn't crash is more attributable to nothing bizarre happening. At 40mph, things are very different to 70mph, with Mr White Van Man or an HGV pulling in front of you.

The argument would have been made, albeit not reported, that the driver was very lucky he didn't have to take evasive action.


Archived

This topic is now archived and is closed to further replies.
