What Shocked You Today


Comments above highlight one thing about self-driving that humans don’t like, and that’s the issue of blame.

If something goes wrong, who can we blame for it, and more importantly, who gets punished?

People aren’t happy with a computer in charge because you can’t punish a computer.

A driver in the trolley-problem case could be tried in court to see if what they did warranted punishment, whereas with a computer driver you can analyse all the data, but families don’t get their “justice”: you can’t send a computer to jail or fine it, as it doesn’t understand punishment.

What people really don’t like is when other people can’t be held accountable. It could be 10 to 100 times safer to have self-driving cars, but people would still feel happier with a person in charge, even though people make the stupidest mistakes.
 
For me, the biggest problem with 'self-driving' is that those who look forward to it most are those with the least driving ability, and therefore the least capable of taking over when the car can no longer cope. Sticky situations will become worse.
If that was the case then they should never have passed their test.

As an aid, the self-driving tools right now are highly beneficial. I've only used Nissan's; when it's active, it'll stop you getting too close to the car in front, prevent you wandering across lanes, stop you moving over with things in your blind spots, and it does self-steer if you let it. It's not true self-driving, but you can see all the bits are getting there, and if all cars had these features there would be fewer accidents.
 
Comments above highlight one thing about self-driving that humans don’t like, and that’s the issue of blame.

If something goes wrong, who can we blame for it, and more importantly, who gets punished?

People aren’t happy with a computer in charge because you can’t punish a computer.

A driver in the trolley-problem case could be tried in court to see if what they did warranted punishment, whereas with a computer driver you can analyse all the data, but families don’t get their “justice”: you can’t send a computer to jail or fine it, as it doesn’t understand punishment.

What people really don’t like is when other people can’t be held accountable. It could be 10 to 100 times safer to have self-driving cars, but people would still feel happier with a person in charge, even though people make the stupidest mistakes.
Our politicians and heads of corporations are never held accountable, so...
 
A problem now is that if a Tesla cannot understand a situation, it seems to ignore it and continue, whereas a safer option might be to slow to a stop instead while screaming at the 'driver' to take over. But stopping on a fast road because it is confused will create more problems.

At least the horse will often try to discourage heading into danger.

Well, this is where human performance factors come in.

Even if 98% of the time it drives to an acceptable standard, people are extremely poor at monitoring complex systems for long periods. They tend to get complacent, bored or distracted. So when the system says "ahem, take over", they've just looked up from their phone, or book, or whatever they were doing, and have maybe a second to make a decision that will kill or save them, with incomplete information.

So would the gentleman in the video have got into a non-self-driving car and had the same accident? Probably not: he would either have had to be awake and concentrating enough that he would have seen the lights with plenty of warning, or he would have crashed long before he got to that point.
 
Remind me who programs the computer?

😉
It’s not quite as straightforward as a person sitting there programming the computer on what to do.

Tesla do something similar to that: when the car crashes and kills someone, they do an update to patch the problem. As Tesla rely on cameras and have no way to detect solid objects, this is a pretty crappy way of doing it.

Other self-driving technologies that use LiDAR and radar generally program themselves as they go, hence all these self-driving companies spending years in development driving cars about and not seemingly progressing.

What computers don’t do is panic and press the wrong pedal, get distracted by kids arguing in the back, get into road-rage incidents, brake-check people, or have any concept of breaking the law because the driver is late.

Most accidents that happen on the road are the result of someone deliberately doing something that results in an accident: speeding, not paying attention, overtaking when it's not safe, and so on.

All of the accidents caused by the driver doing something outside their own abilities, or the car's abilities, would be avoided. Theoretically, a self-driving car should never have an accident where no other cars, people or animals are involved, such as teenagers driving into trees and ditches in the middle of the night.
 
Remind me who programs the computer?

😉
It would be a team of people, and it would then have verification and external approval. But it sounds like no amount of systems, common sense and backups will convince you.
 
It would be a team of people, and it would then have verification and external approval. But it sounds like no amount of systems, common sense and backups will convince you.
Yeah, the issue of Teslas crashing into emergency vehicles has been patched multiple times.

Would you trust the next patch, or would you wait until the one after?

Or perhaps doing the same thing multiple times and expecting different outcomes is the definition of something?

I accept driver aids can eliminate minor accidents and an element of human error... but human stupidity combined with them is going to cause some very major ones.
 
Yeah, the issue of Teslas crashing into emergency vehicles has been patched multiple times.
Because the Tesla system is terrible and has no way to detect solid objects: it is reliant on cameras, and on the inherent limitations of those cameras, such as, in this case, a bright light obscuring something behind it.

In truth the case above is a bit unusual, as the driver was drunk and managed 45 minutes of driving without an accident despite being hammered. Otherwise he would probably have drifted off into a tree in the first few minutes without Autopilot.

There are six levels of autonomous driving, usually numbered 0 to 5:

0. No automation at all.
1. Basic driver assistance: cruise control and lane assist.
2. Partial automation, but requiring constant supervision (this is where Tesla sits).
3. Conditional automation: requires intervention in severe conditions.
4. High automation: rarely requires any input from a driver.
5. Fully autonomous: requires no intervention at all.

Because it uses only cameras and analyses what the cameras see, Tesla will never manage to get above Level 2.

The more you program a computer, the more data you create, and the more reliant the computer becomes on being able to access that data quickly enough to make decisions. If you had a file with millions of possible situations that the computer had to search through to know what to do, the limiting factor can become the speed at which the computer can look up the answer.

This could take milliseconds, but at 70 mph you need a car that knows what to do and doesn't waste time looking it up.

With radar and LiDAR, when self-driving cars see a solid object they know to stop. They don't have to check a database to see whether what they are seeing may or may not be solid.
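To put rough numbers on that (a back-of-envelope sketch in Python; every name here is invented for illustration and this is nobody's real control code): at 70 mph a car covers about 31 metres per second, so each millisecond spent searching a catalogue of situations costs roughly 3 cm of road, while a direct reactive rule runs in constant time.

```python
# Back-of-envelope sketch only; not any manufacturer's real code.
MPH_TO_MS = 0.44704                  # miles per hour -> metres per second
speed = 70 * MPH_TO_MS               # ~31.3 m/s
print(f"Travel per millisecond at 70 mph: {speed / 1000:.3f} m")   # ~0.031 m

# Approach 1: search a big catalogue of pre-programmed situations.
# Cost grows with the size of the catalogue (a linear scan, for clarity).
def lookup_action(situation, catalogue):
    for known, action in catalogue:
        if known == situation:
            return action
    return "hand back to driver"     # unknown situation

# Approach 2: a direct reactive rule driven by a range sensor.
# Constant time: brake if the object is inside our stopping distance.
def reactive_action(obstacle_distance_m, speed_ms, max_decel=6.0):
    stopping_distance = speed_ms ** 2 / (2 * max_decel)    # v^2 / 2a
    return "brake" if obstacle_distance_m <= stopping_distance else "continue"

print(reactive_action(70.0, speed))  # 'brake': 70 m is inside the ~82 m stopping distance
```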
 
All entirely taken on board...

Yet this has been signed off as fixed more than once, despite it being impossible, with the tech involved, to guarantee it.

But I am the one who is a Luddite for not trusting manufacturers to fix things, because rather than trusting the evidence that says they can't or don't, I need to believe on faith that they will.

So to recap: driver aids, lovely, but at best a wind-up toy. Full self-drive, never.
 
If that was the case then they should never have passed their test.
The driving test is a test of ability over a short period, where the driver is concentrating more than they are ever likely to subsequently. They have to reach a minimum standard, which is mostly a fault-based assessment. This is the minimum standard to be allowed to pilot a vehicle unsupervised, but sadly some may feel it is the pinnacle of achievement, and gently deteriorate from there.
Remind me who programs the computer?

😉
Those special people who don't get out of their basements much. Daylight is terrifying to them. :)
 
I'd just be interested in how, for example, they'd deal with this...
Screenshot_20230819-185018.png

If the woman steps off the kerb at the zebra crossing, in theory the camera/LiDAR will pick her up. But the car should give way regardless of whether she has stepped off the kerb.

She was running down the pavement, then on arrival at the crossing she addressed the road... as you can see, the Nissan is unaware of what a zebra crossing is, but we don't expect Nissan drivers to keep on top of that sort of thing.

However, for self-driving to be an improvement, it shouldn't be driving like the Nissan.

So it will need to correctly identify that it's a zebra crossing and know the importance of that (and also what the local custom is, as many countries use zebra markings for crossings but the rules are not the same). Second, it needs to spot any pedestrians at the crossing and judge whether they are actually waiting to cross rather than, say, walking past it. Having done this, it then needs to take the correct course of action. What if the person was on a bike, riding on the pavement up to a pedestrian crossing: is it going to treat them as a vehicle and go round, or as a pedestrian and stop?

There's a lot of logic to be built behind that... or even better, behind the new Highway Code rule that says you should give way to waiting pedestrians at junctions. I'd love to see how it would deal with two people chatting on a street corner... 20 minutes later, a five-mile tailback... they wander off and traffic flows again.
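Just to spell out how much is hiding behind "a lot of logic", here is a toy sketch of that decision chain in Python; every name and field is invented for illustration, and no real driving stack is remotely this simple.

```python
# Illustrative only: a toy version of the zebra-crossing decision chain.
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str                # "pedestrian", "cyclist", ...
    at_crossing_edge: bool   # standing at the kerb by the crossing
    facing_road: bool        # oriented/moving towards the carriageway

def action_at_zebra(is_zebra: bool, locals_give_way: bool,
                    detections: list) -> str:
    # Step 1: is this a zebra crossing, and do local rules say give way here?
    if not (is_zebra and locals_give_way):
        return "continue"
    # Step 2: is anyone plausibly waiting to cross, rather than walking
    # past or chatting on the corner? Reading intent is the hard part.
    for d in detections:
        if d.at_crossing_edge and d.facing_road:
            # Step 3: the cyclist-on-the-pavement case is ambiguous;
            # this toy takes the cautious branch and stops either way.
            return "slow and stop"
    return "continue"
```

Even this toy ducks the hard part: deciding `facing_road` from camera frames is exactly the body-language problem raised further down the thread.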
 
However, for self-driving to be an improvement, it shouldn't be driving like the Nissan.
They are not currently programmed to identify zebra crossings, so what's your point? The current Nissan system only keeps the car in lane and at a safe distance from other cars.

you should give way to waiting pedestrians at junctions. I'd love to see how it would deal with two people chatting on a street corner... 20 minutes later, a five-mile tailback... they wander off and traffic flows again.
The logic is perfectly simple, unless you also stop to watch women chat at the side of the road. It should slow and stop if people are waiting at the crossing, same as drivers should do. I've seen plenty of cars drive on as people cross at zebra crossings; which is worse, in your view?
 
The logic is perfectly simple, unless you also stop to watch women chat at the side of the road. It should slow and stop if people are waiting at the crossing, same as drivers should do. I've seen plenty of cars drive on as people cross at zebra crossings; which is worse, in your view?

And if they continue chatting at the junction or crossing for some time, would they perhaps write an algorithm to beep the horn?

The point is more that they can't teach it to read body language. Or they could, but my god, that's a big program.

Everything self-drive currently does could be replicated by a five-year-old: press a pedal, follow a line, monitor the distance to nearby vehicles and either avoid or warn if needed.

They've been at that level for years... the next step is a doozy.

Also, I was under the impression self-driving was meant to be an improvement, so it should follow the Highway Code? Otherwise what's the point? "I can program a car to drive erratically"... "Good for you, we absolutely don't have enough of those already."

Another one from today
Screenshot_20230820-203725.png


Bridge would block all sensor readings behind it... I've stopped because I spotted him coming down the hill through the passenger window 30 seconds ago. How's self-drive gonna spot him?
 
Bridge would block all sensor readings behind it... I've stopped because I spotted him coming down the hill through the passenger window 30 seconds ago. How's self-drive gonna spot him?
Same as current systems probably do: warn that they can't make out the road clearly enough, and disengage.
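That "warn and disengage" behaviour is easy enough to sketch, assuming the perception stack reports some confidence score; the interface below is made up purely for illustration.

```python
# Toy sketch of warn-and-disengage; the confidence score is an invented
# stand-in for whatever a real perception stack actually reports.
def supervise(road_clear_confidence: float, threshold: float = 0.9) -> str:
    if road_clear_confidence < threshold:
        # Can't verify the road ahead (e.g. a sensor shadow behind the
        # bridge): alert the driver and hand back control, don't guess.
        return "alert and disengage"
    return "stay engaged"

print(supervise(0.4))   # 'alert and disengage'
```

The catch, as the next post points out, is knowing that the confidence should be low in the first place.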
 
How does it know it's a bridge? Or do you just crash at the top?

It would have no idea there's a sensor shadow, so what can it warn you about? To all intents and purposes this road is clear, conditions are clear, weather is clear and there are no apparent obstructions.
 
How does it know it's a bridge? Or do you just crash at the top?

It would have no idea there's a sensor shadow, so what can it warn you about? To all intents and purposes this road is clear, conditions are clear, weather is clear and there are no apparent obstructions.
Again, it's down to driving the cars over hundreds of thousands or millions of miles, by the time you've added all the cars together. Firstly, the car would already know it's a narrow road. Secondly, in a world where all cars are self-driving, they can communicate wirelessly and negotiate who will go first. With other cars on the road, the self-driving car would proceed with caution at a slow speed. It's worth noting that where LiDAR is fitted, it's usually on the roof, so it would have a greater view of the surroundings than the driver, sitting that much higher up.

One thing Teslas have been good at is avoiding accidents by rapidly slowing for traffic several cars ahead, where the driver may have been fixated on the car in front. It's reasonable to suggest that if you can see the car coming down the hill, so could any self-driving car.
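For what "negotiate who will go first" might look like, here is a toy sketch in Python assuming a shared short-range broadcast channel; the message format is invented purely for illustration and no real vehicle-to-vehicle standard is implied.

```python
# Toy negotiation over a single-lane bridge; everything here is invented
# for illustration, not a real vehicle-to-vehicle protocol.
import json

def make_claim(car_id: str, distance_to_bridge_m: float) -> bytes:
    """Broadcast packet: who I am and how far I am from the bridge."""
    return json.dumps({"car": car_id, "dist": distance_to_bridge_m}).encode()

def decide(my_claim: dict, their_claim: dict) -> str:
    # A deterministic rule both cars can evaluate independently: the car
    # closer to the bridge proceeds, with ties broken by car ID, so both
    # sides reach the same answer without any central server.
    mine = (my_claim["dist"], my_claim["car"])
    theirs = (their_claim["dist"], their_claim["car"])
    return "proceed" if mine < theirs else "wait"

a = {"car": "A", "dist": 12.0}
b = json.loads(make_claim("B", 40.0))
print(decide(a, b), decide(b, a))   # proceed wait
```

The obvious catch, raised in the reply below, is that this assumes both cars are self-driving and share a protocol in the first place.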
 
Again, it's down to driving the cars over hundreds of thousands or millions of miles, by the time you've added all the cars together. Firstly, the car would already know it's a narrow road. Secondly, in a world where all cars are self-driving, they can communicate wirelessly and negotiate who will go first. With other cars on the road, the self-driving car would proceed with caution at a slow speed. It's worth noting that where LiDAR is fitted, it's usually on the roof, so it would have a greater view of the surroundings than the driver, sitting that much higher up.

One thing Teslas have been good at is avoiding accidents by rapidly slowing for traffic several cars ahead, where the driver may have been fixated on the car in front. It's reasonable to suggest that if you can see the car coming down the hill, so could any self-driving car.
Well, there's no phone reception in this lovely picturesque area, so... how are they gonna communicate? They all gonna have an RF band perhaps? Shortwave? A man with a flag? There was no direct line of sight between vehicles at a time when it might register the other vehicle as a danger. If I'd taken the still a second earlier, the other car would be invisible. Also, you're assuming both cars are self-driving and that they share a standard communication protocol.

Finally, do Teslas look sideways and up for cars ahead? I seem to remember they can't detect cross traffic going across their path in certain circumstances... never mind looking at something like this out of the passenger window, noting there's a car up the hill, and acting on it later.

Screenshot_20230820-222934.png
 
Well, there's no phone reception in this lovely picturesque area, so... how are they gonna communicate? They all gonna have an RF band perhaps? Shortwave? A man with a flag?
Facetiousness aside, they will need to communicate with the cars immediately around them, so yes, some sort of shortwave (albeit digital) radio communication would be needed. Cellular phone networks would be unreliable; connecting all cars to all other cars would be more like broadband and would take some serious computing power to make sure every car got the information it needs.
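The scaling point is real: connecting every car to every other car grows with the square of the number of cars, while talking only to immediate neighbours stays roughly constant per car. A rough illustration (the neighbour count is an arbitrary assumption):

```python
# Rough scaling comparison: all-to-all links vs local-neighbour links.
def all_to_all_links(n_cars: int) -> int:
    return n_cars * (n_cars - 1) // 2      # every pair of cars connected

def local_links(n_cars: int, neighbours: int = 6) -> int:
    return n_cars * neighbours // 2        # each car talks to ~6 nearby cars

for n in (1_000, 1_000_000):
    print(f"{n:>9} cars: all-to-all {all_to_all_links(n):>15,}, local {local_links(n):>9,}")
```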

There was no direct line of sight between vehicles at a time when it might register the other vehicle as a danger. If I'd taken the still a second earlier, the other car would be invisible. Also, you're assuming both cars are self-driving and that they share a standard communication protocol.

Finally, do Teslas look sideways and up for cars ahead? I seem to remember they can't detect cross traffic going across their path in certain circumstances... never mind looking at something like this out of the passenger window, noting there's a car up the hill, and acting on it later.

One situation does not prove any sort of rule. How would you have dealt with this if you'd not seen the car coming down the hill? You wouldn't blunder onto the bridge without checking there was nothing coming; you and any self-driving car would proceed with caution.

LiDAR sensors on the roof of a car would be able to see further over the bridge than someone sitting in the car at a much lower level.

Teslas have an all-round view via cameras, but I have no idea whether they would process information like this from the other side of a bridge. The car would have to know in advance that the bridge was not wide enough for cars coming both ways, and as already stated, a Tesla is a long way off being able to self-drive in a huge variety of situations.

Systems would need to learn the junction, learn how to proceed and how to check for danger, the same as anyone else does when they learn to drive. Can they do it now? Probably not. Could they do it in the future? Very likely they will.
 