What Shocked You Today


Teslas keep crashing into emergency vehicles.

Warning: contains scenes of a Tesla crash in which several people were injured.

 
I think the car saying for the 150th time (!!) 'put your hands on the steering wheel' sums it up: the driver wasn't paying any attention to the car.
 
I think the car saying for the 150th time (!!) 'put your hands on the steering wheel' sums it up: the driver wasn't paying any attention to the car.
The guy was heavily intoxicated; I think that's why he would respond to the prompts but still not pay attention.
 
Tesla's radar confused by the flashing lights?
The camera was confused; the high-intensity flashing caused blurring on the camera.
Radar and optical are not even vaguely related.
The Tesla radar doesn't report stationary objects.
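
To unpack that last point a little: conventional cruise-control radars measure closing speed via Doppler, and returns whose speed over the ground is roughly zero get discarded as clutter, otherwise the car would brake for every bridge, sign and parked car. Unfortunately a fire engine stopped across the lane looks exactly like that clutter. A toy sketch of the idea in Python (made-up threshold, not Tesla's actual logic):

def ground_speed(ego_speed_mps, closing_speed_mps):
    # A return closing at exactly our own speed is standing still on the road.
    return ego_speed_mps - closing_speed_mps

def keep_target(ego_speed_mps, closing_speed_mps, clutter_threshold_mps=1.0):
    # Discard "stationary" returns: bridges, signs, parked cars...
    # and, unfortunately, a fire engine stopped across the lane.
    return abs(ground_speed(ego_speed_mps, closing_speed_mps)) > clutter_threshold_mps

print(keep_target(30.0, 30.0))  # False: stopped vehicle filtered out as clutter
print(keep_target(30.0, 10.0))  # True: car ahead doing 20 m/s is tracked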

It does seem a bit odd, as many cars now have anti-collision braking.
We have the Nissan ProPilot (or whatever it's called) semi-self-driving; I wouldn't trust my life to it!!
 
This kinda thing is why I never expect fully self-driving cars to ever work.

You could fix this, I'm sure, but it's just one of a million situations that a human brain would recognise immediately but that might confuse a program.

Take passing a horse... most of the time they are under reasonable control and you could treat it like a slow, easily startled cyclist.

But they don't all behave the same. It might be that the one you're passing has already been spooked because it's seen a plastic bag it doesn't like the look of. On your approach from the rear you slow down, but the rear legs are no longer following the front as it's trying to spin 90 degrees to the traffic flow to identify the noise/threat, while the rider is attempting to get it to go straight.

As a human you look at that and probably think, "I'll give them a minute to get sorted and hang back". I doubt the car would do anything except "obstruction ahead, must avoid".

Obviously a niche example... but the world is full of similar niche examples. Self-driving in an ancient Italian city centre?
 
My thinking had always been that self-driving cars would just have different accidents to humans. They won't fall asleep or check their phone but they will make mistakes that no human ever would. Whether we, as a society, get happy with that is an interesting question. I was discussing this with a mate last week and a couple of things came out.
  1. At some level the programmers of the system will have to create the ability for the AI to answer ethical questions like the trolley problem, and that will be difficult
  2. Black box logging is easy in terms of recording the data surrounding a crash, but how do we record the reasoning the AI used to choose a course of action? I've read a couple of articles noting that we don't fully understand why systems such as ChatGPT say what they do; one is here: https://www.vox.com/unexplainable/2...-gpt-ai-science-mystery-unexplainable-podcast. The same may be true of any system that is basically trained on a large body of data.
Deep down I just feel that the AI will be better as a driver's assistant than as a replacement, but that the level of assistance may vary depending on the road situation.
 
Interesting study done, in the '90s I think, at the Uni of Leeds: given the choice of running down a dog or a person, most drivers crashed the car to avoid making that choice, even if it put them in danger!
 
Whether we, as a society, get happy with that is an interesting question. I was discussing this with a mate last week and a couple of things came out.
  1. At some level the programmers of the system will have to create the ability for the AI to answer ethical questions like the trolley problem, and that will be difficult
The correct answer is "why are so many people standing on the tracks!!!"

The only real intermediate response should be: if the car sees anything out of the ordinary, it slows heavily and tells the driver to take control. It should be much easier to decide between a normal situation and an abnormal one.
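
Something like this, as a toy sketch (the confidence score and thresholds are invented for illustration, not any real driver-assist API):

def plan_action(scene_confidence, normal_threshold=0.95, panic_threshold=0.70):
    # scene_confidence: how well the current scene matches what the
    # system was trained on (invented score, 0.0 to 1.0).
    if scene_confidence >= normal_threshold:
        return "continue"
    if scene_confidence >= panic_threshold:
        return "slow heavily + alert driver"
    return "brake to a stop + demand takeover"

print(plan_action(0.99))  # continue
print(plan_action(0.85))  # slow heavily + alert driver
print(plan_action(0.40))  # brake to a stop + demand takeover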
 
The thing is, realistically self-driving cars only need to be slightly better than people to make them worthwhile.

Honestly, people get hung up about a person being in charge, but how often do you encounter systems these days where no person is involved in the process? In the last couple of decades we have become so used to automation and technology that we wouldn't even comprehend going back just 20-25 years, to when the internet was a new thing full of websites made with clip art.

The technology in a Tesla is impressive, but it is becoming more and more of a problem as it reaches its limitations. It doesn't use lidar or radar; it's all based on normal cameras all around the car and algorithms to understand what the cameras are seeing. If it had radar or lidar like more advanced self-driving cars, it would see the solid object and not just the lights.

Tesla's self-driving system is so stupid it doesn't even know what it is looking at unless it's been told. A couple of years back a Tesla drove into a plane at an airport, because Tesla had not anticipated that someone would use the car around aircraft and so had not programmed it to recognise what a plane looks like. The car didn't know what it was looking at, so with no information it drove straight into it. Again, something that is able to detect distance and solid objects would not have done the same.
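
The general principle being pointed at, sketched as a toy rule in Python (invented names and values, not any real vendor's stack): a range sensor doesn't need to classify an object to know something solid is in the way.

def should_brake(camera_label, range_to_object_m, braking_distance_m=40.0):
    # A range sensor reports "solid object at X metres" even when the
    # camera has no idea what the object is.
    if range_to_object_m is not None and range_to_object_m < braking_distance_m:
        return True
    # Camera-only fallback: brakes only for classes it has been taught.
    return camera_label in {"car", "pedestrian", "cyclist"}

print(should_brake("unknown", 25.0))   # True: lidar/radar sees the plane
print(should_brake("unknown", None))   # False: camera-only, no label, drives on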

A lot of the "proper" self-driving cars that do use lidar and radar to map their surroundings are way, way more advanced and reliable, but not infallible.
My favourite trick being the self-driving car trap, where you draw a circle of solid white line around a self-driving car: it will be programmed not to cross a solid white line, so it will drive in a circle till the end of time looking for a way out.
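
In other words, a hard "never cross a solid line" rule can prune away every possible exit. A toy illustration (invented logic, not any real planner):

def legal_moves(candidate_paths, crosses_solid_line):
    # A hard "never cross a solid white line" rule prunes every exit.
    return [path for path in candidate_paths if not crosses_solid_line(path)]

# Inside a painted circle, every path out crosses the line:
paths = ["forward", "bear left", "bear right", "reverse"]
print(legal_moves(paths, crosses_solid_line=lambda path: True))  # []: trapped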

But self-driving cars can be much safer than people driving cars, though they will never be completely safe, as there is always that one-in-a-million circumstance you cannot program for.

The other barrier is the transition. Self-driving cars only really work if everything is self-driving and they can talk to one another and figure out how to navigate; it's much harder to do that with a bag of unpredictable meat behind the wheel breaking speed limits and swerving about. I don't see that there would ever be a time they would be able to click their fingers and switch, so I expect it will take a couple of generations before self-driving becomes standard on the roads.

This does highlight one major problem with Tesla. Their entire share price is dependent on self-driving cars; otherwise they are a fairly low-volume car manufacturer that makes mediocre electric sedans.
If they have put all their eggs in one camera-based self-driving basket, they could find the whole company collapses soon.
People are comparing it to Theranos, where the CEO promised the world and could not deliver it. Tesla and self-driving are good, but people are beginning to see the technology isn't what it was promised to be, and at this stage the owners/drivers of Tesla cars are guinea pigs crashing the cars so that Tesla can keep patching up the flaws in the system.
 
We've got 33 million cars on the road, so realistically something one-in-a-million happens 33 times a day.

I can see automated driving happening, which I don't particularly like, as the safer someone feels the more risks they take. Not to roll out the Clarksonism about spikes, but there are studies showing that if you flatten a road that was badly cambered, people travel faster and you see no benefit in terms of accident rate. People have a danger tolerance they work to; make them feel safe by telling them the car will look after them and they will take the **** until one day they reach the edge of what the system can do and are totally unprepared.

But true "I'm at the pub, I'll call my car to come and pick me up" self-driving... you'd need a full redesign of the road network and the rules of the road for that, and likely similar standards across the world.
 
We've got 33 million cars on the road, so realistically something one-in-a-million happens 33 times a day.
Lies, damned lies, and statistics.

That's not how probability works here. There's nothing to say in the original statement that the one in a million will happen daily.
 
Lies, damned lies, and statistics.

That's not how probability works here. There's nothing to say in the original statement that the one in a million will happen daily.
Well yes... but there's also no evidence it's a one-in-1,000,000 event; it could be a one-in-100,000 event or a one-in-50,000 event.

How many times a day does an emergency vehicle sit with the blues on blocking a lane?

The main point was: statistically, any possibility is a certainty given large numbers and a long enough timeline.

There are nearly infinite possibilities for self-driving cars to cock up in unexpected ways. The only real certainty is that every day at least one will encounter a situation the programmer didn't envisage or that the sensor tech involved can't deal with.
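
For what it's worth, the large-numbers point can be made precise. If (and it's a big if) every car-day were an independent one-in-a-million trial, a quick Python check gives both the expected count and the chance of at least one event on a given day:

p = 1e-6              # assumed per-car, per-day chance of a freak event
cars = 33_000_000     # cars on the road (figure from the post above)

print(p * cars)                 # 33.0 expected events per day
print(1 - (1 - p) ** cars)      # ~1.0, i.e. about 1 - e**-33: near certainty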
 
There are nearly infinite possibilities for self-driving cars to cock up in unexpected ways. The only real certainty is that every day at least one will encounter a situation the programmer didn't envisage or that the sensor tech involved can't deal with.
No different to people driving; if it's better (fewer accidents overall) then it would be worth it.
You are back to your tramline problem: if self-driving cars kill a third as many people as human-driven cars, is it still a bad thing?
 
No different to people driving; if it's better (fewer accidents overall) then it would be worth it.
You are back to your tramline problem: if self-driving cars kill a third as many people as human-driven cars, is it still a bad thing?
If a self-driving car kills someone dear to you with no one at the wheel, would you be happy with a payout?

It's not like the owner can be held responsible; the company that builds it will perhaps throw some money at you to go away. But otherwise there's no real line of responsibility.

Also, swapping one set of cock-ups for a different set of cock-ups isn't really progress. It really depends where the outliers are: if a human driver is more likely to hit traffic at low speed in a non-life-threatening way, but there's a longer set of odds that badly painted road markings will send your self-driving car into the trees at motorway speed... erm, well, I'll take the 15 mph shunt, thanks.
 
'Tis a very Luddite view; may as well go back to horse and cart.
 
Imagine the carnage you could cause with nought but a pot of paint, given the quality of current self-driving systems.
 
Obviously a niche example... but the world is full of similar niche examples. Self-driving in an ancient Italian city centre?
Or self-driving around the Champs-Élysées in Paris? Or even Swindon's Magic Roundabout.
At some level the programmers of the system will have to create the ability for the AI to answer ethical questions like the trolley problem, and that will be difficult.
Deep down I just feel that the AI will be better as a driver's assistant than as a replacement, but that the level of assistance may vary depending on the road situation.
To me the trolley problem is simple. The single person was not in danger, so to change direction to put them in danger is definitely wrong. The vehicle has to do its best to stop, on its original path, only deviating if an alternative path is safe.
There are nearly infinite possibilities for self-driving cars to cock up in unexpected ways. The only real certainty is that every day at least one will encounter a situation the programmer didn't envisage or that the sensor tech involved can't deal with.
A problem now is that if a Tesla cannot understand a situation, it seems to ignore it and continue, whereas a safer option might be to slow to a stop instead, while screaming at the 'driver' to take over. But stopping on a fast road because it is confused will create more problems.
'Tis a very Luddite view; may as well go back to horse and cart.
At least the horse will often try to discourage you from heading into danger.

For me, the biggest problem with 'self-driving' is that those who look forward to it most are those with the least driving ability, and therefore the least capable of taking over when the car can no longer cope. Sticky situations will become worse.

AI will never be a match for real stupidity.

We already have 'self-driving' transport. They're called taxis, buses, trains and trams.
 