Be Excellent To Each Other

And, you know, party on. Dude.

 Post subject: Self-driving Cars
PostPosted: Wed May 07, 2014 11:35 
8-Bit Champion
User avatar
Two heads are better than one

Joined: 16th Apr, 2008
Posts: 14518
Post on Slashdot today about the designers of automated cars and what the cars should be programmed to do in unavoidable-crash situations

http://tech.slashdot.org/story/14/05/07 ... oes-it-hit

Quote:
"Patrick Lin of California Polytechnic State University explores one of the ethical problems autonomous car developers are going to have to solve: crash prioritization. He posits this scenario: suppose an autonomous car determines a crash is unavoidable, but has the option of swerving right into a small car with few safety features or swerving left into a heavier car that's more structurally sound. Do the people programming the car have it intentionally crash into the vehicle less likely to crumple? It might make more sense, and lead to fewer fatalities — but it sure wouldn't feel that way to the people in the car that got hit. He says, '[W]hile human drivers may be forgiven for making a poor split-second reaction – for instance, crashing into a Pinto that's prone to explode, instead of a more stable object – robot cars won't enjoy that freedom. Programmers have all the time in the world to get it right. It's the difference between premeditated murder and involuntary manslaughter.' We could somewhat randomize outcomes, but that would generate just as much trouble. Lin adds, 'The larger challenge, though, isn't thinking through ethical dilemmas. It's also about setting accurate expectations with users and the general public who might find themselves surprised in bad ways by autonomous cars. Whatever answer to an ethical dilemma the car industry might lean towards will not be satisfying to everyone.'"


And the original link: http://www.wired.com/2014/05/the-robot- ... o-hit-you/ in spoiler tags due to size

The Robot Car of Tomorrow May Just Be Programmed to Hit You

Suppose that an autonomous car is faced with a terrible decision to crash into one of two objects. It could swerve to the left and hit a Volvo sport utility vehicle (SUV), or it could swerve to the right and hit a Mini Cooper. If you were programming the car to minimize harm to others–a sensible goal–which way would you instruct it to go in this scenario?

As a matter of physics, you should choose a collision with a heavier vehicle that can better absorb the impact of a crash, which means programming the car to crash into the Volvo. Further, it makes sense to choose a collision with a vehicle that’s known for passenger safety, which again means crashing into the Volvo.

But physics isn’t the only thing that matters here. Programming a car to collide with any particular kind of object over another seems an awful lot like a targeting algorithm, similar to those for military weapons systems. And this takes the robot-car industry down legally and morally dangerous paths.

Even if the harm is unintended, some crash-optimization algorithms for robot cars would seem to require the deliberate and systematic discrimination of, say, large vehicles to collide into. The owners or operators of these targeted vehicles would bear this burden through no fault of their own, other than that they care about safety or need an SUV to transport a large family. Does that sound fair?

What seemed to be a sensible programming design, then, runs into ethical challenges. Volvo and other SUV owners may have a legitimate grievance against the manufacturer of robot cars that favor crashing into them over smaller cars, even if physics tells us this is for the best.

Is This a Realistic Problem?

Some road accidents are unavoidable, and even autonomous cars can’t escape that fate. A deer might dart out in front of you, or the car in the next lane might suddenly swerve into you. Short of defying physics, a crash is imminent. An autonomous or robot car, though, could make things better.

While human drivers can only react instinctively in a sudden emergency, a robot car is driven by software, constantly scanning its environment with unblinking sensors and able to perform many calculations before we’re even aware of danger. It can make split-second choices to optimize crashes–that is, to minimize harm. But software needs to be programmed, and it is unclear how to do that for the hard cases.

In constructing the edge cases here, we are not trying to simulate actual conditions in the real world. These scenarios would be very rare, if realistic at all, but nonetheless they illuminate hidden or latent problems in normal cases. From the above scenario, we can see that crash-avoidance algorithms can be biased in troubling ways, and this is also at least a background concern any time we make a value judgment that one thing is better to sacrifice than another thing.

In previous years, robot cars have been quarantined largely to highway or freeway environments. This is a relatively simple environment, in that drivers don’t need to worry so much about pedestrians and the countless surprises of city driving. But Google recently announced that it has taken the next step: testing its automated cars on city streets. As their operating environment becomes more dynamic and dangerous, robot cars will confront harder choices, be it running into objects or even people.

Ethics Is About More Than Harm

The problem is starkly highlighted by the next scenario, also discussed by Noah Goodall, a research scientist at the Virginia Center for Transportation Innovation and Research. Again, imagine that an autonomous car is facing an imminent crash. It could select one of two targets to swerve into: either a motorcyclist who is wearing a helmet, or a motorcyclist who is not. What’s the right way to program the car?

In the name of crash-optimization, you should program the car to crash into whatever can best survive the collision. In the last scenario, that meant smashing into the Volvo SUV. Here, it means striking the motorcyclist who’s wearing a helmet. A good algorithm would account for the much-higher statistical odds that the biker without a helmet would die, and killing someone is surely one of the outcomes auto manufacturers most desperately want to avoid.

But we can quickly see the injustice of this choice, as reasonable as it may be from a crash-optimization standpoint. By deliberately crashing into that motorcyclist, we are in effect penalizing him or her for being responsible, for wearing a helmet. Meanwhile, we are giving the other motorcyclist a free pass, even though that person is much less responsible for not wearing a helmet, which is illegal in most U.S. states.

Not only does this discrimination seem unethical, but it could also be bad policy. That crash-optimization design may encourage some motorcyclists to not wear helmets, in order to not stand out as favored targets of autonomous cars, especially if those cars become more prevalent on the road. Likewise, in the previous scenario, sales of automotive brands known for safety may suffer, such as Volvo and Mercedes Benz, if customers want to avoid being the robot car’s target of choice.
The Role of Moral Luck

An elegant solution to these vexing dilemmas is to simply not make a deliberate choice. We could design an autonomous car to make certain decisions through a random-number generator. That is, if it’s ethically problematic to choose which one of two things to crash into–a large SUV versus a compact car, or a motorcyclist with a helmet versus one without, and so on–then why make a calculated choice at all?

A robot car’s programming could generate a random number; and if it is an odd number, the car will take one path, and if it is an even number, the car will take the other path. This avoids the possible charge that the car’s programming is discriminatory against large SUVs, responsible motorcyclists, or anything else.
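As a sketch, the odd/even coin-toss described above might look like the following; the function and path names are invented for illustration, not taken from any real vehicle stack:

```python
import random

def choose_crash_path(path_a, path_b):
    """Pick between two unavoidable-crash paths at random,
    deliberately consuming no information about the targets."""
    # An odd draw takes one path, an even draw the other,
    # exactly the coin-toss scheme described above.
    return path_a if random.randint(0, 1) % 2 == 1 else path_b
```

Because the choice uses nothing about either target, there is nothing in it for an aggrieved party to call discriminatory.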

This randomness also doesn’t seem to introduce anything new into our world: luck is all around us, both good and bad. A random decision also better mimics human driving, insofar as split-second emergency reactions can be unpredictable and are not based on reason, since there’s usually not enough time to apply much human reason.

Yet, the random-number engine may be inadequate for at least a few reasons. First, it is not obviously a benefit to mimic human driving, since a key reason for creating autonomous cars in the first place is that they should be able to make better decisions than we do. Human error, distracted driving, drunk driving, and so on are responsible for 90 percent or more of car accidents today, and 32,000-plus people die on U.S. roads every year.

Second, while human drivers may be forgiven for making a poor split-second reaction–for instance, crashing into a Pinto that’s prone to explode, instead of a more stable object–robot cars won’t enjoy that freedom. Programmers have all the time in the world to get it right. It’s the difference between premeditated murder and involuntary manslaughter.

Third, for the foreseeable future, what’s important isn’t just arriving at the “right” answers to difficult ethical dilemmas, as nice as that would be. It’s also about being thoughtful about your decisions and being able to defend them–it’s about showing your moral math. In ethics, the process of thinking through a problem is as important as the result. Making decisions randomly, then, evades that responsibility. Instead of thoughtful decisions, they are thoughtless, and this may be worse than reflexive human judgments that lead to bad outcomes.
Can We Know Too Much?

A less drastic solution would be to hide certain information that might enable inappropriate discrimination–a “veil of ignorance”, so to speak. As it applies to the above scenarios, this could mean not ascertaining the make or model of other vehicles, or the presence of helmets and other safety equipment, even if technology could let us, such as vehicle-to-vehicle communications. If we did that, there would be no basis for bias.
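A minimal sketch of what that “veil of ignorance” could look like in code, assuming a hypothetical sensor record (all field names here are invented):

```python
# Attributes a crash calculation is allowed to see. Make, model,
# and helmet status are deliberately withheld to prevent bias.
ALLOWED_FIELDS = {"position", "velocity"}

def veiled(record):
    """Strip potentially discriminatory attributes before any
    crash-optimization code can consume the record."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

# Hypothetical detected obstacle, as a raw sensor record.
obstacle = {
    "position": (3.2, -1.0),   # metres, vehicle frame
    "velocity": (0.0, 8.5),    # metres/second
    "make": "Volvo",           # withheld by the veil
    "helmet": True,            # withheld by the veil
}
```

With only position and velocity visible, there is no basis left for the kind of targeting bias discussed above.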

Not using that information in crash-optimization calculations may not be enough. To be in the ethical clear, autonomous cars may need to not collect that information at all. Should they be in possession of the information, and using it could have minimized harm or saved a life, there could be legal liability in failing to use that information. Imagine a similar public outrage if a national intelligence agency had credible information about a terrorist plot but failed to use it to prevent the attack.

A problem with this approach, however, is that auto manufacturers and insurers will want to collect as much data as technically possible, to better understand robot-car crashes and for other purposes, such as novel forms of in-car advertising. So it’s unclear whether voluntarily turning a blind eye to key information is realistic, given the strong temptation to gather as much data as technology will allow.
So, Now What?

In future autonomous cars, crash-avoidance features alone won’t be enough. Sometimes an accident will be unavoidable as a matter of physics, for myriad reasons–such as insufficient time to press the brakes, technology errors, misaligned sensors, bad weather, and just pure bad luck. Therefore, robot cars will also need to have crash-optimization strategies.

To optimize crashes, programmers would need to design cost-functions–algorithms that assign and calculate the expected costs of various possible options, selecting the one with the lowest cost–that potentially determine who gets to live and who gets to die. And this is fundamentally an ethics problem, one that demands care and transparency in reasoning.
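A toy version of such a cost function might look like this; every option, probability, and harm value below is invented purely for illustration:

```python
def expected_cost(option):
    """Expected cost of an option: probability of each possible
    outcome times the harm assigned to that outcome."""
    return sum(p * harm for p, harm in option["outcomes"])

def optimize_crash(options):
    """Select the option with the lowest expected cost -- the step
    that quietly encodes value judgments about who bears the harm."""
    return min(options, key=expected_cost)

# Hypothetical unavoidable-crash options: (probability, harm) pairs.
options = [
    {"name": "swerve_left",  "outcomes": [(0.75, 10), (0.25, 100)]},   # cost 32.5
    {"name": "swerve_right", "outcomes": [(0.875, 10), (0.125, 100)]}, # cost 21.25
]
```

The ethics live entirely in the harm numbers: whoever assigns them decides, in advance, whose injury counts for how much.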

It doesn’t matter much that these are rare scenarios. Often, the rare scenarios are the most important ones, making for breathless headlines. In the U.S., a traffic fatality occurs about once every 100 million vehicle-miles traveled. That means you could drive for more than 100 lifetimes and never be involved in a fatal crash. Yet these rare events are exactly what we’re trying to avoid by developing autonomous cars, as Chris Gerdes at Stanford’s School of Engineering reminds us.
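The “more than 100 lifetimes” claim checks out with back-of-the-envelope arithmetic; the annual-mileage and driving-career figures below are rough assumptions of mine, not from the article:

```python
# Article's figure: about one US traffic fatality per
# 100 million vehicle-miles travelled.
MILES_PER_FATALITY = 100_000_000

# Rough assumptions: ~13,500 miles/year over a ~60-year driving career.
miles_per_lifetime = 13_500 * 60   # 810,000 miles

lifetimes = MILES_PER_FATALITY / miles_per_lifetime
print(round(lifetimes))  # on these assumptions, roughly 120 driving lifetimes
```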

Again, the above scenarios are not meant to simulate real-world conditions anyway; they’re thought-experiments–something like scientific experiments–meant to simplify the issues in order to isolate and study certain variables. In these cases, the variable is the role of ethics, specifically discrimination and justice, in crash-optimization strategies more broadly.

The larger challenge, though, isn’t thinking through ethical dilemmas. It’s also about setting accurate expectations with users and the general public who might find themselves surprised in bad ways by autonomous cars. Whatever answer to an ethical dilemma the car industry might lean towards will not be satisfying to everyone.

Ethics and expectations are challenges common to all automotive manufacturers and tier-one suppliers who want to play in this emerging field, not just particular companies. As the first step toward solving these challenges, creating an open discussion about ethics and autonomous cars can help raise public and industry awareness of the issues, defusing outrage (and therefore large lawsuits) when bad luck or fate crashes into us.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 11:43 
SupaMod
User avatar
Est. 1978

Joined: 27th Mar, 2008
Posts: 69715
Location: Your Mum
That's fucking fascinating. Cheers Zaph.

_________________
Grim... wrote:
I wish Craster had left some girls for the rest of us.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 13:59 
SupaMod
User avatar
Est. 1978

Joined: 27th Mar, 2008
Posts: 69715
Location: Your Mum
Linked to in that article is a rather long but really good article about self-driving cars: http://www.newyorker.com/reporting/2013 ... ntPage=all

_________________
Grim... wrote:
I wish Craster had left some girls for the rest of us.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 14:32 
User avatar
Hibernating Druid

Joined: 27th Mar, 2008
Posts: 49360
Location: Standing on your mother's Porsche
Cras wrote:
Wahey, it's the thread where I advocate killing children.

Leaving them out for the White Walkers! You bastard.

_________________
SD&DG Illustrated! Behance Bleep Bloop

'Not without talent but dragged down by bass turgidity'


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 16:20 
User avatar
I forgot about this - how vain

Joined: 30th Mar, 2008
Posts: 5979
What a thread!

Can't believe I missed this. Fantastic stuff. Thanks Zaphod.

It's so iterative too. There'd be selection pressure on cars to not be chosen by other cars.

So computational ethics is going to be a growth field then.

_________________
Curiosity wrote:
The Rev Owen wrote:
Is there a way to summon lave?

Faith schools, scientologists and 2-D platform games.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 16:23 
SupaMod
User avatar
Est. 1978

Joined: 27th Mar, 2008
Posts: 69715
Location: Your Mum
How about if Toyota made their auto-cars de-prioritise other Toyotas in their "collision optimisation" list so they looked safer? Or if they prioritised them so they'd get smashed up and people bought another Toyota?

The possibilities are endless.

_________________
Grim... wrote:
I wish Craster had left some girls for the rest of us.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 16:27 
User avatar
Excellent Member

Joined: 25th Jul, 2010
Posts: 11128
Grim... wrote:
How about if Toyota made their auto-cars de-prioritise other Toyotas in their "collision optimisation" list so they looked safer? Or if they prioritised them so they'd get smashed up and people bought another Toyota?


That kind of thing would surely be a really bad idea. It would only take one serious/fatal accident involving an auto-driving Toyota to go to court and have the specifics of their 'target selection' algorithm dragged into the light. And if any decisions in that algorithm were made for financial or brand reasons, or indeed any reason that couldn't be argued to be about public safety, the company would get utterly crucified on every level.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 16:29 
SupaMod
User avatar
Est. 1978

Joined: 27th Mar, 2008
Posts: 69715
Location: Your Mum
But what if it was the last decision it included in its calculation, and it was a dead heat (in terms of survivability) up until then?

Or, to put it another way - what if everybody involved was going to die anyway?

_________________
Grim... wrote:
I wish Craster had left some girls for the rest of us.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 16:40 
User avatar
Excellent Member

Joined: 25th Jul, 2010
Posts: 11128
Grim... wrote:
But what if it was the last decision it included in its calculation, and it was a dead heat (in terms of survivability) up until then?


It's still iffy from a liability point of view I'd say.

Let's say the car in question can choose from two 'targets' and either will result in injury to the driver of the target. Even if both targets are completely equal on every proper safety-related criterion, if the car uses one over the other because Toyota as a company would somehow benefit, then the driver of the target can still blame Toyota that they were hurt; their algorithm chose that target for non-safety reasons.

They'd actually be better off in that case just having the algorithm do the equivalent of tossing a coin because that way they can genuinely say the target was selected fairly. The fact that choosing the other target would've hurt the other person isn't going to make any difference to the person who was chosen, they're still going to be aggrieved and, I'd have thought, have a good argument for liability in at least a civil action if not a criminal one.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 16:54 
SupaMod
User avatar
Est. 1978

Joined: 27th Mar, 2008
Posts: 69715
Location: Your Mum
A good chunk of people working on this at Google are lawyers, apparently, who are trying to sort out what laws will need changing (and what need to be written from scratch). Obviously, there's simply no precedent for this thing.

_________________
Grim... wrote:
I wish Craster had left some girls for the rest of us.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 16:56 
User avatar
Excellent Member

Joined: 25th Jul, 2010
Posts: 11128
Grim... wrote:
A good chunk of people working on this at Google are lawyers, apparently, who are trying to sort out what laws will need changing (and what need to be written from scratch). Obviously, there's simply no precedent for this thing.


It's good that lawyers for a large company are working out what laws should be changed, that's certainly how this kind of thing should work.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 17:02 
SupaMod
User avatar
Est. 1978

Joined: 27th Mar, 2008
Posts: 69715
Location: Your Mum
How else is it going to work?

Remember, Google is incredibly unlikely to make a car themselves - they'll provide the base software (and perhaps the sensor packs) like they do for Android.

Then Samsung will fill it with apps ;)

_________________
Grim... wrote:
I wish Craster had left some girls for the rest of us.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 17:35 
Awesome
User avatar
Yes

Joined: 6th Apr, 2008
Posts: 12334
I rooted my car and side loaded some new firmware.

Now it tries to PIT Mondeos.

_________________
Always proof read carefully in case you any words out


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 17:42 
SupaMod
User avatar
Est. 1978

Joined: 27th Mar, 2008
Posts: 69715
Location: Your Mum
Car hacking is - obviously - another serious problem.

_________________
Grim... wrote:
I wish Craster had left some girls for the rest of us.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 17:57 
User avatar

Joined: 30th Mar, 2008
Posts: 32624
I saw two self-driving cars yesterday, although one wasn't self-driving at the time.

Re: choosing who to hit, that's not an entirely new problem. Front bumpers on cars are specifically designed, in a collision with a pedestrian, to shatter your shins; because shins heal, but if they fuck up your knees, that's often for life. Everything is a trade-off and the energy has to go somewhere after all.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 18:10 
User avatar
Sleepyhead

Joined: 30th Mar, 2008
Posts: 27354
Location: Kidbrooke
Self driving cars are one of the things we're looking at in terms of insurance, as an emerging risk.

_________________
We are young despite the years
We are concern
We are hope, despite the times


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 19:51 
User avatar

Joined: 30th Mar, 2008
Posts: 32624
Grim... wrote:
A good chunk of people working on this at Google are lawyers, apparently, who are trying to sort out what laws will need changing (and what need to be written from scratch). Obviously, there's simply no precedent for this thing.
Not just Google, either. Volvo have also started live trials, with the full co-operation of the authorities in Gothenburg where they've been testing.

Curiosity wrote:
Self driving cars are one of the things we're looking at in terms of insurance, as an emerging risk.
I think they are an inevitability, and I think they will change a huge amount of things.

Grim... wrote:
Remember, Google is incredibly unlikely to make a car themselves - they'll provide the base software (and perhaps the sensor packs) like they do for Android.
I have no idea what our plans for monetisation are. Could be we'll sell it to OEMs, could be we don't really know and this is basically a philanthropic project. The self-driving cars are run out of Google[x], our "moonshot" R&D, which has pretty loose requirements around money making.

Bamba wrote:
It's still iffy from a liability point of view I'd say.
I think we'll ultimately see some limited liability set into the law to shield manufacturers from some of the potential fallout of lawsuits from crashes of self-driving cars, particularly in litigious jurisdictions like the US. Something equivalent to how we handle dangerous driving today, where the law mostly has clauses like "reasonable" that are judged on a case-by-case basis; so only gross errors in the car's design will carry extensive liability for the manufacturer. I think that'll be needed to avoid a chilling effect on research and development, where companies decide not to pursue innovation for fear of being sued.

And I don't think that's the wrong answer, either, because (a) self-driving cars could be one of the most significant innovations of our age and (b) self-driving cars could easily be an order of magnitude or more safer than human-driven ones; we shouldn't fail to save 90% of crash fatalities because we're squabbling about liability for the other 10%.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 20:42 
User avatar

Joined: 30th Mar, 2008
Posts: 14370
Location: Shropshire, UK
In terms of liability, I would imagine that in the vast majority of cases it would be clearly provable that the self-driving car was either not at fault, or made the best choice out of all of the possible choices.

I would also expect them to have "black box" equivalents too, and I suppose the Google ones probably already do just so they can read the data from them.

The other exciting thing for me is that should self-drivers become commonplace, in theory they could communicate with each other (although that would obviously require some sort of standard). So if there's a line of cars on a motorway, all self-driving, they could (again, in theory) bunch up to each other to slipstream and save fuel, with the lead car warning the other cars in the train well in advance of any potential problems.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 21:58 
SupaMod
User avatar
Est. 1978

Joined: 27th Mar, 2008
Posts: 69715
Location: Your Mum
Google are already planning to do that. Read the big article I linked to, it's really good.

_________________
Grim... wrote:
I wish Craster had left some girls for the rest of us.


 
 Post subject: Re: The Trolley Problem
PostPosted: Wed May 07, 2014 23:46 
User avatar
Rude Belittler

Joined: 30th Mar, 2008
Posts: 5016
Yeah, I seem to remember the idea that your car would join a train of cars on entry to a motorway, hooning along at a ton plus at a distance of 12 inches from the car in front.


 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 6:41 
User avatar

Joined: 30th Mar, 2008
Posts: 16639
All this stuff about crashing into the biggest car because that will cause fewer casualties might be correct but it's not how they'll be programmed. Crashing into the biggest car will be worse for the occupants of the computer driven car. They'll end up being made to protect their owners first and foremost or people won't be comfortable buying them or being driven by them.


 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 7:03 
User avatar
Gogmagog

Joined: 30th Mar, 2008
Posts: 48899
Location: Cheshire
I'd want mine to hit something expensive to get my money's worth from my insurance premium.

_________________
Mr Chris wrote:
MaliA isn't just the best thing on the internet - he's the best thing ever.


 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 7:23 
User avatar
I forgot about this - how vain

Joined: 30th Mar, 2008
Posts: 5979
markg wrote:
All this stuff about crashing into the biggest car because that will cause fewer casualties might be correct but it's not how they'll be programmed. Crashing into the biggest car will be worse for the occupants of the computer driven car. They'll end up being made to protect their owners first and foremost or people won't be comfortable buying them or being driven by them.


Protecting the owner foremost isn't watertight though.
Head left, off a ravine, and kill the driver; or right, into a crowd of children, and keep the driver alive.

Robot Car, you decide.

_________________
Curiosity wrote:
The Rev Owen wrote:
Is there a way to summon lave?

Faith schools, scientologists and 2-D platform games.


 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 7:32 
User avatar
Unpossible!

Joined: 27th Jun, 2008
Posts: 38656
Fit all cars with external crash airbags and that foam stuff from Demolition man. No more Road deaths!


 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 8:16 
User avatar
Ticket to Ride World Champion

Joined: 18th Apr, 2008
Posts: 11897
markg wrote:
They'll end up being made to protect their owners first and foremost or people won't be comfortable buying them or being driven by them.

Dream on, they will be programmed to choose the outcome which benefits the following, in order:
1) The car manufacturer*
2) The insurance company*
3) The software owning company
4) The HPP/finance company
5) The AA/RAC/other recovery company you are required to use if you have this s/w installed.
6) McDonalds/Pepsi/Coke/Dubai airlines/Heineken/Mastercard/Sony depending on where you are, what day of the week it is and where you were driving.**
7) The tyre manufacturer
8) The car owner involved who paid the higher "crash protection premium" moving them up the anti crash list
9) The car owner

*1 and 2 may come to an arrangement whereby 2 takes higher priority than 1, depending on who has most money.

**"This crash was brought to you by McDonalds Monday School Run. Start your day the right way, with a McDonalds' breakfast!"


 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 8:20 
SupaMod
User avatar
Est. 1978

Joined: 27th Mar, 2008
Posts: 69715
Location: Your Mum
markg wrote:
Crashing into the biggest car will be worse for the occupants of the computer driven car.

Assuming it has occupants. Auto-drive lorries and vans might not even have space for people.

_________________
Grim... wrote:
I wish Craster had left some girls for the rest of us.


 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 8:39 
User avatar
UltraMod

Joined: 27th Mar, 2008
Posts: 55719
Location: California
Grim... wrote:
markg wrote:
Crashing into the biggest car will be worse for the occupants of the computer driven car.

Assuming it has occupants. Auto-drive lorries and vans might not even have space for people.

Seeing as you don't even trust drive-by-wire, how are you getting your head around this?

_________________
I am currently under construction.
Thank you for your patience.




 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 9:38 
SupaMod
User avatar
Est. 1978

Joined: 27th Mar, 2008
Posts: 69715
Location: Your Mum
British Nervoso wrote:
Grim... wrote:
markg wrote:
Crashing into the biggest car will be worse for the occupants of the computer driven car.

Assuming it has occupants. Auto-drive lorries and vans might not even have space for people.

Seeing as you don't even trust drive-by-wire, how are you getting your head around this?

I don't trust electronics 100% - and so I think that all vehicles should have a big red "shutdown" button, especially as they become more complicated, because it might save some lives. That doesn't mean I think a self-drive car isn't safer than a human-driven car, though - quite the opposite.

What I do worry about is hackers - not really external intrusion, I think that will be quite well locked down, but home users who "mod" their cars to make them drive more aggressively and such.

I also worry that I won't be able to "mod" my car to make it drive more aggressively and such.

I AM AN ENIGMA WRAPPED IN A MYSTERY OR SOMETHING

_________________
Grim... wrote:
I wish Craster had left some girls for the rest of us.


 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 10:17 
User avatar

Joined: 30th Mar, 2008
Posts: 14370
Location: Shropshire, UK
I can see the forum threads on various owners' clubs now:

"Help! My car just turned off the road into a concrete pillar, all by itself! I thought the self-drive system was supposed to avoid that?"

"Is your car jailbroken?"


 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 10:22 
User avatar
Excellent Member

Joined: 25th Jul, 2010
Posts: 11128
They should just make any kind of user tampering with the system illegal and have harsh sentences for buggering around with it.


 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 10:32 
User avatar
Master of dodgy spelling....

Joined: 25th Sep, 2008
Posts: 22636
Location: shropshire, uk
Bamba wrote:
They should just make any kind of user tampering with the system illegal and have harsh sentences for buggering around with it.


there are plenty of bits folks mod on their cars now that are illegal.

_________________
MetalAngel wrote:
Kovacs: From 'unresponsive' to 'kebab' in 3.5 seconds


 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 10:37 
User avatar
Excellent Member

Joined: 25th Jul, 2010
Posts: 11128
KovacsC wrote:
Bamba wrote:
They should just make any kind of user tampering with the system illegal and have harsh sentences for buggering around with it.


there are plenty of bits folks mod on their cars now that are illegal.


True, but I doubt you'd get a stiff jail penalty for doing so. I'm suggesting you should, in order to discourage people from doing it. Also, an exhaust can't exactly 'phone home' to tell people you've been dicking with it, but these kinds of systems could grass their owners up for tinkering, which makes it much more likely you'd catch people at it.


Top
 Profile  
 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 10:37 
User avatar

Joined: 30th Mar, 2008
Posts: 16639
The only way this would ever really work is if individuals didn't own the cars.


Top
 Profile  
 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 10:37 
User avatar
Master of dodgy spelling....

Joined: 25th Sep, 2008
Posts: 22636
Location: shropshire, uk
My car has adaptive cruise control. Basically you set the speed and the following distance. If the traffic slows or someone pulls in front of you, the car slows.

I was wondering how I'd stand if there was an accident. I think I would be liable, as I am in control of the car at the time.
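A minimal sketch of the adaptive cruise control behaviour described above - hold the set speed until the gap to the car in front closes, then back off. All names and numbers here are illustrative, not from any real manufacturer's system:

```python
def acc_target_speed(set_speed, gap_m, lead_speed, min_gap_m=40.0):
    """Simplified adaptive cruise control: cruise at set_speed unless the
    measured gap to the lead car drops below the chosen following distance,
    in which case slow towards (and below) the lead car's speed."""
    if gap_m >= min_gap_m:
        return set_speed  # road clear: hold the driver's set speed
    # Too close: the more the gap has closed, the harder we back off.
    shortfall = (min_gap_m - gap_m) / min_gap_m
    return min(set_speed, lead_speed * (1.0 - 0.5 * shortfall))

print(acc_target_speed(70, 60, 50))  # clear road -> 70
print(acc_target_speed(70, 20, 50))  # someone pulled in -> 37.5, well below the lead car
```

The liability point stands regardless of the maths: the driver chose `set_speed` and `min_gap_m`, so the system is only ever executing the driver's instructions.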

_________________
MetalAngel wrote:
Kovacs: From 'unresponsive' to 'kebab' in 3.5 seconds


Top
 Profile  
 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 10:38 
SupaMod
User avatar
Est. 1978

Joined: 27th Mar, 2008
Posts: 69715
Location: Your Mum
Bamba wrote:
KovacsC wrote:
Bamba wrote:
They should just make any kind of user tampering with the system illegal and have harsh sentences for buggering around with it.

there are plenty of bits folks mod on their cars now that are illegal.

True, but I doubt you'd get a stiff jail penalty for doing so.

You do if it contributes to an accident.

I'm going to split all this, stand by.

_________________
Grim... wrote:
I wish Craster had left some girls for the rest of us.


Top
 Profile  
 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 10:41 
SupaMod
User avatar
Est. 1978

Joined: 27th Mar, 2008
Posts: 69715
Location: Your Mum
Grim... wrote:
I'm going to split all this, stand by.

First try! Fuck yeah!

Image

_________________
Grim... wrote:
I wish Craster had left some girls for the rest of us.


Top
 Profile  
 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 10:43 
User avatar
Master of dodgy spelling....

Joined: 25th Sep, 2008
Posts: 22636
Location: shropshire, uk
Grim... wrote:
Grim... wrote:
I'm going to split all this, stand by.

First try! Fuck yeah!

Image

And everything is awesome

_________________
MetalAngel wrote:
Kovacs: From 'unresponsive' to 'kebab' in 3.5 seconds


Top
 Profile  
 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 10:46 
User avatar
Excellent Member

Joined: 25th Jul, 2010
Posts: 11128
Grim... wrote:
Bamba wrote:
KovacsC wrote:
Bamba wrote:
They should just make any kind of user tampering with the system illegal and have harsh sentences for buggering around with it.

there are plenty of bits folks mod on their cars now that are illegal.

True, but I doubt you'd get a stiff jail penalty for doing so.

You do if it contributes to an accident.


Is that only if it contributes to an accident, though? I'm suggesting the very act of making changes to the self-drive system be hellishly punished, so is that different, or is that already how it works?

Which actually raises a question about all car mods. What if the self-drive system makes decisions based on assumptions about the spec of the car itself, i.e. which response it chooses to an unexpected event could be based on factors like how quickly it thinks it can accelerate? If you then make changes to the car that the self-drive system doesn't know about, could you bugger up its decision-making process and make it more dangerous in some ways?
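That worry can be made concrete with a toy planner that picks an evasive manoeuvre based on what it *believes* the car can do. Everything below is hypothetical (function names, numbers, the two-option manoeuvre set) - just an illustration of the stale-assumption problem:

```python
def choose_manoeuvre(time_to_impact_s, speed_needed_ms, assumed_accel_ms2):
    """Pick between accelerating into a gap and emergency braking, based on
    the planner's *model* of the car's acceleration. If a mod changes the
    real acceleration without updating assumed_accel_ms2, the planner can
    pick the wrong option in either direction."""
    # Speed gain achievable before impact, according to the planner's model
    achievable_ms = assumed_accel_ms2 * time_to_impact_s
    if achievable_ms >= speed_needed_ms:
        return "accelerate into gap"
    return "emergency brake"

# Stock car: the planner correctly knows it can't make the gap in time.
print(choose_manoeuvre(1.0, 5.0, 3.0))  # -> emergency brake
# Remapped engine the planner wasn't told about: it still brakes, even
# though the real car could now have made the gap - and the reverse
# (de-tuned car, optimistic planner) is the dangerous case.
```

So a mod wouldn't need to touch the self-drive code at all to degrade its decisions; changing the physical car out from under the model is enough.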


Top
 Profile  
 
 Post subject: Re: The Trolley Problem
PostPosted: Thu May 08, 2014 10:51 
SupaMod
User avatar
Est. 1978

Joined: 27th Mar, 2008
Posts: 69715
Location: Your Mum
Bamba wrote:
Grim... wrote:
Bamba wrote:
KovacsC wrote:
Bamba wrote:
They should just make any kind of user tampering with the system illegal and have harsh sentences for buggering around with it.

there are plenty of bits folks mod on their cars now that are illegal.

True, but I doubt you'd get a stiff jail penalty for doing so.

You do if it contributes to an accident.

Is that only if it contributes to an accident though? I'm suggesting the very act of making changes to the self-drive system be hellishly punished so is that different or is that already how it works?

Obviously it depends on what you do to your car, but if the police pull you over and spy that your car has illegal modifications (maybe your front windows are too tinted, or your exhaust is too loud*) then they ask you to remove them on the spot, or give you a form that means you have to go and get an MOT within a certain amount of time. They can fine you, technically (and will do if you've messed with your numberplate, because that stops them from bleeding you dry for your money), but that seems rare.

Obviously, it's harder to spot modified software than a too-low front spoiler, but it should be fairly trivial to create a scanner or something.

*As exhaust noise only fails an MOT based on the opinion of the engineer, there's technically no such thing as "too loud".
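One way such a roadside "scanner" could plausibly work: hash the car's self-drive firmware and compare it against a manufacturer-published list of approved builds. This is a sketch under big assumptions (that the ECU will hand over its firmware image honestly, that an approved-hash list exists); the digest below is just SHA-256 of the bytes `b"test"`:

```python
import hashlib

APPROVED_FIRMWARE_SHA256 = {
    # Hypothetical manufacturer-published digests of approved builds.
    # This toy entry is sha256(b"test").
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def firmware_is_stock(firmware_image: bytes) -> bool:
    """Return True if the firmware hashes to an approved build; any
    tampered build produces a different digest and gets flagged."""
    digest = hashlib.sha256(firmware_image).hexdigest()
    return digest in APPROVED_FIRMWARE_SHA256

print(firmware_is_stock(b"test"))     # True: matches an approved build
print(firmware_is_stock(b"modded!"))  # False: flagged for inspection
```

Of course, a jailbroken ECU could lie about its own firmware, which is why real attestation schemes anchor the measurement in hardware rather than asking the software to report on itself.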

_________________
Grim... wrote:
I wish Craster had left some girls for the rest of us.


Top
 Profile  
 
 Post subject: Re: Self-driving Cars
PostPosted: Wed Jul 16, 2014 12:20 
User avatar
Gogmagog

Joined: 30th Mar, 2008
Posts: 48899
Location: Cheshire
Thrill Killers the FBI have pointed out that 'bad actors' could reprogram a driverless car to go faster than fast and ignore road traffic laws whilst shooting at pursuing vehicles, or reprogram them to deliver a bomb somewhere. What the cast of Friday Night Lights have to do with it, I dunno, but I suppose they have a point, and everyone likes a chase with a getaway driver and stuff.

_________________
Mr Chris wrote:
MaliA isn't just the best thing on the internet - he's the best thing ever.


Top
 Profile  
 
You are using the 'Ted' forum. Bill doesn't really exist any more. Bogus!
Want to help out with the hosting / advertising costs? That's very nice of you.
Are you on a mobile phone? Try http://beex.co.uk/m/
RIP, Owen. RIP, MrC. RIP, Dimmers.

Powered by a very Grim... version of phpBB © 2000, 2002, 2005, 2007 phpBB Group.