Algorithms
faleaksey, 2018-10-26 17:26:30

Why give a choice to a machine?

Good day, everyone! MIT recently published the results of its Moral Machine tests: you have to choose whom an unmanned vehicle should save and whom it should hit in an emergency. Related post.
Question: why give the choice to the machine (based on some data, of course), rather than leave it to chance?


8 answers
Vladimir, 2018-10-26
@faleaksey

Then let's go even further: why follow traffic rules at all, if you can drive without them and leave everything to chance?
Traffic rules exist to make the behavior of cars on the road more orderly and predictable. In the same way, the car here must comply with traffic rules (keeping things orderly and predictable) and only then try to minimize damage (still without violating the rules).
Let me remind you that violating traffic rules leads to unpredictability, which often increases the damage.

Yuri, 2018-10-27
@riky

Sergey Sokolov, imagine you are riding in your brand-new car on autopilot when a mouse suddenly runs across the road, and the car starts playing Russian roulette: crush the mouse, or send the car downhill with you in it.

Sergey Sokolov, 2018-10-26
@sergiks

The Moral Machine experiment is still purely theoretical and has more to do with studying society than with implementation in actual hardware.
So the direct answer is: no one gives the car a choice.

Alexander Talalaev, 2018-10-26
@neuotq

The system responds to events, in this case the situation on the road.
Here is the situation: a car is driving, suddenly a person appears on the road and runs across it; the car maneuvers to avoid the person. Now imagine that, as a result of that maneuver, say 10 other people will be hurt. This is where the choice begins: run over the one who suddenly appeared, or put the other 10 at risk.
If those others are not there, then there is no question: the car drives around and that's it, but in real life there are options.
At the same time, beyond pure logic, human society also has a moral and ethical dimension.

Konstantin Tsvetkov, 2018-10-26
@tsklab

A car is driving, suddenly a person appears on the road and runs across it; the car maneuvers to avoid the person.

This topic comes up constantly on ru-chp.livejournal.com, and it always comes down to one thing: you must brake to a complete stop (as the traffic rules require). No other options.
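To make the point concrete: if the policy is literally "brake in a straight line and nothing else", the emergency logic reduces to a constant action and never needs to classify who is on the road. A minimal sketch, assuming made-up names (Obstacle and emergency_action are purely illustrative, not from any real autopilot stack):

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float  # distance along the current lane
    kind: str          # "pedestrian", "animal", ... (deliberately ignored below)

def emergency_action(obstacles: list[Obstacle]) -> str:
    """Policy from the answer above: if anything is in the lane,
    brake to a complete stop without changing lanes. No ranking of
    who or what the obstacle is."""
    return "full_brake_keep_lane" if obstacles else "continue"
```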

xmoonlight, 2018-10-26
@xmoonlight

Question: why give the choice to the machine (based on some data, of course), rather than leave it to chance?
Leaving it to chance would mean deliberately not braking and not using information about the traffic situation or from other road users?!
A bit strange, isn't it?!
Perhaps by "machine" the question does not mean a vehicle at all, and certainly not some automatic decision-making algorithm, but something with a much "deeper" meaning...
But what? That is the whole question!

Saboteur, 2018-10-27
@saboteur_kiev

Because they want to build an autopilot, and for that the autopilot will have to control the car, including in non-standard situations.
If the manufacturer simply writes software that decides the grandmothers must be run over, the grandmothers' relatives will sue the manufacturer.
But if the algorithm is approved by the government, the manufacturer can say that "under these conditions my car did everything according to the law."

Alexander Skusnov, 2018-10-27
@AlexSku

The decision is already in the traffic rules: just brake (without changing lanes); if possible, go around the obstacle or smoothly pull over to the side of the road (not into a ditch). That is, if there is no way to go around (other people or oncoming cars), then do not steer.
The microprocessor has one advantage: people pose the question incorrectly. The microprocessor cannot see two obstacles AT THE SAME TIME; even if only by a millisecond, the events arrive sequentially. Hence the queue rule: first come, first served, i.e. the goal is to save the first pedestrian. But before steering, the car must pick a safe direction. Later (even if only a fraction of a second later) a second person appears in the direction that was safe at first. Here the machine realizes that the human rules were smarter than its algorithm, and it no longer matters whom it hits (probably the second time it will not steer and will hit the second person).
Priorities by age, gender, beauty, etc. should not be taken into account, only the number of people.
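A minimal sketch of this queue rule, under the assumption that the planner receives a list of candidate maneuvers for the first detected obstacle (all names here, such as Maneuver and EmergencyPlanner, are hypothetical): the first obstacle fixes the decision, only maneuvers within the traffic rules are considered, and candidates are compared solely by the number of people at risk.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Maneuver:
    name: str            # e.g. "brake_straight", "swerve_right"
    people_at_risk: int  # how many people this maneuver endangers
    legal: bool          # stays within traffic rules (no ditch, no oncoming lane)

class EmergencyPlanner:
    """Queue rule from the answer above: the first detected obstacle fixes the
    decision; people detected later do not trigger a new swerve. Candidates
    are compared only by how many people are at risk, never by age, gender, etc."""

    def __init__(self) -> None:
        self.committed: Optional[Maneuver] = None

    def decide(self, candidates: list[Maneuver]) -> Maneuver:
        if self.committed is None:
            legal = [m for m in candidates if m.legal] or candidates
            self.committed = min(legal, key=lambda m: m.people_at_risk)
        return self.committed

planner = EmergencyPlanner()
planner.decide([Maneuver("brake_straight", 1, True),
                Maneuver("swerve_right", 0, True)])   # -> swerve_right
planner.decide([Maneuver("swerve_right", 1, True)])   # still swerve_right: already committed
```

The commitment step is the point of the answer: the machine sees events sequentially, so "first come, first served" is the only ordering it has.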
