It is reported that driverless (self-driving) cars are expected to be on Britain’s roads by 2025. These cars can make “life and death” decisions on the road! How?
So far, robots and other artificial-intelligence-based systems have been used in hospitals for surgery and other interventional procedures. Now, driverless cars on the roads will have to take moral decisions in situations where a collision with a human being is unavoidable!
Neuroscientist Leon Sutfeld and Prof. Gordon Pipa of the University of Osnabruck, Germany, who work on driverless cars, state that human behavior in dilemma situations can be modeled by a “simple value-of-life-based model that is attributed by the participant to every human, animal or inanimate object” (Ref. Frontiers in Behavioral Neuroscience, 2017, 11, DOI: 10.3389/fnbeh.2017.00122).
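To make the idea concrete, a value-of-life model of this kind could be sketched as follows. This is a minimal illustrative sketch only, assuming made-up scores and function names; it is not the authors’ actual model from the cited study.

```python
# Hypothetical sketch of a "value-of-life" collision model.
# The scores below are illustrative assumptions, not data from the study.
VALUE_OF_LIFE = {
    "adult": 1.0,
    "child": 1.1,      # assumed: a child is valued slightly higher
    "animal": 0.3,
    "inanimate": 0.05,
}

def choose_trajectory(options):
    """Pick the trajectory whose unavoidable collisions destroy the
    least total 'value of life'. Each option is a list of object types
    that would be hit on that path."""
    def cost(objects):
        return sum(VALUE_OF_LIFE[obj] for obj in objects)
    return min(options, key=cost)

# Example dilemma: hit the child on the road, or the adult on the footpath.
paths = [["child"], ["adult"]]
print(choose_trajectory(paths))
```

With these assumed scores, the model would steer toward the adult, since the child carries the higher value; change the scores and the decision changes, which is exactly why the ethics of such models are contested.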
The ethical aspect remains questionable: can machines take human-like moral decisions?
It is true that everybody is expected to follow traffic rules strictly. However, should a driverless car run over a child who runs onto the road, or instead crash into an adult standing on the footpath in trying to save the child? Any answers?