Robotic Guilt

Deane Barker tags: robots

The New York Times reports further on the inability of robots to have morals or make moral judgments. One innovator is trying something…interesting.

His lab developed what he calls an “ethical adapter” that helps the robot emulate guilt. It’s set in motion when the program detects a difference between how much destruction is expected when using a particular weapon and how much actually occurs. If the difference is too great, the robot’s guilt level reaches a certain threshold, and it stops using the weapon.
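Mechanically, it sounds something like the sketch below. This is purely my guess at the logic — the article doesn't describe the implementation, so every name, number, and threshold here is made up.

```python
# Minimal sketch of the "ethical adapter" idea described above.
# All names and values are hypothetical illustrations, not the real system.

class EthicalAdapter:
    def __init__(self, guilt_threshold: float = 1.0):
        self.guilt = 0.0
        self.guilt_threshold = guilt_threshold
        self.weapon_enabled = True

    def after_engagement(self, expected_damage: float, actual_damage: float) -> None:
        """Compare predicted vs. observed destruction and accumulate guilt."""
        overshoot = actual_damage - expected_damage
        if overshoot > 0:
            self.guilt += overshoot
        # Once guilt crosses the threshold, the weapon is locked out.
        if self.guilt >= self.guilt_threshold:
            self.weapon_enabled = False


adapter = EthicalAdapter(guilt_threshold=0.5)
adapter.after_engagement(expected_damage=0.2, actual_damage=0.9)
print(adapter.weapon_enabled)  # False -- guilt exceeded the threshold
```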
