Robotic Guilt

tags: robots

The New York Times reports further on the inability of robots to have morals or make moral judgments. One innovator is trying something…interesting.

His lab developed what he calls an “ethical adapter” that helps the robot emulate guilt. It’s set in motion when the program detects a difference between how much destruction is expected from using a particular weapon and how much actually occurs. If the difference is too great, the robot’s guilt level rises; once that guilt crosses a set threshold, the robot stops using the weapon.
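
In code, that logic might look something like the sketch below. This is a hypothetical Python outline based only on the description above; the class name, damage scores, and threshold value are illustrative assumptions, not the lab’s actual design.

```python
class EthicalAdapter:
    """Toy model of a guilt mechanism: guilt accumulates when observed
    destruction exceeds what was expected, and weapon use is locked out
    once guilt crosses a threshold. Names and values are illustrative."""

    def __init__(self, guilt_threshold: float = 1.0):
        self.guilt = 0.0
        self.guilt_threshold = guilt_threshold
        self.weapon_enabled = True

    def record_engagement(self, expected_damage: float, actual_damage: float) -> None:
        # Guilt grows only when actual destruction overshoots the expectation.
        overshoot = actual_damage - expected_damage
        if overshoot > 0:
            self.guilt += overshoot
        # Past the threshold, the weapon is disabled for further use.
        if self.guilt >= self.guilt_threshold:
            self.weapon_enabled = False

    def may_fire(self) -> bool:
        return self.weapon_enabled


# Example: two engagements where damage exceeds expectations.
adapter = EthicalAdapter(guilt_threshold=1.0)
adapter.record_engagement(expected_damage=0.3, actual_damage=0.9)
adapter.record_engagement(expected_damage=0.2, actual_damage=0.8)
print(adapter.may_fire())  # False -- accumulated guilt has crossed the threshold
```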
