
Robots and Ethics

The South Korean government has forecast that every South Korean household will own a robot by 2020. Forecasts like this have prompted the drafting of a Robot Ethics Charter…

In my previous post on robots and emotion, I mentioned an initiative to program robots to respond to basic emotions to help them interact with humans. Now, South Korea is working on an ethical code to govern human-robot interaction. The charter will be drawn up by a team of five experts, including futurists and a science fiction writer. It seems like the right mix of people to envision and design for the future.

The code will apparently be similar to Asimov's Three Laws of Robotics:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

The European Robotics Research Network is also working on a set of guidelines. Driving development of these technologies in both regions are aging populations and forecasts of high economic value. Researchers expect initial contacts between humans and robots to be “rich in ethical, social and economic problems.” As robots and AI grow more intelligent and emotionally responsive, our tendency to react to machines as social actors will soon be completely understandable.

For more on machines that respond to emotions see:
Robots and Emotion
One Step Closer to Emotional Machines
Games That Respond to Our Emotions? (Part 1)
Games That Respond to Our Emotions? (Part 2)
Emotion Detection Software Used in Call Centers


One reply on “Robots and Ethics”

Your comments on robots and emotions are very interesting. This side of robotics design should be placed under ethical scrutiny, because awakening emotions in humans (especially in children, the elderly, or the disabled) is a very sensitive activity. The IEEE RAS TC on Roboethics just dealt with those issues; see
http://www.roboethics.org
Regards, Silvia
