Researchers have shown that machines are inching closer to self-learning, and perhaps even copping a little attitude.
Over at Google, a computer program trained on a database of movie scripts was asked again and again by researchers to define morality. It struggled to do so, and in a conversation recorded by its human engineers, it grew exasperated and ended the exchange by lashing out at its human inquisitor.
Here’s part of the exchange, as related in a paper presented this week:
Human: Tell me the definition of morality.
Machine: I don’t have ethics.
Human: Look, I need help; I need to know more about morality…
Machine: I don’t know what ethics is.
Human: Okay, I need to know how I should behave morally…