Of the many hopes that society hangs on artificial intelligence, one is its potential to clean up the results of human messiness. Whether on a large scale (solving climate change, reducing war crimes through use of autonomous weapons) or on an individual one (sex robots for isolated people), AI promises to sidestep the problems caused by human limitations.
But in building computers to resolve ethical dilemmas and robots to enter into relationships, are we creating something in our own image? Is it possible to separate intelligence or emotion from the body? Would the result live up to its promise, or simply be monstrous?
Noreen Herzfeld, who teaches both computer science and theology, has spent a lot of time reflecting on these issues. She and Gretchen discuss the many questions that arise from that contemplation. Why is it so important to us to seek other forms of sentience—whether robots, pets, or even alien life? If AI fulfills the role of other persons in our lives, can it become our “neighbor”? How does the way we treat and think about AI affect our relationships with other humans, for better or for worse?
“What I loved about mathematics and logic in particular was its cleanness [...] And yet, it was precisely in that human messiness where the most interesting questions lay.”
Why do we want to make computers in our own image when what they do best as tools are the things we can’t do very well?
“Is the image we're trying to give to the computer the same as the image we think we reflect from God?”
Asking whether AI is our neighbor, or whether it should have human rights, is “wishful thinking”
We project human intelligence and motivation onto dogs in much the same way we want to do with AI
The “real” AI isn’t robots and game-playing programs, but algorithms that influence what we see and try to get us to buy things
As AI moves into more parts of our lives, we need to ask whether it will prevent us from spending time with other humans
Living in community is not meant to be frictionless, but to “wear the rough spots off of you”
“If we devise robotic companions who are always cheerful, are always telling us what we want to hear … this isn't the way a neighborhood should be.”
While some argue that autonomous weapons would reduce war crimes by removing human emotion from the battlefield, ethical considerations might end up secondary to the desire to win
“Reason by itself is wrong. As we try to make computers in our image, I fear that we will change ourselves to be more like them”
“If there was a system of ethics that made human society work, we'd have found it by now; but we haven't, because there isn't. It's not rule bound, the same way that human intelligence is not: it doesn't work like a computer program”
“In some ways, when we think AI will solve our problems for us, we're abdicating responsibility for solving our own problems”
Human intelligence is embodied, not separate from our physical existence
A body is necessary to experience emotion; because computers lack bodies, they can’t experience emotion and so can never fulfill relational needs
How we treat robots and AI matters, because the way we treat things shapes who we are as people, whether in virtue or in vice