Northern Illinois University professor David J. Gunkel and Knox College political science professor Thomas Bell discuss robot rights during the Rights for Robots presentation.

Knox College Presents “Rights for Robots?” A Discussion at Orpheum Theatre

The growth of artificial intelligence and increasing human dependence on it raise important questions about the future role of robots in society. These questions took center stage at Knox College's "Rights for Robots?" discussion at the Orpheum Theatre.

Robert M. Geraci, Knight Distinguished Chair for the Study of Religion and Culture, organized the event.

The event was sponsored by the Knight Fund for the Study of Religion and Culture and comes a week after the closing of re:Generated Prairie, an exhibition by the inaugural Knight Fund Distinguished Artist-in-Residence, Michael Takeo Magruder.

“I really want to use the Knight Fund to do interesting things, and this year the focus is on AI,” Geraci said. “I felt a lecture on robots and rights, whether civil or religious, could be a big thing at the Orpheum. I think it’s an interesting subject and I wanted to do it in the fall so we can tie it in with our First-Year Preceptorial seminars and have students from those classes attend.”

Geraci invited Northern Illinois University professor David J. Gunkel, who has written multiple books on the subject, to lead the discussion. Gunkel presented a range of opinions from multiple sources, weaving in his own thoughts about how we treat one another when it comes to morals and rights.

“It is not really about the artifact; it is about us and the limits of who’s included in and what they are excluded from. It is about how we decide together and across differences, to respond to and take responsibility for our shared social reality,” Gunkel said. “It is in responding to the moral opportunities and challenges posed by seemingly intelligent and social artifacts that we are called to take responsibility for ourselves, for our world, and for those others who are encountered here.”


Northern Illinois professor David Gunkel speaks to the audience at the Orpheum Theatre during the Rights for Robots discussion.

Knox College political science professor Thomas Bell responded to the presentation, admitting to being skeptical about rights for robots while acknowledging that it was not a subject he had given much thought.

“Professor Gunkel challenged the preconceived notions that many of us have about robots and rights,” Bell said. “I think this challenge is essential to move the conversation forward, but I think Professor Gunkel is correct in thinking that it is going to be nearly impossible to sway the vast majority of people, myself included, that robots ought to have rights that are attached to them intrinsically, as we often talk about human rights. I am resistant to thinking about robot rights in this way, but Professor Gunkel’s relational approach to ethics and robot rights has given me new opportunities to consider, some of which I think can move the conversation forward.” 

A question-and-answer session with Knox College students, faculty, staff, and Galesburg community members followed, pressing Gunkel on his views on rights for robots and how they relate to human rights. Questions also touched on AI's impact on the environment, social interaction, and daily tasks; large language models such as ChatGPT; the possibility of AI gaining awareness and consciousness; and, if robots are given rights, how those rights could benefit humans.

Gunkel even cited recent court actions to demonstrate that nonhuman entities already have rights and responsibilities, and how those precedents could lead to AI receiving the same.

“Today, we live in a world where artifacts have rights and responsibilities, not just human beings, but human-made artifacts that have rights and responsibilities—corporations are considered people,” Gunkel said. “They’re not actual living persons like you and I, but they do have rights by legal decree. The reason we did that is not because they have feelings or emotions or are conscious, but we need to fit them into our moral and legal framework. We give them rights and responsibilities, so they align with our moral and legal decision-making. Right now, there are some lawyers going around the world filing lawsuits on behalf of AI following the corporate model.”

He also raised the possibility of classifying robots under a new category of legal rights, beyond the existing categories of ‘person’ and ‘thing’.

“In the subject of law, we have two categories: you are a person, a subject of law, or a thing, an object of law,” he said. “The problem is, if we don’t get these categories right, we can have situations where corporations can hide behind their chatbots and use them as liability shields. So, we have to find a way to figure out who is accountable, who is liable for wrongdoing, harm, and damages that occur. It may be that these categories are too limited to accommodate these challenges.”  

No matter which side of the debate you fall on, Gunkel insists that more people need to join the conversation, because otherwise the choice may be made for you.

“What kind of rights are on the table and which are off the table, that’s a conversation we have to have, but we don’t have that conversation unless we start to ask the questions,” he said.