Giving Robots Rights Is a Bad Idea – But Confucianism Offers an Alternative


A new study argues against granting rights to robots, instead suggesting Confucianism-inspired role obligations as a more harmonious approach. It posits that treating robots as participants in social rites—rather than as rights bearers—avoids potential human-robot conflict and fosters teamwork, further adding that respect towards robots, made in our image, reflects our own self-respect.

Notable philosophers and legal experts have delved into the moral and legal implications of robots, with a few advocating for giving robots rights. As robots become more integrated into various aspects of life, a recent review of research on robot rights concluded that extending rights to robots is a bad idea. The study, instead, proposes a Confucian-inspired approach.

This review, by a scholar from Carnegie Mellon University (CMU), was recently published in the Communications of the ACM, a journal published by the Association for Computing Machinery.

“People are worried about the risks of granting rights to robots,” notes Tae Wan Kim, Associate Professor of Business Ethics at CMU’s Tepper School of Business, who conducted the analysis. “Granting rights is not the only way to address the moral status of robots: Envisioning robots as rites bearers—not as rights bearers—could work better.”

Although many believe that respecting robots should lead to granting them rights, Kim argues for a different approach. Confucianism, an ancient Chinese belief system, centers on the social value of achieving harmony; individuals become distinctively human through their ability to conceive of their interests not purely in terms of personal self-interest, but in terms of a relational and communal self. This, in turn, requires a distinctive view of rites: people enhance themselves morally by participating in proper rituals.

When considering robots, Kim suggests that the Confucian alternative of assigning rites—or what he calls role obligations—to robots is more appropriate than giving robots rights. The concept of rights is often adversarial and competitive, and potential conflict between humans and robots is concerning.

“Assigning role obligations to robots encourages teamwork, which triggers an understanding that fulfilling those obligations should be done harmoniously,” explains Kim. “Artificial intelligence (AI) imitates human intelligence, so for robots to develop as rites bearers, they must be powered by a type of AI that can imitate humans’ capacity to recognize and execute team activities—and a machine can learn that ability in various ways.”

Kim acknowledges that some will question why robots should be treated respectfully in the first place. “To the extent that we make robots in our image, if we don’t treat them well, as entities capable of participating in rites, we degrade ourselves,” he suggests.

Various non-natural entities—such as corporations—are treated as persons and even hold some constitutional rights. In addition, humans are not the only beings with moral and legal status; in most developed societies, moral and legal considerations prevent researchers from gratuitously using animals in lab experiments.

Reference: “Should Robots Have Rights or Rites?” by Tae Wan Kim and Alan Strudler, 24 May 2023, Communications of the ACM.
DOI: 10.1145/3571721