Review of Robot Rights by David J. Gunkel



BOOK REVIEW

David J. Gunkel (2018). Robot Rights. MIT Press, Cambridge. ISBN: 9780262038621

Kestutis Mosakas

© Springer-Verlag London Ltd., part of Springer Nature 2020

Kestutis Mosakas, Department of Philosophy, Faculty of Humanities, Vytautas Magnus University, V. Putvinskio g. 23, 44243 Kaunas, Lithuania

The recent literature on the ethics of artificial intelligence (AI) and robotics has been dominated by inherently anthropocentric questions, such as the impact of AI agency on human beings and how to utilize AI without violating the fundamental values of society. Meanwhile, the question of humans’ moral duties towards artificially intelligent robots has been relatively overlooked. David Gunkel offers a lively, provocative attempt to address the important issue of whether robots can or should have moral and/or legal rights. Gunkel adopts an approach akin to Socratic investigation, surveying various experts on the question at hand to determine whether it is the right kind of question to ask in the first place. He takes into consideration a broad range of opinions, including those of philosophers, ethicists, IT and robotics specialists, legal theorists, policymakers, and even science fiction writers. His varied sources range from academic articles, books, and legal documents to blog posts, news articles, podcasts, and interviews, along with science fiction. His excellent use of these sources makes the reading process an engaging and stimulating experience.

Ultimately, Gunkel’s investigations lead him to conclude that the standard approaches to moral consideration are all lacking in some vital aspects and that a radically different perspective is needed. He sidesteps the “yes or no” dichotomy and instead invites us to think differently about the very question of robot rights. In this way, his book presents a challenge to traditional moral philosophy.

To help readers of this review understand Gunkel’s method, I must first outline some of the heuristics he uses to approach the question of robot rights. The first is the so-called “is-ought” distinction, also known as “Hume’s guillotine.” In his famous A Treatise of Human Nature, David Hume notes that in discussions about values, such as those about ethics and morality, there is often a conceptual slippage from mere facts about the world (i.e., how the world is) into the realm of values (i.e., how the world ought to be). According to the most common contemporary interpretations of Hume, this leap cannot be justified: if an argument contains only facts as its premises, we cannot draw a conclusion that contains a value judgment, for there is a categorical difference between facts and values. Gunkel is not concerned with bridging the “is-ought” gap or with questioning whether the fact-value distinction is a pertinent one. Rather, he contends that the most common approaches to the question of machine rights tend to follow what we could call the “is-ought script.” In other words, we observe wha