Robot: I have working definitions for those terms. I
don’t understand. Do you want something for dessert?
Frank: Do you have any programming that makes you
obey the law?
Robot: Do you want me to incorporate state and federal law directly into my programming?
Frank: No, no, no, no! Leave it as it is. You’re starting
to grow on me.
What Are the Ethical Issues?
Robot & Frank is at once a comic caper movie and an
elegiac examination of aging and loss. Its protagonist, Frank, is a retired jewel thief whose children get
him a caretaker robot so he can stay in his home,
even while his dementia progresses. While the movie
seems simple and amusing in many ways, when
approached from the perspective of how it speaks to
the role of robots in our society, it raises some disturbing issues. For instance, it turns out that Frank’s
health is Robot’s top priority, superseding all other
considerations (including the wellbeing of others).
During the course of the movie, we find that Robot
plays a central role in steering Frank back into a life
of crime. Robot’s protocols for helping Frank center
on finding a long-term activity that keeps Frank mentally engaged and physically active. Because preparing for a heist meets these criteria, Robot is willing to
allow Frank to rob from his rich neighbors, and even
to help him.
Robot and Frank develop an odd friendship over
the course of the story, but the movie makes clear
that Robot is not actually a person in the same way
that human beings are, even though Frank — and
through him, the audience — comes to regard him as
if he were. Moreover, for much of the movie, Frank’s
relationship with Robot complicates, and even takes
priority over, his relationships with his children.
At the end (spoiler warning!), in order to escape
arrest and prosecution, Robot persuades Frank to
wipe his memory. Even though Robot has made it
clear that he is untroubled by his own “death,” Frank
has essentially killed his friend. What are the moral
ramifications of this?
How Does Ethical Theory Help
Us Interpret Robot & Frank?
Does Deontology Help? The premise of the movie —
that Robot is guided solely by his duty to Frank —
seems to put deontology at the center. Robot’s duty is
to Frank’s health, and that duty supersedes all other
directives, including the duty to tell the truth, even
to Frank, and to avoid stealing from others in the
community. But in privileging this duty above all
other kinds of duties, Robot’s guiding laws are local,
rather than universal.
The deontological question is whether there is a
way that a carebot can follow the guiding principle of
his existence — to care for the person to whom he is
assigned — without violating other duties that constitute behaving well in society. Robot's choice to
attend to Frank's well-being, at the expense of other
concerns, suggests that these things cannot easily be
reconciled.
Does Virtue Ethics Help? Virtue ethics proves a more
illuminating angle, on both Frank and Robot.
Though it is Robot whose memory is wiped at the
end — and with it, his very selfhood — Frank is also
suffering from memory loss. Like Robot, Frank is constituted in large part by his memories; unlike Robot,
he is a person who has made choices about which
memories are most important. Frank is not only a
jewel thief but a father, though he was largely absent
(in prison) when his now-adult children were growing up. Throughout the movie, Frank frequently reminisces about the highlights of his criminal career, but
only occasionally about his children. At the climax
of the movie, we learn important details of Frank’s
family history that he himself has forgotten, and it
becomes clear that his choice to focus on his memories of thieving has quite literally cost him those
other family-related memories, and with them a
complete picture of himself.
Virtue ethics can also help us understand Robot
more clearly: instead of following universal laws, as
deontology would prescribe, Robot is making
choices according to his own particular goals and
ends, which are to care for Frank. Robot, it seems, is
operating by a different ethical theory than the robot
designer might expect. But though Robot is acting in
accordance with his own dedicated ends, he seems to
lack “phronesis,” the capacity for practical wisdom
that would allow him to exercise nuanced judgment
about how to act. Whether he is genuinely unaware
of the social harm caused by stealing, or simply
prioritizes Frank’s well-being over the thriving of others, Robot’s willingness to accommodate, and even
encourage, Frank’s criminality suggests that his reasoning abilities are not adequate to the task of making socially responsible ethical judgments. Moreover,
Robot works to preserve Frank’s physical health at
the direct expense of his moral well-being, suggesting that Robot has a limited understanding even of
his own appointed task of caring for Frank.
Furthermore, Robot — unlike nearly any human
being — seems untroubled by the prospect of his own
destruction, telling Frank that he doesn’t care about
having his memory wiped. Robot’s complete absence
of self-regard makes him difficult to evaluate with the
same criteria that virtue ethics uses for human actors,
because virtue ethics presumes (on the basis of good
evidence!) that human beings are concerned about
their own welfare and success, as well as that of others. In this way, the movie may be suggesting that
human beings and robots will never be able to fully
understand each other.
However, we can also understand this differently.
Even though Robot’s memory is wiped and he vanishes (the last shot of two identical model carebots in