Monday, October 30, 2023

The Advent of Digital Twins: Should They Replace Caregivers?


It's 2027, and your father is in a rest home suffering from Alzheimer's disease.  You are considering a new service that takes samples of your voice and video clips of you, and creates a highly realistic 3-D "digital twin" that your father can talk with on a screen any time he wants.  The digital twin has your voice and mannerisms, and shows up on your father's phone to remind him to take his medicine and to furnish what the company offering the service calls "companionship."  In the meantime, you can simply go about your own life without having to do the largely tedious work of getting your father to take care of himself. 


Should you go ahead and pay for this service?  Or should you just continue with your daily visits to him, visits that are becoming increasingly inconvenient?


I put this scenario a few years in the future, but academics are already considering the ethical implications of using digital twins in healthcare.  Matthias Braun, an ethicist at Friedrich-Alexander University in Germany, thinks the answer to this question depends on how much control the original of the twin exerts over it.  Applying that notion to the situation I just outlined, who is involved, and what benefits and harms could result?


The people involved are you, your father, and the organization providing the digital twin.  Digital twins are not people—they are software, so while the digital twin is at the core of the issue, it has no ethical rights or responsibilities of its own. 


Consider your father first.  It may be that his mind is so fogged by Alzheimer's disease that he is completely fooled into thinking he is talking on the phone with you and watching you, when in fact he's speaking with a sophisticated piece of software.  So by means of the digital twin, your father may well be persuaded to believe something that is not objectively true. 


But people who deal with Alzheimer's patients know that sometimes the truth has to be at least elided, if not downright falsified.  When my wife's father, who had dementia, lived with us, he would often ask, "Where's your mother?"  His wife had died some years previously.  An answer like "She's not here right now" doesn't strictly violate the truth, but it leaves a false impression.  Nevertheless, it's likely to be a less disruptive reply than something like, "You dummy!  Don't you remember she died in 2007?"


Then consider yourself.  One alternative to the digital twin is to hire a full-time personal caregiver, as some people can afford to do.  Besides the expense, there is the question of whether your father will get along with such a person.  While my father-in-law was with us, we tried hiring a caregiver for limited times so that my wife and I could get a few hours' break from continuous 24-hour caregiving.  Unfortunately, the caregiver—an older man—didn't appeal to his patient, and after one such visit we got an earful of complaints about "that guy," and the arrangement didn't work out.  So in addition to being expensive, personal caregivers don't always do the job the way you hope they will.


From your perspective, the digital-twin caregiver has the advantage that, if the substitution succeeds, your father will think he is really talking with a very familiar person and will be more likely to follow instructions than if a stranger were dealing with him.


So where's the harm?  What could possibly go wrong?


Consider hacking.  No computer system is 100% secure, and opportunities for mischief ranging from random meddling to theft and even murder would obviously be present if someone managed to gain control of the digital twin's software.  It wouldn't be easy, but a lot of very difficult hacks have been carried out by criminals in the past, and if the motivation is there, they will find a way sooner or later. 


Even if criminals aren't interested in messing with digital-twin rest-home caregivers, what if your father starts to like the digital twin more than he likes your real physical presence?  After all, a digital twin could be programmed to have nearly infinite patience in dealing with the repeated questions that dementia patients often ask—"Where's your mother?" being a prime example.  How would you feel if you visited your father one day and he said, "I like you a lot better on the screen than I like you now"? 


And even if the digital twin doesn't manage to alienate your father from you, the original it copies, I can't rid myself of a feeling of distaste: if the twin succeeds in fooling your father into thinking it's really you, a species of fraud has been committed.


At a minimum, even a successful digital-twin substitution would mean that once again in our digital world, an "I-thou" relationship, in Martin Buber's terms, has been replaced by an "I-it" relationship.  One of the most meaningful relationships anyone can have in this life—the relationship with one's father—would be replaced by one that connects your father to a machine.  Yes, a sophisticated machine, a machine that tricks him into thinking he's talking with you, but a machine nonetheless.  In the greater scheme of things, and even leaving religious considerations aside, it's hard to believe that both you and your father would ultimately be better off if your father spent his days talking with a computer while you went about whatever other business you have instead of spending time with him. 


Digital twins are not yet so thick on the ground that we have to deal with them as a routine matter.  But if generative AI keeps up its current momentum, it is only a matter of time before they become a genuine option, and we'll have to decide whether to use them, not only in a medical context but in many others as well.  We should sort out what is right and wrong about their use now, before it's too late.


Sources:  Matthias Braun's article "Represent me: please! Towards an ethics of digital twins in medicine" appeared in 2021 in the Journal of Medical Ethics, vol. 47, pp. 394-400. 
