Monday, June 02, 2025

AI-Induced False Memories in Criminal Justice: Fiction or Reality?

 

A filmmaker in Germany named Hashem Al-Ghaili has come up with an idea to solve our prison problems:  overcrowding, high recidivism rates, and all the rest.  Instead of locking up your rapist, robber, or arsonist for five to twenty years, you offer him a choice:  conventional prison and all that entails, or a "treatment" taking only a few minutes, after which he could return to society a free . . . I was going to say, "man," but once you find out what the treatment is, you may understand why I hesitated.

 

Al-Ghaili works with an artificial-intelligence firm called Cognify, and his treatment would work as follows.  After a detailed scan of the criminal's brain, false memories would be inserted into it, their nature chosen to make sure the criminal doesn't commit that crime again.  Was he a rapist?  Insert memories of what the victim felt and experienced.  Was he a thief?  Give him a whole history of realizing the loss he caused to others, repenting, and rejecting his criminal ways.  And by the bye, the same brain scans could be used to build a useful database of criminal minds, to figure out how to prevent these people from committing crimes in the first place.

 

Al-Ghaili admits that his idea is well beyond current technological capabilities, but at the rate AI and brain research are progressing, he thinks now is the time to consider what we should do with these technologies once they're available. 

 

Lest you think these notions are just a pipe dream, a sober study from the MIT Media Lab experimented with implanting false memories in a group of 200 people, all of whom watched the same video of a crime.  The researchers split them into four groups:  one conversed about the video with a generative chatbot, one with a pre-scripted chatbot, one answered a simple set of survey questions, and a fourth was left alone as a control.  The members of the first three groups did not know that the chatbots and the survey were designed to mislead them with questions that would confuse their memories of what they saw.

 

What the MIT researchers found was that the generative chatbot induced more than three times as many false memories in its subjects as arose in the control group, who were not exposed to any memory-clouding techniques, and significantly more than the survey produced.  What this study tells us is that using chatbots to interrogate suspects or witnesses in a criminal setting could easily distort the already less-than-100%-reliable recollections on which we base legal decisions. 

 

Once again, we are looking down a road where we see some novel technologies in the future beckoning us to use them, and we face a decision:  should we go there or not?  Or if we do go there, what rules should we follow? 

 

Let's take the Cognify prison alternative first.  As ethicist Charles Camosy pointed out in a broadcast discussion of the idea, messing with a person's memories by direct physical intervention and bypassing their conscious mind altogether is a gross violation of the integrity of the human person.  Our memories form an essential part of our being, as the sad case of Alzheimer's sufferers attests.  To implant a whole set of false memories into a person's brain, and therefore mind as well, is as violent an intrusion as cutting off a leg and replacing it with a cybernetic prosthesis.  Even if the person consents to such an action, the act itself is intrinsically wrong and should not be done. 

 

We laugh at such things when we see them in comedies such as Men in Black, in which Tommy Lee Jones whips out a little flash device that makes everyone in sight forget what they've seen for the last half hour or so.  But each person has a right to experience the whole of life as it happens, and wiping out even a few minutes is wrong, let alone replacing them with a cobbled-together script designed to remake a person morally. 

 

Yes, it would save money compared to years of imprisonment, but if you really want to save money, just chop off the head of every offender, even for minor infractions.  That idea is too physically violent for today's cultural sensibilities, but in a culture inured to the death of many thousands of unborn children every year, we can apparently get used to almost any variety of violence as long as it is implemented in a non-messy and clinical-looking way.

 

C. S. Lewis saw this type of thing coming as long ago as 1949, when he criticized the trend of substituting therapeutic treatment of criminals, as sufferers from a disease, for retributive fixed-term punishment, the delivery of a just penalty to one who deserves it.  He wrote, "My contention is that this doctrine [of therapy rather than punishment], merciful though it appears, really means that each one of us, from the moment he breaks the law, is deprived of the rights of a human being." 

 

No matter what either C. S. Lewis or I say, some people will see nothing wrong with this idea, because they have a defective model of what a human being is.  One can show entirely from philosophical, not religious, presuppositions that the human intellect is immaterial.  Any system of thought that neglects that essential fact is capable of serious and violent errors, of which the Cognify notion of criminal memory-replacement is one.

 

As for allowing AI to implant false memories simply by persuasion, as the MIT Media Lab study appeared to do, we are already well down that road.  What do you think is going on any time a person "randomly" trawls the Internet, looking at whatever the fantastically sophisticated algorithms show him or her?  AI-powered persuasion, of course.  And the crisis in teenage mental health, along with many other social-media ills, can be largely attributed to such persuasion.

 

I'm glad that Hashem Al-Ghaili's prison of the future is likely to stay in the pipe-dream category at least for the next few years.  But now is the time to realize what a pernicious thing it would be, and to agree as a culture that we need to move away from treating criminals as laboratory animals and restore to them the dignity that every human being deserves. 

 

Sources:  Charles Camosy was interviewed on the Catholic network Relevant Radio on a recent edition of the Drew Mariani Show, where I heard about Cognify's "prison of the future" proposal. The quote from C. S. Lewis comes from his essay "The Humanitarian Theory of Punishment," which appears in his God in the Dock (Eerdmans, 1970).  I also referred to an article on Cognify at https://www.dazeddigital.com/life-culture/article/62983/1/inside-the-prison-of-the-future-where-ai-rewires-your-brain-hashem-al-ghaili and the MIT Media Lab abstract at https://www.media.mit.edu/projects/ai-false-memories/overview/. 
