
Monday, September 16, 2024

Deepfake Porn: The Nadir of AI

 

In Dante's Inferno, Hell is imagined as a conical pit whose ever-deepening rings are dedicated to the torment of progressively worse sinners.  At the very bottom is Satan himself, eternally gnawing on Judas, the betrayer of Jesus Christ. 

 

While much of Dante's medieval imagery would be lost on most people today, we still recognize a connection in language between lowness and badness.  Calling deepfake porn the nadir of how artificial intelligence is used expresses my opinion of it, and also the opinion of the women whose faces have been stolen and grafted onto pornographic images.  A recent article by Eliza Strickland in IEEE Spectrum shows both the magnitude of the problem and the largely ineffective measures that have been taken to mitigate this evil—for evil it is.

 

With the latest AI-powered software, it can take less than half an hour to turn a single photograph of a woman's face into a 60-second porn video that makes it look as though the victim was a willing participant in whatever debauchery the original video portrayed.  A 2024 research paper reports on a survey of 16,000 adults in ten countries, in which 2.2% of respondents said they had been victims of "non-consensual synthetic intimate imagery," which is apparently just a more technical way of saying "deepfake porn."  The U. S. was one of the ten countries included, and 1.1% of the U. S. respondents reported being victimized.  Because virtually all the victims are women, and assuming men and women were represented equally in the survey, that 1.1% of all respondents works out to 2.2% of the women, or roughly one out of every fifty women in the U. S. 
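
For the record, here is that back-of-envelope arithmetic spelled out in a short Python sketch.  The even gender split and the all-female victim pool are assumptions carried over from the paragraph above, not findings of the survey itself.

```python
# Back-of-envelope check of the "one in fifty" figure.
# Assumptions (from the text above, not the survey): essentially all
# victims are women, and the sample was split evenly by gender.

overall_rate = 0.011    # 1.1% of all U. S. respondents were victims
female_share = 0.5      # assumed share of women among respondents

# If every victim is a woman, the rate among women alone doubles:
rate_among_women = overall_rate / female_share

print(f"Rate among women: {rate_among_women:.1%}")       # 2.2%
print(f"About one woman in {1 / rate_among_women:.0f}")  # about 1 in 45
```

The exact quotient is about one in 45, which rounds loosely to the one-in-fifty figure.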

 

That may not sound like much, but it means that over 3 million women in the U. S. have suffered the indignity of being visually raped.  Rape is not only a physical act; it is a shattering assault on the soul.  And simply knowing that one's visage is serving the carnal pleasure of anonymous men is a horrifying situation that no woman should have to face.
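​
As a rough sanity check on that three-million figure, here is the same arithmetic applied to the population.  The population counts below are my own approximations, not figures from the article or the survey.

```python
# Rough sanity check of the "over 3 million" figure.
# Population counts below are my approximations from census estimates.

rate_among_women = 0.022     # 2.2%, from the survey arithmetic above
adult_women = 130_000_000    # roughly, U. S. women age 18 and over
all_women = 168_000_000      # roughly, U. S. women of all ages

print(f"Adult women affected: {rate_among_women * adult_women:,.0f}")
print(f"Women of all ages:    {rate_among_women * all_women:,.0f}")
# Output is roughly 2.9 million to 3.7 million, which brackets the
# "over 3 million" figure in the text.
```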

 

If a woman discovers she has become the victim of deepfake porn, what can she do?  Strickland interviewed Susanna Gibson, who founded the nonprofit MyOwn to combat deepfake porn after she ran for public office and the Republican Party of Virginia mailed out sexual images of her that were made without her consent.  Gibson said that although 49 of the 50 U. S. states have laws against nonconsensual distribution of intimate imagery, each state's law is different.  Most of the laws require proof that "the perpetrator acted with intent to harass or intimidate the victim," which is often difficult even when the perpetrator can be found.  And depending on the state, the offense can be classified as either a civil or a criminal matter, so the appropriate legal countermeasures differ from case to case.

 

Removing the content is so challenging that at least one company, Alecto AI (named after a Greek goddess of vengeance), offers to search the Web for misuses of a person's image, although the startup is not yet ready for prime time.  In the absence of such help, women who have been victimized have to approach each host site individually with legal threats and hope for the best, which is often pretty bad.

 

The Spectrum article ends this way:  ". . . it would be better if our society tried to make sure that the attacks don't happen in the first place."  Right now I'm trying to imagine what kind of society that would be.

 

All I'm coming up with so far is a picture I saw in a magazine a few years ago of Holland, Michigan.  I have no idea what the rate of deepfake porn production in Holland is, but I suspect it is pretty low.  Holland is famous for its Dutch heritage, its homogeneous culture, and its 140 churches in a town of only 34,000 people.  The Herman Miller furniture company is headquartered nearby, and the "What Would Jesus Do?" meme that became popular some years ago originated there. 

 

Though I've never been to Holland, Michigan, it seems to be a place that emphasizes human connectedness over anonymous interchanges.  If everybody just put down their phones and started talking to each other instead, there would be no market for deepfake porn, or for most of the other products and services that run on the Internet either.

 

As recently as 40 years ago, we had a society in which deepfake porn attacks didn't happen, for the simple reason that the technology wasn't available (at least not without a lot of work and movie-studio-quality equipment).  So there's one solution:  throw away the Internet.  Of course, that's like saying "throw away the power grid" or "throw away fossil fuels."  But people are saying the latter, though for very different reasons. 

 

This little fantasy exercise shows that, logically, we can imagine a society (or really a congeries of societies—congeries meaning "disorderly collection") in which deepfake porn doesn't happen.  But we'd have to give up a whole lot of other things we like, such as the ability to use advanced free Internet services for all sorts of purposes other than deepfake porn. 

 

The fact that swearing off fossil fuels—which are currently just as vital to the lives of billions as the Internet—is the topic of serious discussions, planning, and legislation worldwide, while the problem of deepfake porn is being dealt with piecemeal and at a leisurely pace, says something about the priorities of the societies we live in. 

 

I happen to believe that the devil is more than an imaginative construct in the minds of medieval writers and thinkers.  And one trick the devil likes to pull is to divert people's attention from a present, immediate problem to some far-off future threat that may never happen.  That trickery appears to be working: deepfake porn is spreading with little effective legal opposition, while global warming (which is undeniably happening) looms infinitely larger on the worry lists of millions. 

 

Sources:  Eliza Strickland's article "Defending Against Deepfake Pornography" appeared on pp. 5-7 of the October 2024 issue of IEEE Spectrum.  The paper "Non-Consensual Synthetic Intimate Imagery:  Prevalence, Attitudes, and Knowledge in 10 Countries" appeared in the Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems and is available at https://dl.acm.org/doi/full/10.1145/3613904.3642382. 
