It’s quite true that in my travels, I have sampled and studied my share of porn. I’ve noted addiction and its effects on the brain, I’ve seen compulsive viewing, and I know how it has branched out from “backroom” VHS rental stores, adult bookstores, and peep shows to convenient access all over the internet, from the mid-1980s to the present.
I know the genres and the niche genres within them. I’ve seen everything from Deep Throat with Linda Lovelace to anonymous “amateur” shorts scattered everywhere. You name it, and I have seen it.
Lately, I regret ever having been exposed to porn by my parents. Through the years, I’ve looked at pictures, bought magazines, watched 8mm projector reels, and more.
The effects porn has had on me have been devastating. After seeing “zoo” porn (humans, usually women, with animals), you’d be forgiven for thinking there’s nothing else to see, that you’ve reached the bottom of hell.
Trust me, you have not.
Some things you may hear about but never see. If you’re lucky.
It’s a terrible pit of bubbling tar waiting to drag in anyone it can and drown them.
Until recently, meaning roughly the past 15 years, nude models who bore some resemblance to a singer or actress had to do. And while some celebrities posed for Playboy over the years, beginning with first-issue centerfold Marilyn Monroe, most turned the offers down. I’m not getting into the New York and California party scenes, or the top fashion models who bared their breasts and became (usually mediocre) actresses. Now there’s a much more alarming porn trend, known as “deepfakes,” that we should all be concerned about.
When it became news last week that Taylor Swift had been targeted with some very realistic fake porn videos and pictures, I didn’t hear about it right away. But I had been aware since at least 2008 that altered pictures were making it onto porn sites. Those began with convincing porn pics of customers in, say, a Walmart, where you hardly need to add to or subtract from a person’s image to make it lewd or insulting (I once saw a woman in Walmart wearing sheer pantyhose with no panties).
That older type of image “enhancement” has always been around, but home computer systems made it easier. For Joe Lunchbox, it wasn’t doable, but there were enough dedicated anarchist-perverts around who could pull it off.
As I tried to gauge the popularity of fakes in both film and photo, I became aware that they were not sectioned off as separate genres or niches. They were mixed in with everything else. Sometimes a fake was obvious: I could recognize the body of a nude model with a faked head on it. Crude and laughable at first. Now? Well, now you can’t tell.
Deepfakes represent a danger we’re only beginning to comprehend. They can be used for revenge porn, applied to selfies of real school kids, or turned against teachers, political figures, and, of course, celebrities.
Considering my recent rants about Taylor Swift, I wrote exactly how I felt, but this is wrong. Nobody should be the target of malice at a level I cannot reach or even comprehend.
The legal questions are many and frightening. Will victims be able to sue (and will that even help once the damage is done)? What wounds can these fakes inflict, and will they be used to break up marriages and families? Can you set someone up with this tech?
How about placing someone who’s innocent at a crime scene?
Because if we’ve learned anything, it’s that technology keeps getting more and more powerful.
I’m sorry for Taylor Swift and Blake Lively. They shouldn’t have to deal with this humiliation and cruelty. I may not be a Swiftie, but this stuff makes me sick. And I’ve seen things you’d never believe; if you saw them, some of you would puke.
This is something I find sickening and terrifying at the same time. But can we stop it?
I don’t believe so.