Deepfake technology has grown dramatically more realistic over the last few years, and the scary thing is, it’s now being used to create pornography.
Deepfaking is a process in which machine-learning technology is used to fabricate video and audio, usually to convince viewers that a public figure said something they never actually said. That alone is concerning, but the technology’s use in pornography is even scarier.
The fact that pornography can now be made of individuals’ likenesses without their consent or even knowledge is horrifying. This is especially frightening for any woman with any kind of public or online presence, as these deepfakes can easily be created from pictures found online.
Recently, a scandal broke out on Twitch, a streaming platform where creators primarily broadcast footage of themselves playing video games. The Twitch streamer Atrioc, whose real name is Brandon Ewing, was caught with a tab open to a site that sold deepfake pornography. Specifically, it was deepfake pornography of two other Twitch streamers, Maya Higa and Pokimane, whose real name is Imane Anys.
Not only was Ewing being creepy by seeking out deepfake pornography, but he was also actively objectifying two other Twitch streamers, who are effectively his colleagues.
This has been one of the most public deepfake pornography scandals so far. But what’s even more worrying is that most people consuming this material will never be caught, because they aren’t accidentally exposing themselves on a massive livestream.
The legality around this topic is still murky, as nothing quite like it has existed before. Hopefully, legislation will be put in place sooner rather than later to condemn this disgusting, nonconsensual type of pornography.
Many people argue that recent developments in artificial intelligence do more harm than good, pointing to professors’ concerns that students will use the chatbot ChatGPT to write their essays, or to AI-made art taking work away from actual artists. However, both of these possibilities have some arguable positives, as they can inspire humans to create their own unique work with a little help from AI to start them off.
Deepfake pornography, however, is unambiguously vile, with no silver lining. There’s a reason almost nobody is publicly defending this AI-created smut, and it’s because everyone knows that it’s morally wrong to create porn involving people without their consent.
Pornography as an industry already has several blatant faults, such as child sexual abuse imagery and human trafficking appearing on prominent pornography websites. Ethically made pornography does exist; when pornography directly benefits adult, consenting sex workers, that is fine. But overall, the industry is questionable at best and outright dangerous and immoral at worst.
Generally, sexual expression is a good thing. Masturbation in moderation is healthy, and there’s nothing inherently wrong with enjoying ethically produced and acquired pornography. The issue arises when there’s a lack of consent, regardless of whether it involves someone’s real body or an AI-generated image.
No matter the source of the material, consent is key for all matters relating to sexuality. Deepfakes violate consent inherently, so that technology should never be used to create pornography.