Deepfakes Causing Strife For Female Streamers
It has been an exhausting month for women in the streaming industry, and it has everything to do with deepfake porn and consent.
If you have never heard of deepfake porn, you are not alone. Deepfake pornography is created when computer technology maps an individual's face onto a sexually explicit video or image. In simpler terms, it means taking someone's likeness from a selfie they posted, placing it on a digital body doing whatever the creator wants, and producing a fake image that appears very real.
This topic was thrust further into the spotlight on January 26th during a live stream by Brandon Ewing, who goes by Atrioc online, when he accidentally revealed a browser tab filled with deepfake porn of several of his fellow streamers, including people he knows and works with.
I will not reveal most of their names, as these images were not made with their consent, nor will I name the website, in order to limit the spread of the images.
Since then, Ewing has released a video tearfully apologizing for his actions and announcing he will be stepping away from streaming for a while; however, the damage has already been done.
In the aftermath, popular female streamer QTCinderella, or Blaire as she is known in real life, released a tearful statement about the incident during a stream.
“This is what it looks like to feel violated,” she said. “This is what it feels like to be taken advantage of; this is what it looks like to see yourself naked against your will being spread all over the internet. This is what it looks like.”
Blaire also mentioned in this stream that she is pursuing legal action against the deepfake website hosting her doctored image. Since then, the creator of the website has scrubbed the domain from the internet and issued an apology over the controversy.
“The best course of action I have understood is to just wipe my part off the internet and help decrease the number of future videos of those involved. You will not see me pop up again.”
However, while Blaire has accepted this apology, she is still pursuing legal action, as someone has to set the legal precedent for deepfake pornography. Technically speaking, deepfake porn is not illegal despite having real-world consequences for the victim's life.
This is for a few reasons. One of the biggest is that many people cannot grasp that online figures are real people, not just faces on screens; as a result, consent becomes blurred for these individuals, especially with this problem skewing toward women.
Many see a prominent woman in the streaming industry post a selfie that may have sexual connotations and believe that to be consent to use her likeness in more sexually explicit material, when that is simply not the case.
The second reason is that, unfortunately, the law is simply slow. The technology is developing faster than the law can keep up.
The law is in a good place with crimes such as revenge porn, but it took roughly 10 to 20 years to get there. Because of that, by the time one problem is solved, another, more complicated one takes its place, and it is very likely that by the time deepfake pornography is made illegal, something even worse will have emerged.
The third issue is that laws in America are written to apply within state lines, but the internet is not contained by them. People in Texas have the same access to the internet as those in, say, Illinois or California, and the issue extends to other countries such as Japan or Australia. It becomes nearly impossible for a police force in Iowa to arrest someone in Germany.
The best way to fight deepfake porn running rampant may just be to start copyrighting people's likenesses and bringing civil suits against the makers of these doctored images and videos.
However, we will have to keep a close eye on this issue and hope that the law can adapt to it faster, because the technology is only getting more sophisticated. If it takes little effort to doctor someone's image or voice now, what will things look like in ten years?