real life

'This is what pain looks like.' What happened when Blaire found deepfake porn of herself.

28-year-old Blaire knows exactly what it's like to be violated online.

As someone with a public profile who creates content on the streaming platform Twitch under the name 'QTCinderella', Blaire is inundated with comments from complete strangers every day. Those comments are often sexually inappropriate and uncomfortable to read.

These comments spiked recently when Blaire received a flurry of DMs saying an 'adult video' of her was going viral on a pornographic website. Blaire was confused and immediately distraught. She had never created such a video. She knew that for a fact. So what had happened?

Blaire quickly learned she had become a victim of deepfake porn.

Her face had been pasted onto another person's body to make it look as though Blaire was genuinely appearing in a pornographic video. And that video was being seen by thousands of people.

Watch an explanation of what exactly deepfakes are. Post continues below.


Video via 7News.

While this is a situation that would make many retreat from online spaces, Blaire decided to instead 'go live' on a Twitch stream. 

"I wanted to go live because this is what pain looks like. F**k the f**king internet. F**k the people DM'ing me pictures of myself from that website," she said during a stream, visibly distraught.


"It should not be part of my job to have to pay money to get this stuff taken down. It should not be part of my job to be harassed, to see pictures of me 'nude' spread around. It should not be something that is found on the internet. The fact that it is, is exhausting. This is what it looks like to be violated, to be taken advantage of."

Blaire doesn't know who the perpetrator is; they're a stranger on the internet. But she knows the website that decided to publish and promote the fake pornographic video without her consent, and she plans to take legal action against them.

But in the meantime, she also wanted to use her platform to 'call out' a fellow Twitch content creator for being a part of the problem.

Atrioc (real name Brandon Ewing) recently said in a livestream of his that he had purchased deepfake pornography for his own entertainment. And those deepfakes were of popular female Twitch streamers, like Blaire – none of whom had consented to those videos being made or viewed.

Blaire 'QTCinderella'. Image: Twitch.


He has since apologised, but by promoting the website on which he was watching the material, he drove thousands and thousands of clicks and views to it – further harming victims like Blaire.

"F**k Atrioc for showing it to thousands of people. F**k the constant exploitation and objectification of women, it's exhausting," she said.

"If you are able to look at [videos of] women who are not selling themselves or benefiting off being seen sexually, you are the problem. You see women as an object, you should not be okay doing that."

Blaire isn't the only person to fall victim to deepfake pornography.

Sensity AI, a research company tracking deepfake videos, found in a 2019 study that 96 per cent were pornographic. And we can only guess how many of the people in those videos had their identity used without their consent.

Several years ago, Reddit and PornHub banned deepfake porn, but smaller pornographic sites still frequently post violative material like deepfakes with few if any repercussions.

In the United States, around 46 states have some form of ban on revenge porn, but only the laws in Virginia and California cover faked and deepfaked media.

And in the UK, the government recently amended the Online Safety Bill in a bid to crack down on deepfakes.


In Australia, the laws surrounding this issue are not entirely clear, but the problem is being addressed.

eSafety Commissioner Julie Inman Grant told SBS' The Feed that her office has seen a rise in deepfake technology being weaponised to create pornographic videos, often targeting people with public profiles.

"As this technology becomes more accessible, we expect everyday Australian citizens will also be affected. Posting nude or sexual deepfakes would be a form of image-based abuse, which is sharing intimate images of someone online without their consent."

Image-based abuse is a breach of the Online Safety Act 2021, and under the Act, perpetrators can be fined or, in some jurisdictions, face jail time. The Commissioner also said that any Australian whose images or videos have been altered to appear sexualised and are published online without consent can contact eSafety for help to have them removed.

As fellow Twitch streamer Stephanie Peloza told Refinery29, she is no longer surprised when she sees stories of well-known women being harassed online. And that's a tough pill to swallow.

"Basic respect and the basic rules we follow when we meet people in person, that stuff kind of goes out the window in these digital spaces," she said.

"If a person has a question about whether or not they should do something with another person's image, they really should consider whether they could even ask the person that question. And if they can't, I think that's a pretty clear answer."

Feature Image: Twitch.