Deepfakes and gender-based violence


In 2017, journalist Samantha Cole discovered someone on Reddit who was using open-source artificial intelligence (AI) technology to create homemade, non-consensual pornography and sexual images using the faces of celebrities. This person called themselves "deepfakes."

Early deepfakes were easy to spot because they were glitchy and looked unrealistic. However, the technology has become so sophisticated that anyone with a passable understanding of computers and AI, and access to a reasonably powerful machine, can now easily make, distribute and sell a convincing deepfake.

All a deepfaker needs to do is find photographs of the person they want to target on a public source like Instagram, and they can create highly realistic sexualized images or pornography.

“There are real sexual autonomy questions and harms that come with just the creation,” Suzie Dunn, assistant professor at Dalhousie’s Schulich School of Law, told rabble.ca during an interview.

Dunn went on to say, “Now, you can have people who can really sexually violate people in pretty serious ways without actually even having to have contact with them.”

A new form of gender-based violence

The creation of a sexualized deepfake, in and of itself, is a violation of sexual autonomy – the right of an individual to make decisions about their own body and sexuality without interference. 

Publishing deepfakes online while claiming they are real sexual content is tantamount to non-consensual distribution of intimate images. That’s because the resulting public harm is the same.

The Sensity AI report, The State of Deepfakes 2019: Landscape, Threats, and Impact, found that 96 per cent of deepfakes were used to create non-consensual sexual content. Of those deepfakes, 99 per cent depicted women.

This is the newest form of gender-based violence.

“In Canada, and globally, the harms of sexual deepfakes are being recognized. When these deepfakes started coming out of that Reddit site, a lot of people were posting them on Pornhub and different places. Quite immediately, most social media companies, including Pornhub, created policy that said that type of content is not allowed and we include it under the same category as other non-consensual image rules that you can’t post non-consensual content on our websites,” Dunn said.

Australian Noelle Martin was targeted by someone who found her on the internet. They began making fake, photoshopped pornography and, eventually, deepfakes of her.

Martin advocated for legal changes that included adding the term "altered images" to Australia's non-consensual image laws. Including altered images means that anyone who shares a sexual image without consent – whether a true likeness or a deepfake – is subject to the law.

Canada's criminal law does not include that provision. If a sexual image is released without your consent, it must actually depict your naked body for charges to be laid.

Civil statutes in British Columbia, Alberta, Saskatchewan, Nova Scotia and Newfoundland and Labrador do include altered images, providing an option to sue for damages.

In provinces and territories without this option, someone could still file a civil lawsuit, but it would require a novel legal argument – the first of its kind, making it a precedent-setting case.

Particular groups of women are more likely to be targeted by deepfakes, including journalists, gamers and users of the live streaming service Twitch.

Deepfakes have real consequences for victims

“Once a deepfake is associated with your name and is Googleable, even if people do know it’s fake, it impacts the way people perceive you,” Dunn explained.

Deepfakes can have significant economic impacts, especially when labour laws are weak and fail to protect those targeted by this content.

The repercussions may cause women to self-isolate or suffer mental health issues. Women may remove their images from online outlets. However, women's careers increasingly require them to be online and have a public presence, so choosing to take those images down may affect their income and career advancement.

“There’s a real loss of sexual autonomy that happens for women. And, when we lose our sexual autonomy, people will have a variety of reactions. Some people might find it a mere annoyance, but for other people it can be devastating, life ruining,” stated Dunn. 

However, Dunn says the law is progressing, and most legal responses so far address the distribution of pornography without consent.

Some porn sites have content moderation rules around who can post and what can be posted. Yet on Meta's Instagram and on TikTok, even though there are clear content moderation rules, they are not always enforced.

“When I talk about pornography, what I’m talking about is content that was specifically made for public viewing. When I’m talking about sexual abuse materials or image-based abuse, this is content that is put onto pornography sites, but I don’t think we should categorize it as pornography. It’s abuse material,” explained Dunn.

When image-based sexual abuse content is uploaded to porn sites, there are usually a number of boxes that need to be checked, including age and consent verification. Once all the boxes are checked, the content is made public.

However, Dunn points out that it’s impossible to look at a video and know whether it was consensual or not.

That remains one of the big challenges, demanding ongoing conversations about the obligations of pornography sites and how they plan to ensure that everyone involved consented to the material being uploaded.

According to Dunn, unless strong ethical practices are built into their systems, websites can very easily end up hosting image-based sexual abuse content.

Dunn also points out that, as technology-facilitated violence evolves, society and the legal system need to create a language for it. Technology-facilitated abuse must be named before it can be categorized and the harms it inflicts identified and addressed.

Currently, the Criminal Code of Canada does not include altered images. However, Dunn says including them opens up a larger conversation about where the boundary between criminal and non-criminal content lies. Do the images have to be extraordinarily realistic? Should the definition extend to sexualized stories and cartoons of individuals?

These complex questions can only be addressed through growth within both the criminal and civil systems, focused on technology-facilitated gender-based violence like deepfakes.

For changes to be meaningful, they need to be reinforced with improved regulations requiring social media companies and porn sites to have rules in place barring non-consensual sexualized images and pornographic content. There also need to be rules for handling situations where this content does get posted, ensuring it is taken down in a timely manner.

Dunn cautions that there needs to be a differentiation between consensual sexual expression on the internet and sexual abuse. This is important because people making an effort to get rid of abusive sexual content sometimes want to sweep away all sexual content.

"There's a really important role that sexually expressive content plays in our society. So, when we're thinking about improving the way that sexual content is available on the internet, I think we have to be careful about not throwing the baby out with the bathwater. We want positive healthy sexual expression in our physical selves and that's different than sexual assault. In the same way that we can have positive, healthy sexual expression in digital spaces, things like kids' sites about sexual health information could be caught up in the sweep," Dunn said.

Deepa Mattoo, lawyer and executive director of the Toronto-based Barbra Schlifer Commemorative Clinic, told rabble.ca, "We haven't seen a case come forward yet, but when a case does come forward, it is going to be a challenge. Online violence in general is on the rise and AI definitely has played a big role."

Mattoo does not think there should be a different legal test for abusive online content. If it is AI-generated and done with the same intention of sexually or mentally harassing a person, attacking their reputation, or blackmailing them, then it is a crime and should be seen as sexual violence.

“It is part of a pattern of coercion and control that should be taken even more seriously because the intention of the act shows that you planned it and you applied yourself to the extent of using the technology. So, from that perspective, it should be taken even more seriously than other crimes,” Mattoo stated. 

Mattoo believes there should be no excuse of any kind when someone deliberately procures images and then sexualizes them with the intent of trying to harm the victim.

Mattoo points out that science has established that trauma resulting from sexual violence can impact the victim for the rest of their life.

Mattoo would like to see the jail system become an education system because, as she sees it, keeping someone inside a small space and curbing their capacity isn't enough to actually change them.

“This is about something happening without your consent, your agency was taken away. So, the psychological harm is much deeper. What is really needed is a change in society so that these crimes stop happening,” Mattoo said.


