
OTTAWA – The Liberal government’s legislation to criminalize sexual deepfakes would not cover most of the images that have proliferated on X in recent weeks, a law professor said.
Suzie Dunn, an assistant professor of law at Dalhousie University, said the bill wouldn’t necessarily apply to the wave of sexualized images created by Elon Musk’s Grok chatbot and shared on X.
“That law would not actually address the majority of the adult content that’s been created on Grok,” Dunn said.
Bill C-16 would criminalize the non-consensual sharing of images that show the subject nude, exposing their sexual organs, or engaged in explicit sexual activity.
Images created by Grok — such as edits to photos of women to depict them wearing see-through bikinis — may not meet that standard.
While the subjects in the images are not shown entirely nude, they are “sexualized in their visual aspect but also sexualized in … the way a person’s body is positioned,” Dunn said.
AI Minister Evan Solomon recently pointed to Bill C-16 in a post on X that addressed the controversy over the images, which have triggered a global backlash.
“Deepfake sexual abuse is violence,” he wrote last week. “We will keep Canadians safe by amending the Criminal Code and holding abusers accountable.”
A spokesperson for Justice Minister Sean Fraser referred questions about deepfakes and Bill C-16 to Solomon’s office.
Rosel Kim, senior staff lawyer at the Women’s Legal Education and Action Fund, agreed that it’s “possible that a lot of the deepfake images on Grok will not be caught by that provision.”
She said this shows "the need for a multi-pronged response to a problem like this that goes beyond just the criminal law."
Both Kim and Lloyd Richardson, director of technology at the Canadian Centre for Child Protection, have said the wave of deepfakes shows the government needs to create an online regulator.
They have called for a new regulatory body that would function like the one the Liberal government proposed in 2024 in its Online Harms Act, which never became law.
While the technology to make sexualized deepfakes is not new, X made it easily accessible by allowing users to ask Grok to edit images directly on the platform. That feature has now been restricted to paid users.
Dunn said most app stores have banned so-called “nudifying” apps, but Grok is publicly available.
People using those apps typically shared the images privately with others who wanted to view such content, she noted.
“Right now, anyone who’s using X is getting bombarded with this non-consensual sexual imagery,” Dunn said.
On Monday, Liberal MP Will Greaves said he would suspend his use of the platform because of the deepfakes.
“To be clear: this is sexual exploitation, it is unacceptable, and it may be illegal,” he wrote in a statement posted on X.
Over the weekend, Solomon said Canada is not considering banning X. Since then, the government has not responded to questions about whether it will stop using the platform.
This report by The Canadian Press was first published Jan. 13, 2026.

