“Deepfakes” will be targeted in the Liberal government’s planned online harms bill.

The government’s announcement comes in response to sexually explicit AI-generated images of pop singer Taylor Swift being disseminated online.

The bill, promised in 2021 but delayed for some time, would establish new rules on what is deemed permissible internet content.

The Trudeau government announced that posting non-consensual intimate images would be prohibited under the proposed legislation, which could also tackle misinformation and so-called hate speech. 

It has yet to be determined whether AI-created videos fall under the Criminal Code’s existing definition of intimate images.

“Keeping our kids and young people safe online is a legislative priority for our government — especially given the evolving capabilities of AI,” said Justice Minister Arif Virani in a statement to the Montreal Gazette.

Virani said deepfakes can “exacerbate forms of online exploitation, harassment and cyberbullying.”

The term deepfake refers to highly realistic video simulations that use a person’s likeness and voice, often depicting celebrities, politicians or other public figures.

Sexual deepfakes include nude or pornographic content, meaning they may fall under the Protection of Communities and Exploited Persons Act, which came into force in 2014.

The act “seeks to protect the dignity and equality of all Canadians by denouncing and prohibiting the purchase of sexual services, the exploitation of the prostitution of others, the development of economic interests in the sexual exploitation of others and the institutionalization of prostitution through commercial enterprises, such as strip clubs, massage parlours and escort agencies that offer sexual services for sale,” according to the Government of Canada website.

One Conservative MP said it’s important to protect not just celebrities from these sorts of depictions, but also ordinary people.

“The Taylor Swift example is a high-profile case, but there are examples in Canada of women facing this already — women that do not have the resources that Taylor Swift has,” Conservative MP Michelle Rempel Garner told the Montreal Gazette.

She pointed to a case in Winnipeg last year in which AI-generated images of underage female students circulated online.

While Canada’s Criminal Code already contains provisions dealing with the distribution of intimate images, these do not specifically cover altered or AI-generated images.

Provinces deal with these issues through civil resolution tribunals, where victims can apply to have photos removed and potentially receive compensation. At the federal level, however, there is no clear law on the matter, as existing legislation predates the invention of deepfakes.

According to former CRTC vice-chair Peter Menzies, amending the existing criminal law would be an easier way to address the problem than bringing in new legislation.

“I think you should always take the fastest, most efficient route to a solution if it’s available, and I see this one as readily available,” Menzies told the Montreal Gazette. “You probably only have to change about four or five words.”

“I would not like to see this become something that’s used for political purposes.”