Jail time for those caught distributing deepfake porn under new Australian laws


Sharing digitally altered “deepfake” pornographic images will attract a penalty of six years’ jail, or seven years for those who also created them, under proposed new national laws to go before federal parliament next week.

The attorney general, Mark Dreyfus, is expected to introduce legislation on Wednesday to create a new criminal offence of sharing, without consent, sexually explicit images that have been digitally created using artificial intelligence or other forms of technology.

Once passed, the new laws will make it illegal to share any non-consensual deepfake pornographic image with another person, whether by email or personal message to an individual or to a mass audience on a private or open platform.

Flagging the new legislation on Saturday, Dreyfus said the government would not tolerate “this sort of insidious criminal behaviour”.

“Digitally created and altered sexually explicit material that is shared without consent is a damaging and deeply distressing form of abuse,” Dreyfus said. “We know it overwhelmingly affects women and girls who are the target of this kind of deeply offensive and harmful behaviour. It can inflict deep, long-lasting harm on victims.”

The change aims to enable the law to catch up with technology. Currently, it is not illegal to create a deepfake AI-generated or digitally altered pornographic image.

Outlawing the creation of such images outright is not within the commonwealth’s jurisdiction; it would require changes to state and territory law, and moves are under way in some jurisdictions.

But there are commonwealth laws covering the use of a carriage service – telephone, video or internet-based technology – to commit crimes, and the new criminal code amendment (deepfake sexual material) bill adds the dissemination of deepfake porn to the list of those offences.

In establishing the commonwealth offence of sharing these images punishable by six years’ imprisonment, the government is adding a companion aggravated offence covering anyone who was also responsible for creating them. The aggravated offence will attract an extra year’s jail.

The new offences will only cover images depicting adults. Separate existing laws covering the possession of sexually explicit images of real children, or images designed to be childlike, can already capture artificially generated material.

The change is part of a suite of moves aimed at reducing the incidence of violence against women and addressing the role that technology, including social media, plays in spreading and normalising violent, degrading and misogynistic imagery and ideas.

They include a review of the Online Safety Act and proposed measures to address doxing – the use or publication of private or identifying material with malicious intent.


Outlawing the sharing of non-consensual deepfake pornographic material was among the commitments arising from a national cabinet meeting on 1 May, at which first ministers pledged themselves to the goal of ending violence against women within a generation.

On Saturday, Dreyfus said the government wanted users of technology to understand that it is not only the creation of degrading images without someone’s consent – whether depicting real people or created digitally – that causes harm but the act of sharing them.

“The government’s reforms will make clear that those who share sexually explicit material without consent using technology like artificial intelligence will be subject to serious criminal penalties,” Dreyfus said.

In an address to the National Press Club earlier this year, the Australian federal police commissioner, Reece Kershaw, described the challenge posed to law enforcement by the sharply increasing use of AI in pornographic material circulated online, and the need for the law to better reflect rapid technological change.

Kershaw said he believed “a tsunami of AI-generated abuse material” was coming.

“There is no silver bullet and offenders are always looking at how they can beat technological countermeasures,” he said.

A Senate select committee is also currently examining the potential opportunities and dangers that generative AI presents. It is due to report in September.


