Victoria’s premier has said there is no place for “misogynistic conduct” in the state following allegations that deepfakes depicting about 50 female students from a private school in regional Victoria were circulated online.
Police are investigating the “incredibly graphic” nude images which appeared to have been created using artificial intelligence and photos of the girls’ faces taken from social media sites, and then circulated online.
A schoolboy was arrested but has been released pending further inquiries.
Victorian premier Jacinta Allan said her thoughts were with the young women of Bacchus Marsh Grammar and their families.
“There is no place for this disgraceful and misogynistic conduct in Victoria,” Allan said.
“Women and girls deserve respect in class, online and everywhere else in our community, which is why we have made laws against this behaviour and we are teaching respectful relationships in schools to stop violence before it starts.”
Victoria police said officers were informed that a number of images were sent to a person in the Melton area, which is about a 15-minute drive from Bacchus Marsh, via an online platform on Friday 7 June.
A woman named Emily, the parent of one Bacchus Marsh Grammar student and a trauma therapist, said her daughter was “sick to her stomach” over the incident, although she did not appear in the images.
“She was very upset, and she was throwing up. It was incredibly graphic,” she told ABC Radio Melbourne on Wednesday. “Fifty girls is a lot. It is really disturbing.”
“I mean they are children … The photos were mutilated, and so graphic. I almost threw up when I saw it.”
“How can we reassure them that once measures are in place, it won’t happen again?”
The eSafety commissioner, Julie Inman Grant, said the increasing sophistication of generative AI means it is becoming easier for deepfakes to inflict “great harm” on those who are targeted.
“It is becoming harder and harder to tell the difference between what is real and what is fake,” Inman Grant said.
Inman Grant said her office was already receiving reports containing AI-generated child sexual abuse material, as well as deepfake images and videos created by teens to bully their peers.
She said there had been a high success rate in having material taken down when reported. When the material involves someone under the age of 18, it is reported to the Australian Centre to Counter Child Exploitation (ACCCE).
She said there needed to be better safeguards for AI technology to prevent misuse.
“We are not going to regulate or litigate our way out of this – the primary digital safeguards must be embedded at the design phase and throughout the model development and deployment process,” Inman Grant said.
In June the federal government introduced legislation to ban the creation and sharing of deepfake pornography as part of measures to combat violence against women.
The chief executive of Sexual Assault Services Victoria, Kathleen Maltzahn, said the incident showed there was a lack of education about the illegality of image-based abuse.
She said the Victorian Department of Education needed to be better resourced to respond to sexual violence in schools, and the federal government should step up its regulation of social media companies.
“Schools are not equipped to deal with this, and they come to our services, and our services are not funded at the level we need to be able to go into schools and give an emergency response,” she said.
The acting principal of Bacchus Marsh Grammar, Kevin Richardson, said the school was taking the incident very seriously.
“The wellbeing of Bacchus Marsh Grammar students and their families is of paramount importance to the school and is being addressed. All students affected are being offered support from our wellbeing staff.”
– Additional reporting by AAP