A fake nude image of Emma's daughter was circulated by peers. She'll never forget what police said.

Content warning: This article discusses suicide.

Emma Mason's daughter, Matilda 'Tilly' Rosewarne, was an eight-year-old girl who loved her family and had a passion for dance when the bullying began.

Via social media, Tilly's classmates had been relentless. They used Snapchat and a Belgian pornography website to circulate fake nude images of her.

In a recent submission to the Joint Select Committee on Social Media and Australian Society, Emma recounted: "I left work and came home and by the time I got home, Tilly was hysterical. I immediately sent a text to the principal of her school asking him to contact me urgently. We exchanged messages, and I sent him the images I had from Snapchat."

"After the conclusion of school that day, we came to understand how far the image had been spread amongst students in Bathurst. By 6pm, I had called the ambulance as Tilly attempted suicide. Despite the nude image not being of Tilly, she knew that people believed [the perpetrator] saying that it was her in the fake nude image. In a small town this was catastrophic."

After seven years of incessant online and in-person bullying, both at school and around her hometown of Bathurst, New South Wales, Tilly took her own life in 2022. She was just 15 years old.

Matilda 'Tilly' Rosewarne was just 15 years old. Image: Supplied.

When Emma had contacted her local police station in relation to the spreading of the nude image, she was deeply frustrated at the response.

"I had all the evidence available. There was enough evidence — I say that as a lawyer. But ultimately very little was done. I recall that the police officer said to me, 'It's really hard to stop this happening.'"

Explicit deepfakes — sexual images and videos falsely depicting a person's likeness without their consent — have increased on the internet by as much as 550 per cent year on year since 2019. They often involve people's faces being superimposed onto pornography.

While the most convincing deepfakes are generated by powerful computers and sophisticated software, anyone with a laptop and some technological literacy can create one.

Last week, Australia passed its proposed deepfake laws. The creation and sharing of non-consensual sexual images, including those which are digitally altered, will be criminalised. People who share such images will face up to six years in prison, while creators could be subject to seven years.

Ninety-six per cent of deepfake victims are women and girls.

"As a mother I'm so distressed by this — the technology that is in the hands of kids, and in the hands of harmful people," Emma tells Mamamia.

The eSafety Commissioner's office says it regularly receives reports of AI-generated child sexual abuse material, and of deepfake videos created by teens to bully and humiliate their peers. eSafety Commissioner Julie Inman Grant acknowledges how devastating it can be for a person whose image is hijacked or altered without their knowledge and consent.

"The criminalisation of the sharing of non-consensual AI-generated sexually explicit deepfakes will serve as an important deterrent to this morally repugnant conduct. The latest legislation also adds powerfully to the existing interlocking civil powers through our regulatory complaints schemes and through our proactive Safety by Design initiative," Julie explains to Mamamia.

"But we can't just arrest or regulate our way out of these challenges."

Noelle Martin is an activist, researcher, keynote speaker and lawyer. She is also a survivor of image-based sexual abuse.

She feels "complex and frustrated" about the new deepfake laws.

"I'm not convinced that these laws will really tackle the problem. The difficulty is that image-based abuse is a global issue, and it's happening all over the world. There is very little that the Australian Government can do when you have perpetrators committing these online crimes from overseas against Australians here," she notes.

"I hope these new laws have been survivor-led, but I don't think they are. The frustrating thing about being an advocate in this space is navigating all the politics of the issue — both racial politics, and also sexual politics."

Noelle continues to fight for human rights and women's rights in the digital age.

More of the onus is now being placed on AI companies to reduce the risk that their platforms can be used to generate highly damaging content.

Noelle also highlights the need for compensation for victim-survivors.

"We need to support victims and acknowledge the harms to survivors are lifelong. It impacts people's employability, their reputation, their earning capacity and more. Ideally, these billion-dollar digital platforms could potentially be fined in some capacity if they are found to have played a role in image-based abuse. Then these fines should be put in a fund for victims," she says.

Noelle Martin. Image: Instagram @noellemmartin.

"What I'm hoping for in the future is that there is some sort of global, collaborative, coordinated response from governments around the world. And if we reach that point, survivors need to have a huge, significant seat at the table."

Emma is also advocating for the Federal Government to act and raise the age at which children can access social media from 13 to 16. When it comes to the criminalisation of non-consensual sexual images, she feels "optimistic" but will wait and see how the laws work in practice.

"Every time we hear that the law is going to change, you wonder to yourself if it will actually change though. You need strong police work and investigations and access to put the laws into effect," she notes.

"I absolutely want any person who ever does something of this nature to face justice. I want there to be a hard response. The harms that occur as a result of these things are long-standing — you can't bring these images back."

For more information, you can visit eSafety.gov.au. There are also free webinars for parents and carers around online safety and emerging technologies, including generative AI.

Feature Image: Supplied.
