
Deepfake abuse app ban: Anika Wells announces Digital Duty of Care bill to target online abuse

Caitlyn Rintoul, The Nightly

Minister for Communications Anika Wells holds a press conference at Parliament House in Canberra. Credit: Martin Ollman/NCA NewsWire

Australia’s Communications Minister has announced new plans to ban AI apps used to create deepfake nude images and undetectable online stalking tools.

Anika Wells said it was an issue “too important for us not to act” as cases of apps being used to abuse, humiliate and harm continued to rise in Australia, especially among children.

Ms Wells said she would work with the tech industry and abuse prevention groups on a framework to restrict access to deepfake abuse tools, but has not yet specified a timeframe.

“Abusive technologies are widely and easily accessible and are causing real and irreparable damage now. This is too important for us not to act,” she said.

A deepfake is an image or video in which a person’s face or body has been altered to make it appear they are doing or saying something that never actually happened.

Often, they are used to share sexually explicit material without consent.

Ms Wells’ promise of a future crackdown came as advocates, politicians, government and business leaders convened a roundtable in Canberra on Tuesday on the use of AI in abuse.

Ms Wells said the Albanese Government was working to progress a new Digital Duty of Care bill.

The legislation would place the onus on digital platforms to proactively keep Australians safe and better prevent online harms.


It was first announced with much hype by her predecessor, Michelle Rowland, in November last year, but almost a year on it has not progressed further.

Asked about the Digital Duty of Care on Tuesday at a press conference and in multiple TV interviews, Ms Wells promised it was under way.

“The Digital Duty of Care is part of the recommendations in the Online Safety Act reform. So, we are working with the Safety Commissioner. That work, together with the work that we do with the Safety Commissioner, continues,” Ms Wells said.

The Digital Duty of Care model was a key recommendation of the independent statutory review of the Online Safety Act 2021 undertaken by former head of the ACCC’s Consumer Protection Branch Delia Rickard.

The roundtable also brought together parliamentarians from across the political spectrum, including independent MPs Kate Chaney and Zali Steggall, as well as Greens senator David Shoebridge.

In July, Ms Chaney introduced a private member's bill to amend the Criminal Code to criminalise deepfake sexual material, with breaches punishable by up to 15 years in prison.

Ms Chaney had proposed making it illegal to download tools for creating deepfake sexual content, as well as images used to train ‘nudify’ apps. The bill is currently before Parliament.

Asked if Labor would support Ms Chaney’s previous work, Ms Wells acknowledged that the independent MP had proposed “good” suggestions but said she wanted to go further.

“I think what she’s doing is good. What I’m trying to do is stop it at the source,” Ms Wells said.

“So, this is about maybe looking at ways that maybe the eSafety Commissioner could block these apps from being on app stores or websites.

“Maybe we could direct the platforms to take them down, or to be responsible for taking them down in the first place.”

The Albanese Government has already introduced laws targeting the sharing of deepfake porn and other sexually explicit AI-generated content.

Australia’s eSafety Commissioner, Julie Inman Grant, also joined Tuesday’s discussions. She has previously called out big tech for putting profits before child protection when it comes to combating the explosion of deepfakes and other sexual abuse material.

Ms Inman Grant last month said companies weren’t answering key questions she’d put to them and weren’t employing the full arsenal of tools to fight the depraved crimes occurring on their services, even though they had gone to the effort of creating them.

The crackdown on deepfakes and Digital Duty of Care comes with the Government continuing to work towards a social media ban for under-16s, which is expected to be implemented on December 10.

Former Australian of the Year Grace Tame also attended and, at a press conference after the event, called for a new national plan to educate Australians about grooming.

She warned the worst abusers are calculated offenders who manipulate both victims and communities.

Ms Tame said while respectful relationships and consent education had been important milestones, grooming awareness must be the next step.

“I want to put a focus on prevention. We have a huge gap in prevention of child sexual abuse, specifically in how offenders are able to groom children,” the advocate for sexual assault survivors said.

“I’m advocating for grooming prevention education as a stand-alone measure of prevention for child sexual abuse.

“When we’re talking about adults harming children and using methods like grooming, it is a completely different framework that is required to empower not just children, but all adults who work with children to actually detect their methods.”

The Government’s call to explore how to ban the apps was welcomed by DIGI, the peak body representing the tech sector.

In a statement on Tuesday, DIGI’s head of regulatory affairs, Jennifer Duxbury, said its members were already “taking proactive steps” against nudification apps.

“We support ecosystem approaches to tackling harm and look forward to working constructively with the Government on the details of the proposal,” she said.
