Deepfake-posting man faces huge $450,000 fine

May 28, 2025 - 23:40

A man is facing a fine of AU$450,000 after he published deepfake images of prominent Australian women on the now-defunct MrDeepfakes website. That’s if Australia’s online safety regulator gets its way.

Anthony Rotondo faces charges of posting these and other explicit deepfake images to the MrDeepfakes website, which closed down earlier this month.

According to a court order approving an arrest warrant for him in October 2023, the 55-year-old posted pictures of the Australian public figures online, but when the country’s eSafety Commissioner, which regulates online safety, asked him to take them down in May 2023, he responded:

“I am not a resident of Australia. The removal notice means nothing to me. Get an arrest warrant if you think you are right.”

Rotondo, who lived in the Philippines, traveled to Australia on October 10, 2023, apparently to attend a car race on the Gold Coast. On October 20, the Office of the eSafety Commissioner obtained an injunction against him in the Federal Court of Australia, ordering him to take down the images. Instead, he sent another deepfake image to media outlets and to the eSafety Commissioner’s office. Police arrested him at an apartment in Brisbane, Queensland, a few days later.

Once in custody, Rotondo gave police his access credentials for the website, enabling them to take the images down. Nevertheless, a federal judge fined him $25,000 for contempt of court. He was also charged with six counts of obscene publication, one of which involved a minor, along with a further charge of endangering property by fire.

The eSafety Commissioner is now pushing for a fine of AU$450,000 over the obscenity charges.

What is a deepfake?

A deepfake is a piece of AI-generated media that imitates a real person. Today the technology is most commonly used to project an existing person’s likeness onto someone else’s image or video. Some deepfakes are still images, while others combine video and audio. Audio-only deepfakes are also used to impersonate people’s voices.

Deepfake technology can be used for good, such as recreating someone’s voice after they lose the ability to speak. There have also been some imaginative uses, such as a deepfake of a murder victim delivering an impact statement in court. Some have explored using the technology to animate images of deceased loved ones.

However, many uses of deepfakes are less savory. Scammers use deepfake videos of popular public figures to lure victims into fraudulent investments, and deepfake voice recordings to fool family members into thinking their loved one has been involved in an accident or arrested. Deepfake porn, in which a victim’s likeness is projected onto explicit images or video, is now a scourge, and deepfake child sex abuse material is also on the rise.

As Australian eSafety Commissioner Julie Inman Grant said in testimony to the country’s Senate last July:

“The harms caused by image-based abuse have been consistently reported. They include negative impacts on mental health and career prospects, as well as social withdrawal and interpersonal difficulties.”

She continued:

“Victim-survivors have also described how their experiences of image-based abuse radically disrupted their lives, altering their sense of self, identity and their relationships with their bodies and with others.”

The following month, politicians passed an amendment to the country’s Criminal Code that introduced new penalties for sharing such content.

However, politicians have also been a hindrance. The Liberal National Party in Queensland posted a nonsexual deepfake of the state’s premier, Steven Miles, in a negative political campaign.

MrDeepfakes was the largest deepfake site in the world, hosting at least 43,000 deepfake pictures of 3,800 people, most of whom were female musicians or actors. The site’s creators took it down early this month, citing data loss and stating that they would not be resurrecting it.

How to protect yourself

The National Cybersecurity Alliance offers advice on protecting yourself against deepfakes, and the Cyber Civil Rights Initiative offers resources for those who have been targeted.

If you’re in the UK, the Revenge Porn Helpline supports those targeted by intimate image abuse.

