Reporter turns tables, exposes disturbing vid

Popular NRL presenter Tiffany Salmond has taken aim at online trolls after fresh sexually explicit deepfake images began to circulate online.
Salmond first spoke out about the disturbing AI attacks in May, when she revealed a bikini snap of hers had been doctored.
Now the 27-year-old has hit out again after fresh images posted to her Instagram were used to depict her performing sexually explicit acts.
Not only did she take aim at those creating the images, she also called out people questioning whether she had made the deepfakes herself in an attempt to capture the spotlight.
“Earlier this year, I spoke out about being the target of explicit AI-generated deepfakes – fake sexual imagery made from real photos of me, without my consent,” she posted.
“But what I haven’t spoken about until now is the doubt and suspicion that’s followed.
“Over the past few months, I’ve seen a continued narrative online accusing me of faking the story for attention.
“Claiming the deepfakes were never real, and that I made the whole thing up.
“I stopped talking about this months ago because I’d already said what I needed to. But that doesn’t mean it stopped.
“For anyone who assumed it was a one-off, it wasn’t. The deepfakes have continued to be made and posted.
“The most recent deepfake (and the most explicit one yet) was shared online just three weeks ago.”
Salmond said one of the main issues behind the disturbing online videos was the lack of understanding from people who aren’t directly affected by the issue.
“After seeing people suggest I made this whole thing up – while literally watching new deepfakes of me still being created, I realised something,” she said.
“Most people don’t actually understand what a deepfake is, or how common they’ve become. Especially when it comes to explicit ones made of real people.
“It’s still just a buzzword to most. And unless you’ve made them, shared them, or had them made of you … it’s just a concept, or a theoretical discussion.
“But that gap in understanding is part of the problem.
“When people don’t see it for themselves, they dismiss it and don’t take it seriously, or assume it’s too far-fetched to be real.”
The New Zealand presenter decided to share the posts to her own Instagram account to show just how damaging they can be.
She took the extra step of censoring the explicit parts of the posts, while still showing just how alarming the deepfakes can be.
“You can’t solve a problem you don’t even see,” she wrote.
“So I’m going to share a few censored versions of the deepfakes that were created using my photos.
“I know it’s unorthodox. But if we’re serious about protecting women from this new, easily accessible technology, then people need to understand what it actually is.
“And to truly understand, you need to see it for yourself.
“The original shared online was full, AI-generated nudity. There have been so many created and shared over the past few months, that I’ve lost count.”
Salmond said her light bulb moment about the whole ordeal came when an AI-generated video of one of her proudest moments appeared online.
The image she posted on her account showed her holding up a copy of the Sunday Telegraph after she had spoken out about the deepfake images.
But the image was used in yet another sexually explicit way which proved to Salmond that the trolls simply wanted to strip her of her power.
“The moment I truly knew I was right, that this was always about power, was when they deepfaked that photo,” she wrote.
“I had to cover my face in that one, they used AI to fake my body, but it was the face that felt most violating.
“It was still recognisably mine, but twisted. Almost demonic. There’s something even more disturbing in that than even the nudity.
“But despite how unsettling that image was … they exposed themselves.
“It wasn’t random, they chose that specific photo because they felt threatened. By a woman who was confident, celebrated and platformed.
“It was retaliation. There was nothing sexual about that photo. But it was symbolic, and the only power move they had left.
“But they didn’t just expose themselves to me. They exposed themselves to each other.
“Because once I spoke out about what the intent behind them really was, it became impossible for the other men in these online spaces to ignore the pattern.
“The pattern that every time I’ve spoken out this year, every time I’ve been platformed, or made a power move – there’s a targeted spike in retaliation.
“And eventually, the other men started seeing it too.”
Salmond had become a fan favourite in recent years, largely thanks to her sideline reporting on New Zealand Warriors games as part of Fox League’s coverage.
She has since moved to Australia but has not yet appeared on an NRL TV broadcast in 2025, prompting many fans to ask when she’ll be returning to their screens.
In an interview with The Sydney Morning Herald about why she has been unable to land a role this season, Salmond said women are frowned upon if they are deemed “too sexy”.
“I can only speculate why no one has made the logical decision to hire me. Rugby league media is very conservative when it comes to female representation,” she said.
“The men are allowed to be edgy, loud, have huge personalities and take up space. But women have to toe the line of being attractive – but not too sexy. Knowledgeable – but not enough to outshine your male counterparts.”
Salmond said her presence had become too large and that she was beginning to outgrow her sideline reporting role.
In a post to her Instagram account, Salmond said she had been asked to be part of Triple M’s team for the 2025 season and was “encouraged to move to Sydney”.
Salmond’s ordeal isn’t the first to strike the rugby league world. NRLW star Jaime Chapman went public in early May after becoming the victim of a deepfake AI attack.
The Gold Coast Titans winger hit out after seeing doctored images of herself swirling around cyberspace, prompting a police investigation.
The 23-year-old told her 86,000 Instagram followers it was not the first time images she had shared on social media had been distorted through deepfake AI programs.
Her Instagram post showed a photo of a beach taken from a high-rise, with an inset image, believed to be the fake AI image, of herself posing for a mirror selfie in a gold bikini.
It was a public plea for whoever was responsible to stop.
“Have a good day to everyone except those who make fake ai photos of other people,” she posted.
She also wrote: “Next time think of how damaging this can be to someone and their loved ones.
“This has happened a few times now and it needs to stop.”
Alongside the bikini pic Chapman wrote: “AI is scary these days”.
What is deepfake AI?
Deepfake AI videos are synthetic media created using artificial intelligence, particularly a branch of machine learning called deep learning.
These videos convincingly alter or generate footage to make it appear that someone is saying or doing something they never actually did.
This is typically achieved by training algorithms on large amounts of video and audio data of a person, allowing the AI to mimic their facial expressions, voice, and mannerisms with startling realism.
While the technology has legitimate uses — such as in film production, gaming, or even education — deepfakes are more commonly known for their misuse.
They’ve been used to spread misinformation, impersonate public figures, and create non-consensual explicit content, particularly targeting women.
Australian school students are facing the terrifying threat of having disturbing fake nude photographs bearing their faces circulated online.
Cybersafety expert and former police officer Susan McLean said creating sexually explicit, AI-generated images is now as easy as uploading a clothed photo of someone and choosing a pose for an app to spit out a pornographic image, and she warned it will only become more of a problem.
Sending real or fake nude images of people under the age of 18 is a crime, but the harsh reality is, “there is nothing any person can do to protect themselves from this,” Ms McLean told news.com.au.
“You have to hope that someone doesn’t choose you to become a victim; you have to hope that the offender doesn’t offend,” she said.
Ms McLean said instead of trying to fight the apps that create the content, the focus needs to be on educating young men on respectful and lawful behaviour.
“Victims of this crime need a lot of support and consistent and ongoing support. And it is never their fault whatsoever,” Ms McLean said.
“We’ve got to do something to change the mindset of young men who believe this is a good idea to do.”