Deepfake ‘Nudify’ Technology Is Getting Darker—and More Dangerous

By News Room · 5 Min Read
Last updated: 26 January 2026 18:21

Open the website of one explicit deepfake generator and you’ll be presented with a menu of horrors. With just a couple of clicks, it offers to convert a single photo into an eight-second explicit video clip, inserting women into realistic-looking graphic sexual situations. “Transform any photo into a nude version with our advanced AI technology,” text on the website says.

The options for potential abuse are extensive. Among the 65 video “templates” on the website is a range of “undressing” videos in which the women being depicted remove clothing—but there are also explicit video scenes named “fuck machine deepthroat” and various “semen” videos. Each video costs a small fee to generate; adding AI-generated audio costs more.

The website, which WIRED is not naming to limit further exposure, includes warnings saying people should upload only photos they have consent to transform with AI. It is unclear whether any checks enforce this.

Grok, the chatbot created by Elon Musk’s companies, has been used to create thousands of nonconsensual “undressing” or “nudify” bikini images—further industrializing and normalizing the process of digital sexual harassment. But it is only the most visible such tool—and far from the most explicit. For years, a deepfake ecosystem comprising dozens of websites, bots, and apps has been growing, making it easier than ever to automate image-based sexual abuse, including the creation of child sexual abuse material (CSAM). This “nudify” ecosystem, and the harm it causes to women and girls, is likely more sophisticated than many people realize.

“It’s no longer a very crude synthetic strip,” says Henry Ajder, a deepfake expert who has tracked the technology for more than half a decade. “We’re talking about a much higher degree of realism of what’s actually generated, but also a much broader range of functionality.” Combined, the services are likely making millions of dollars per year. “It’s a societal scourge, and it’s one of the worst, darkest parts of this AI revolution and synthetic media revolution that we’re seeing,” he says.

Over the past year, WIRED has tracked how multiple explicit deepfake services have introduced new functionality and rapidly expanded to offer harmful video creation. Image-to-video models now typically need only one photo to generate a short clip. A WIRED review of more than 50 “deepfake” websites, which likely receive millions of views per month, shows that nearly all of them now offer explicit, high-quality video generation and often list dozens of sexual scenarios into which women can be inserted.

Meanwhile, on Telegram, dozens of sexual deepfake channels and bots have regularly released new features and software updates, such as different sexual poses and positions. For instance, in June last year, one deepfake service promoted a “sex-mode,” advertising it alongside the message: “Try different clothes, your favorite poses, age, and other settings.” Another posted that “more styles” of images and videos would be coming soon and users could “create exactly what you envision with your own descriptions” using custom prompts to AI systems.

“It’s not just, ‘You want to undress someone.’ It’s like, ‘Here are all these different fantasy versions of it.’ It’s the different poses. It’s the different sexual positions,” says independent analyst Santiago Lakatos, who, along with the media outlet Indicator, has researched how “nudify” services often rely on big technology companies’ infrastructure and have likely made significant money in the process. “There’s versions where you can make someone [appear] pregnant,” Lakatos says.

A WIRED review found that more than 1.4 million accounts had signed up for 39 deepfake creation bots and channels on Telegram. After WIRED asked Telegram about the services, the company removed at least 32 of the deepfake tools. “Nonconsensual pornography—including deepfakes and the tools used to create them—is strictly prohibited under Telegram’s terms of service,” a Telegram spokesperson says, adding that the company removes such content when it is detected and that it removed 44 million pieces of policy-violating content last year.
