Business, Finance & Tech
Meta Sues Company Over AI Apps Creating Fake Nude Images of Children and Teens
Meta is taking legal action to fight back against the growing threat of AI-powered apps that can turn everyday photos—often of children and teens—into realistic-looking fake nude images. In an exclusive interview with NBC’s Savannah Sellers, the company detailed its efforts to combat this emerging form of digital abuse.
Meta has filed a lawsuit against a company called Joy Timeline, which it says violated Meta’s advertising policies by promoting apps with captions like “erase any clothes on girls” and offering tools to generate non-consensual explicit images. According to Meta, these apps could manipulate ordinary photos—such as prom or graduation pictures—into lewd AI-generated content.
“This isn’t an admission that we’ve lost control,” a Meta spokesperson said. “This is a recognition that this is an adversarial space, and we’re going to keep fighting it.”
Despite Meta’s announcement, NBC News was still able to find dozens of similar ads circulating on Meta’s platforms. One victim, Alliston Berry, said a classmate used a “nudify” app to create a fake nude image of her. “It looked completely real,” she said. “To someone who doesn’t know me, you wouldn’t be able to tell it was fake.”
Meta says it is working not only through lawsuits but also by developing better detection technologies and collaborating with others in the tech industry to identify and shut down bad actors more efficiently.
“For families who feel this is too little, too late—we are on their side,” Meta stated. “We’re committed to making our platforms safer and taking real action.”
NBC News reached out to Joy Timeline but has not received a response.
By: NBC Palm Springs
June 13, 2025