
Parents, beware. Tech-savvy teenagers are turning to artificial intelligence to bully and embarrass their classmates and peers. According to the New York Post, an alarming trend is growing in high schools across the country, where some teens are using AI apps to create realistic nude photos of their classmates. To make matters worse, many of these bullies share the fake images around school, leaving victims distraught and unsettled.

“Nudify” Apps Are On The Rise

With just a casual photo — whether it’s a selfie, a group shot, or a snapshot pulled from social media — AI-powered “nudify” apps can generate disturbingly realistic nude images by digitally removing clothing. According to Gabb Now, the resulting images often appear eerily genuine, which adds to the danger. While some platforms offer these manipulations for free as a trial, many quickly shift to a paid model, requiring either a fee per image or a subscription.

Investigations into major nudification sites like Clothoff have revealed deceptive payment systems designed to obscure the nature of the transaction. This means a teen could potentially buy access to these services without alerting parents; purchases might appear on a credit card statement as “flowers” or “photo editing tutorials,” masking the app’s true intent, Gabb Now noted.

Lawmakers Are Taking Action To Protect Schools Nationwide

As the troubling trend of AI-generated explicit imagery grows, states across the U.S. are stepping up efforts to protect students. In Oregon, lawmakers have taken a unanimous stand against deepfake technology. According to the Oregon Capital Chronicle, the state’s House of Representatives voted 56-0 in favor of legislation that would expand its existing “revenge porn” law to ban the use of nudify apps and similar AI tools that generate sexually explicit content without consent.

If passed by the Senate, House Bill 2299 would make Oregon the 32nd state to criminalize the creation and distribution of manipulated nude or explicit images using AI or other digital tools, classifying the offense as a Class A misdemeanor. Offenders could face up to 364 days in county jail and a fine of up to $6,250.

Crook County District Attorney Kari Hathorn told the Oregon Capital Chronicle that such legislation is urgently needed, as there are currently no clearly defined laws holding nudify apps accountable. The prosecutor revealed that she recently received a report of an adult man in his 40s who created a fake nude photo of his ex-girlfriend’s 15-year-old daughter. According to Hathorn, the man shared the inappropriate image with boys who attended the teen’s school.

“Unfortunately, because the image was AI generated, neither law enforcement, nor the prosecutor’s office was able to take action,” the DA explained. “Technology continues to advance at a faster pace than the law.” 

In 2024, the San Francisco City Attorney’s Office took legal action against 16 websites accused of using AI to create fake nude images, targeting them for alleged violations of laws related to child exploitation and nonconsensual imagery, the Los Angeles Times reported. 

Meanwhile, in Minnesota, Democratic Senator Erin Maye Quade has introduced legislation aimed at curbing access to these sites within the state. Her proposed bill would require developers of nudify apps and websites to block Minnesota users or face steep civil fines — up to $500,000 per unlawful access, download, or use. If passed, the law would force developers to implement location-based restrictions to avoid penalties, per the Independent.
