Minnesota Legislation Targets AI Tools Creating Fake Nude Images

In response to increasing misuse of artificial intelligence, Minnesota legislators have passed a bill aimed at curbing platforms that facilitate this abuse. The Minnesota Senate unanimously approved House File 1606 on Thursday with a 65-0 vote, sending it to Governor Tim Walz for his signature. The legislation prohibits websites and applications from offering tools designed to generate realistic fake nude images of identifiable individuals.

Under the bill, operators of any website, app, or software service are barred from allowing users to access or use tools that create these images, and from generating such content on a user’s behalf. Advertising or promoting services with these capabilities is also prohibited.

Victims depicted in AI-generated nude imagery have the right to sue those controlling nudification tools, including websites and apps. Compensation claims can include damages for mental distress, with courts authorized to award up to triple the actual damages plus punitive damages, attorney fees, and injunctions against further violations.

The state’s attorney general is empowered by the legislation to enforce these regulations, imposing civil penalties of up to $500,000 per violation. These fines are allocated to the state’s general fund and subsequently directed towards victim support services for survivors of sexual assault, domestic violence, and child abuse.

The law targets tools that require minimal technical know-how and are readily accessible, even to minors; it takes effect on August 1 for new cases. While the bill names no specific AI developers, its passage follows high-profile incidents such as Elon Musk’s xAI tool, Grok, creating nude deepfakes of Taylor Swift in August 2025. The pop star preemptively trademarked her voice and likeness to deter future AI reproductions.

Musk faces additional legal challenges, including a federal class action lawsuit from three Tennessee minors alleging Grok generated nonconsensual sexual content using their images. Baltimore has also filed a consumer protection lawsuit against xAI, alleging the company knowingly deployed systems that disseminate sexually explicit content without consent, including content depicting minors.

Public Citizen co-president Robert Weissman highlighted how these tools have drastically lowered the barriers to producing nonconsensual intimate imagery, predominantly targeting women, over 90% of whom are under 18. He emphasized their role as instruments of intimidation and harassment with severe psychological impacts. “These apps overwhelmingly target women, most of them minors,” Weissman told Decrypt, noting the urgent need for government intervention.

Weissman also noted that state laws can complement federal enforcement efforts, since local authorities may act more swiftly on individual cases than federal agencies with competing priorities.

The Minnesota legislation arrives amid ongoing debates between President Donald Trump’s administration and the states over control of AI regulation. The Take It Down Act, signed into law by President Trump in May 2025, criminalizes distributing nonconsensual intimate images and provides victims a path for civil redress.

“Having complementary federal and state standards is positive in theory,” Weissman remarked to Decrypt. “We’re dealing with different enforcement mechanisms and agencies. A federal standard might exist without corresponding federal enforcement capacity.”

Governor Walz’s office has not yet responded to Decrypt’s request for comment.
