Civitai, an AI model sharing site backed by Andreessen Horowitz (a16z) that 404 Media has repeatedly shown is being used to generate nonconsensual adult content, is banning AI models designed to generate the likeness of real people, the site announced Friday.
The policy change, which Civitai attributes in part to new AI regulations in the U.S. and Europe, is the most recent in a flurry of updates Civitai has made under increased pressure from payment processing service providers and 404 Media’s reporting. This recent change will, at least temporarily, significantly hamper the ecosystem for creating nonconsensual AI-generated porn.
“We are removing models and images depicting real-world individuals from the platform. These resources and images will be available to the uploader for a short period of time before being removed,” Civitai said in its announcement. “This change is a requirement to continue conversations with specialist payment partners and has to be completed this week to prepare for their service.”
Earlier this month, Civitai updated its policies to ban certain types of adult content and introduced further restrictions around content depicting the likeness of real people in order to comply with requests from an unnamed payment processing service provider. This attempt to appease the payment processing service provider ultimately failed. On May 20, Civitai announced that the provider cut off the site, which currently can’t process credit card payments, though it says it will get a new provider soon.
“We know this will be frustrating for many creators and users. We’ve spoken at length about the value of likeness content, and this decision wasn’t made lightly,” Civitai’s statement about banning content depicting the likeness of real people said. “But we’re now facing an increasingly strict regulatory landscape - one evolving rapidly across multiple countries.”
The announcement specifically cites President Donald Trump’s recent signing of the Take It Down Act, which criminalizes and holds platforms liable for nonconsensual AI-generated adult content, and the EU AI Act, a comprehensive piece of AI regulation that was enacted last year.
Do you know other sites that allow people to share models of real people? I would love to hear from you. Using a non-work device, you can message me securely on Signal at (609) 678-3204. Otherwise, send me an email at emanuel@404media.co.
As I’ve reported since 2023, Civitai’s policies against nonconsensual adult content did little to diminish the site’s crucial role in the AI-generated nonconsensual content ecosystem. Civitai’s policy allowed people to upload custom AI image generation models (LoRAs, checkpoints, etc.) designed to recreate the likeness of real people. These models were mostly of huge movie stars and minor internet celebrities, but, as our reporting has shown, also of completely random, private people. Civitai also allowed users to share custom AI image generation models designed to depict extremely specific and graphic sex acts and fetishes, but it always banned users from producing nonconsensual nudity or porn on the platform.
However, by embedding myself in huge online spaces dedicated to creating and sharing nonconsensual content, I saw how easily people put these two types of models together. Civitai users couldn’t generate and share that content on Civitai itself, but they could download the models, combine them, generate nonconsensual porn of real people locally on their machines or on various cloud computing services, and post it to porn sites, Telegram, and social media. I’ve seen people in these spaces explain over and over again how easy it was to create nonconsensual porn of YouTubers, Twitch streamers, or barely known Instagram users, linking to the models hosted on Civitai as they did so.
One Telegram channel dedicated to AI-generating nonconsensual porn reacted to Civitai’s announcement with several users encouraging others to grab as many AI models of real people as they could before Civitai removed them. In the channel, users complained that some of these models had already been removed, and my searches of the site showed the same.
“The removal of those models really affect me [sic],” one prolific creator of nonconsensual content in the Telegram channel said.
When Civitai first announced that it was being pressured by its payment processing service provider, several users started an archiving project to save all the models on the site before they were removed. A Discord server dedicated to this project now has over 100 members, but it appears Civitai has made many models inaccessible sooner than these users anticipated. One member of the archiving project said that there “are many thousands such models which cannot be backed up.”
Unfortunately, while Civitai’s recent policy changes and especially its removal of AI models of real people for now appears to have impacted people who make nonconsensual AI-generated porn, it’s unlikely that the change will slow them down for long. The people who originally created the models can always upload them to other sites, including some that have already positioned themselves as Civitai competitors.
It’s also unclear how Civitai intends to keep users from uploading AI models designed to generate the likeness of real people who are not well-known celebrities, as automated systems would not be able to detect these models.
Civitai did not immediately respond to a request for comment.
Data from a license plate-scanning tool that is primarily marketed as a surveillance solution for small towns to combat crimes like carjackings and to find missing people is being used by ICE, according to data reviewed by 404 Media. Local police around the country are performing lookups in Flock’s AI-powered automatic license plate reader (ALPR) system for “immigration”-related searches and as part of other ICE investigations, giving federal law enforcement side-door access to a tool it currently has no formal contract for.
The massive trove of lookup data was obtained by researchers who asked to remain anonymous to avoid potential retaliation and shared with 404 Media. It shows more than 4,000 nationwide and statewide lookups by local and state police, done either at the behest of the federal government, as an “informal” favor to federal law enforcement, or with a potential immigration focus, according to statements from police departments and sheriff’s offices collected by 404 Media. It shows that, while Flock does not have a contract with ICE, the agency sources data from Flock’s cameras by making requests to local law enforcement. The data reviewed by 404 Media was obtained through a public records request to the Danville, Illinois Police Department, and shows Flock search logs from police departments around the country.
As part of a Flock search, police have to provide a “reason” they are performing the lookup. In the “reason” field for searches of Danville’s cameras, officers from across the U.S. wrote “immigration,” “ICE,” “ICE+ERO,” which is ICE’s Enforcement and Removal Operations, the section that focuses on deportations; “illegal immigration,” “ICE WARRANT,” and other immigration-related reasons. Although lookups mentioning ICE occurred across both the Biden and Trump administrations, all of the lookups that explicitly list “immigration” as their reason were made after Trump was inaugurated, according to the data.
Do you know anything else about Flock? We would love to hear from you. Using a non-work device, you can message Jason securely on Signal at jason.404 and Joseph at joseph.404
“Different law enforcement systems serve different purposes and might be more appropriate for one agency or another. There should be public conversations about what we want different agencies to be able to do,” Jay Stanley, senior policy analyst at the ACLU’s Speech, Privacy, and Technology Project, told 404 Media. “I assume there’s a fair number of community residents who accept giving police the power to deploy license plate readers to catch a bank robber, who would absolutely gag on the idea that their community’s cameras have become part of a nationwide ICE surveillance infrastructure. And yet if this kind of informal backdoor access to surveillance devices is allowed, then there’s functionally no limits to what systems ICE can tap into with no public oversight or control into what they are tapping into.”
A screenshot of the data.
Flock says its ALPR cameras are “trusted by more than 5,000 communities across the country.” These cameras continuously record the plates, color, and brand of vehicles passing in front of them. Law enforcement can then perform searches to see where exactly a vehicle, and by extension person, was at a certain time or map out their movements across a wide date range. Flock is also developing a new product called Nova which will supplement that ALPR data with people lookup tools, data brokers, and data breaches to “jump from LPR [license plate reader] to person,” 404 Media previously revealed. Law enforcement typically do these lookups without a warrant or court order, something which an ongoing lawsuit argues is unconstitutional.
Law enforcement agencies are able to search their own Flock cameras, but also those in other states or even nationwide. A Flock user guide says that national lookups allow “all law enforcement agencies across the country” who are also opted into that setting to search a user’s cameras.
That user guide also says that users can “run a Network Audit to see who has searched your network from any agency in the Flock system.”
The researchers used a public records request to obtain the Danville Police Department’s Network Audit. Because Flock allows police departments to share their cameras’ records across nationwide and statewide networks of law enforcement agencies, the audit shows whenever Danville’s camera records were searched by police departments around the country.
The data used to report this story shows in real numbers how expansive Flock’s nationwide network of cameras has become. When the Dallas Police Department, for example, performed a series of searches for “ICE+ERO” on March 6, the department wasn’t just searching its own cameras, it was searching 6,674 different individual Flock camera networks composed of 77,771 total devices, the data says. (The Dallas Police Department declined to comment on its searches).
Searches across Danville’s Flock cameras came from other agencies in Illinois, such as the Chicago Police Department. The data also includes state and local law enforcement agencies from all over the country, such as sheriff offices and police departments in Florida, Arkansas, Louisiana, South Carolina, Virginia, Arizona, and Texas. The Florida Highway Patrol and Missouri State Highway Patrol are also included in the data. The network audit stretches from June 1, 2024 to May 5, 2025 and contains millions of total searches. The researchers then narrowed that data to the more than 4,000 searches that contained immigration keywords in the “reason” field.
“I can't speak for the company as a whole, but I was unaware that Flock's tools were being used by local departments in collaboration with ICE. I'm disappointed, but not surprised,” a Flock source said. 404 Media granted the source anonymity as they were not permitted to speak to the press. “It's really important that people understand how this tech—which they pay for with tax dollars—is used, since ultimately it's up to state and local governments to draw the boundaries of fair use by law enforcement.”
Image from Flock's media kit.
There are some caveats with the data. Many of the entries list the lookup reason as HSI, ICE’s Homeland Security Investigations division, which has a broad criminal investigative mandate beyond immigration enforcement, meaning that the police were helping a division of ICE but may not have been using Flock specifically for immigration enforcement. Some law enforcement agencies told 404 Media they are not engaging in immigration enforcement despite the reason for the Flock lookup saying “immigration.”
A Missouri State Highway Patrol spokesperson told 404 Media that although the listed reason for using Flock was “immigration,” the lookup “was related to a traffic stop with indicators of possible human trafficking.” The spokesperson added, “We are in the process of obtaining the training and creating the applicable policies” for immigration enforcement. Other agencies that listed “immigration” as the reason for the lookup did not respond to a request for comment.
The Trump administration has made a point of encouraging state and local police departments, which do not normally have authority to enforce immigration laws, to apply for a program called 287(g), which allows ICE to “delegate” the enforcement of immigration laws to local police. A January executive order issued by Trump instructs DHS and ICE “to authorize State and local law enforcement officials, as the Secretary of Homeland Security determines are qualified and appropriate, to perform the functions of immigration officers in relation to the investigation, apprehension, or detention of aliens in the United States.”
It is particularly notable that the data in question came from an Illinois police department, because Illinois is one of the few states that specifically bans the use of ALPR data for immigration enforcement. Illinois-based police departments that ran searches shown in the data insisted that the searches were for criminal cases or were not specifically for immigration enforcement purposes.
“The chart [data] provided does not indicate that Danville PD is searching Flock LPR data or acting for another municipal, county, or state LE agency, nor ICE regarding immigration,” Danville’s police chief Chris Yates told 404 Media. “As required by the State of Illinois we ensure that we will not use LPR data or enforce a law or relate a person’s immigration status.” Yates did not respond to follow up questions about why the Flock audit showed searches for immigration-related reasons from other agencies around the country.
“Long-story-short, what is being alleged is not happening,” Danville’s mayor, Rickey Williams Jr., added.
A screenshot of the data.
But Danville’s own data shows that these searches by other police departments are in fact happening, and 404 Media confirmed the details of several searches with the departments that performed them. The police departments we got details from said that sometimes searches for federal agencies are “informal,” and sometimes they are part of a specific investigation. What is clear, however, is that ICE and HSI have gained side-door access to a tool they do not formally have access to.
Andrew Perley, the deputy chief of the Village of Glencoe, Illinois police department, told 404 Media that a specific search “was not related to an investigation involving immigration status. The inquiry was an informal request from Homeland Security Investigations into a criminal matter aside from immigration.” Ryan Glew of Evanston, Illinois police department, told 404 Media that one of their specific searches was because “We were assisting Homeland Security in the apprehension of a wanted subject. The subject was part of a nationwide retail theft ring that was responsible for millions of dollars from stores across the country. The queries were not immigration-related.”
Other police departments in Illinois we spoke to said that some of the searches were done to “assist” federal law enforcement, or that the searches were done by one of their “task force officers,” who are local police embedded with federal units. Mike Yott, the police chief of Palos Heights, Illinois, said that, due to Illinois law, his department does not do immigration enforcement. But he said he did not know what a search performed by one of his department’s task force officers embedded with the Drug Enforcement Administration was for, even though it read “immigration violation.”
“Based on the limited information on the report, the coding/wording may be poor and the use of Flock may be part of a narcotics investigation or a fugitive status warrant, which does on occasion involve people with various immigration statuses,” Yott said.
The fact that police almost never get a warrant to perform a Flock search means there is little oversight of its use, which makes it easy for local police to formally or informally help the feds by doing lookups.
A screenshot of the data.
“Law enforcement really likes license plate readers because of the lack of restrictions on that data. They don’t feel like they need a warrant. Oftentimes there are no restrictions whatsoever on what they search,” Dave Maass, who studies border technology at the Electronic Frontier Foundation, told 404 Media. “It might be totally true that some of these searches are for people who have warrants or who are wanted for criminal activity. They might be looking for a terrorist, who knows. But that’s kind of the point—we don’t know.”
Flock said in a statement that “We are committed to ensuring every customer can leverage technology in a way that reflects their values, and support democratically-authorized governing bodies to determine what that means for their community.”
“All Flock customers own and control 100% of the data collected by their Flock systems and choose who to share data with. The tools are fully auditable, indefinitely saving usage reports so command staff or city leadership has full insight into the use of the products. The network audit logs are an example of this auditing-by-design approach,” the statement continued. The company said its tools have helped law enforcement locate more than 1,000 missing persons.
“We work with local governments across the country to adopt best practices on LPR policies, including robust auditing requirements. Flock’s platform requires double opt-in for agencies to share data amongst each other—we recommend every agency adopt a strong LPR policy, conduct regular audits, and be thoughtful about how and with whom they share data,” the statement continued.
“What is incredibly frustrating is that Flock in particular in Illinois marketed themselves to a bunch of communities in the suburbs and in Central Illinois as a device that would be critical to combatting an uptick in crime, violent crime, gun violence. But this is really a national system of data once you start collecting this, whether it’s Bloomington or Springfield or Danville, you start looping together those networks,” Edwin Yohnka, director of communications and public policy for ACLU Illinois, told 404 Media. “So it is incredibly troubling to see this list of places from around the country who are performing these searches of Illinois cameras.”
DHS did not respond to multiple requests for comment.
Business owners hate employees more than anything. Unfortunately, AI’s not very good at doing human jobs. We noted last month how chatbots hadn’t knocked down wages or hours in any occupation.
Orgvue surveyed 1,000 executives. 39% of them had fired employees in favor of AI — but of those 39%, over half (55%) regretted it. The AI boosters are crippling their own businesses. [Orgvue]
Credit company Klarna tried to go full AI, including customer service by chatbot. The CEO had said he wanted Klarna to be OpenAI’s “favourite guinea pig.” This went so badly they’re now on a hiring spree to restaff with humans. [Bloomberg, archive]
Duolingo declared it was going as AI as possible earlier this month and now it’s feeling the backlash. CEO Luis von Ahn has tried to provide “clarity”: [LinkedIn]
I do not see AI as replacing what our employees do (we are in fact continuing to hire at the same speed as before). I see it as a tool to accelerate what we do, at the same or better level of quality.
Nobody believes him. Von Ahn also said that AI is better than school teachers, but you still need the teachers for child care. [Fortune]
Replacing programmers with AI coding isn’t working out so well. I’m hearing stories of consultant programmers being called in to quietly rewrite vibe code disasters that were the CEO’s personal pet project, because the code cannot be fixed in place.
If you get the call, charge as much as you can. Then charge more.