
California Bill on AI Generated Child Sex Abuse Images Introduced in State Legislature



A bill designed to combat the rise of child sexual abuse material (CSAM) generated by AI is currently making its way through the California state legislature. Known as Assembly Bill 1831, the legislation was introduced by Assemblymember Marc Berman (D-Menlo Park) in partnership with Ventura County District Attorney Erik Nasarenko. According to a news release, the bill “would address the escalating threat posed by artificial intelligence (AI) in the creation of lifelike, illicit content involving children.”

The bill is in response to a series of incidents involving artificially generated images of minors, including students sharing nude photos of their classmates at a middle school in Beverly Hills. Essentially, the victims’ faces were superimposed onto AI-generated bodies to create sexually explicit content. The incident, which occurred in February of this year, resulted in the expulsion of 5 students one month later.

California is just one of many states that are struggling with child sexual exploitation using artificial intelligence. In September 2023, attorneys general from all 50 states sent a letter to congressional leaders, urging them to “establish an expert commission to study the means and methods of AI that can be used to exploit children specifically.”

Photoshop and other editing tools have been used for many years to alter photos for nefarious purposes, like revenge porn. But the problem has become much worse in the past few years with advances in artificial intelligence. AI image-generation software is now easy to download, including from dark web sources, and simple enough for even young children to use. As a result, it’s not just adult child molesters who are creating and distributing these images. Kids are “nudifying” photos of their classmates and sharing them with their friends, who go on to share them with other friends, and so on. Once these images are distributed, they circulate over and over again, and there is no way to permanently remove them from the internet.

Though it’s uncertain what the federal government will do about this issue, DA Nasarenko hopes that California will be a leader in strong, comprehensive laws that target child sexual abuse using artificial intelligence. In a statement, Nasarenko said, “As technology evolves, so must our laws. This bill sends a clear message that our society will not tolerate the malicious use of artificial intelligence to produce harmful sexual content involving minors.”

We fully support any legislation that aims to combat AI-generated CSAM, but we are still left with the question of how current victims can obtain justice for the harm they’ve suffered. Seeking criminal prosecution is one option, but you may also have grounds to file a lawsuit against the parties that failed to protect you from sexual exploitation. For a free consultation with an AI-generated child pornography lawsuit attorney, contact the offices of DTLA Law Group.

Our Latest Verdicts and Settlements

$1.93 Million – Security Guard Assault
$2,287,495 – Lead Poisoning
$54 Million – Sexual Abuse
$22 Million – Gym Accident
$600,000 – Assault & Battery
$965,000 – Assaulted By Employee
$1,964,400 – Child Sex Abuse
$1,975,000 – Head Injury

California Laws on Child Sexual Abuse Imagery and Content

AB 1831 is not the first law in California targeting sexual content featuring children and going after the people who aid or facilitate the dissemination of such material. In October 2023, Governor Gavin Newsom signed into law Assembly Bill 1394, which punishes social media platforms and other web services for “knowingly facilitating, aiding, or abetting commercial sexual exploitation” of minors.

The law will take effect on January 1, 2025, and according to the governor’s office, websites – particularly social media platforms – that fail to remove such material will face liability. Additionally, web services can face legal penalties if they “deploy a system, design, feature, or affordance that is a substantial factor in causing minor users to be victims of commercial sexual exploitation.” This section of the law refers to technology used by these apps that can be exploited to create, upload, and share pornographic images of children. More specifically, it aims to punish apps and websites (TikTok, for example) that refuse to enforce rules or protective measures to prevent child abuse on their platforms.

The recently introduced Assembly Bill 1831 is another form of protection for children whose images on social media sites and other online platforms can be altered using AI technology. Given how widely children use Instagram, TikTok, and other social media, it’s easy to understand how AI-generated child pornographic images spread like wildfire.

How is AI-Generated Child Sexual Abuse Material Created?

Frankly, a lot of parents feel lost when it comes to online technology like artificial intelligence software, but it’s important to have a sense of how these applications are used and why they are so dangerous. The safety watchdog organization IWF (Internet Watch Foundation) conducted a month-long investigation into artificially generated child abuse material, which has accelerated rapidly in the past few years. In its report, the group warned that nude and sexually graphic images of kids altered by AI have flooded the internet.

Most of these images are based on photos of real kids, taken mostly from social media accounts, but adults can also be victimized by artificial intelligence used for the purpose of CSAM. For example, IWF found that the vast majority of these images were created with an artificial intelligence-based image generation tool called Stable Diffusion. Many of the users were taking photos of adults, “de-aging” them to look like children, and depicting them in sexual situations.

Stability AI, the owner of Stable Diffusion, has issued a public statement saying that the company “prohibits any misuse for illegal or immoral purposes across our platforms.” But this only points to the fact that such companies are well aware of how their products are being used. And with this knowledge, companies have a duty to monitor their systems on a regular basis and take all complaints of sexual abuse seriously. They must also comply with any law enforcement investigation into AI-generated child abuse content created with their software or internet tools.

Unfortunately, these companies are very slow to act when victims or their family members report that their images are being used as CSAM. Complaints regularly slip through the cracks, and victims are given the runaround with legal jargon on why the company has no duty to help them. In the meantime, child sexual predators and other individuals with bad intentions are allowed to continue their campaign of sexual abuse against countless innocent victims.

Can I Sue if My Images Were Used to Create AI Child Porn?

Yes, you can file a lawsuit if your image was used for the purpose of child sexual abuse imagery, like AI-generated nude photos and depictions of sex acts. The people who create and distribute this content try to argue that the victims (or their parents) are to blame for posting photos of themselves and their children on public accounts that anyone can access. Some of them also say that they are creating “artistic” imagery as opposed to child pornography, and that because the images are artificially created, no one was actually abused.

However, these arguments do not take into account that creating depictions of children in sexual situations is illegal in California, even if they were digitally altered through AI software. Furthermore, knowing that your likeness was used in this manner and that these photos and videos of you are out there forever, to be used time and time again, is extremely traumatizing. Criminals even use these photos as blackmail material, extorting money from victims by threatening to send the images to their family members and employers.

This is why it’s essential to uncover the root of how your images were used to generate CSAM using artificial intelligence technology. Along with the criminal that victimized you, you may have grounds to sue other parties, like a software developer, a social media platform, or a school district. These entities have a duty of care to protect minors from sexual abuse, but how they are liable and the degree of negligence that placed you in harm’s way are difficult concepts for the average person to wrestle with. That’s why our attorneys are here to help if you are a victim of AI-generated CSAM and other online child abuse material.

To schedule a free case review on your rights and legal options, contact the child sexual abuse lawyers of DTLA Law Group.

Average Value of an AI-Generated Child Sexual Abuse Lawsuit

Child sexual abuse lawsuits can settle for around $400,000 to $10,000,000 or more, depending on many factors that are specific to the victim and how the abuse affected their lives. Frankly, it’s impossible to come up with a singular case value that’s adequate for each individual who has been hurt by artificially generated nude photos and other sexual content made in their image.

Overall, the severity of emotional and physical harm will play the biggest role, and this alone can put the value of a child sexual exploitation lawsuit at $1,000,000 to $5,000,000. But negligence by the liable entities, like a school district or social media platform, also has a big impact on the settlement value of a child sex abuse lawsuit. Failing to take the necessary actions in spite of complaints from many victims, failing to take preventative measures against the dissemination of CSAM, and other examples of negligence are significant factors in why these cases are generally worth anywhere from 6 to 8 figures.

How Long Do These Cases Take to Settle?

Lawsuits for child sexual exploitation and assault often take 1 to 2 years to settle, and sometimes over 3 years, depending on factors like whether the case is tried in court. Overall, we would say that going to trial is not something most victims have to worry about. This happens in less than 5% of all child sex abuse lawsuits, so you can generally expect that we will reach a settlement without court intervention. But it can still take over 12 months to get to that point, as our goal is to bring you a fair amount of compensation based on the emotional trauma and monetary losses you sustained due to the defendant’s negligence. It’s possible that an acceptable settlement will be negotiated in 6 months or less, but decades of experience with these cases have shown us that a timeline of less than 1 year is rare when it comes to lawsuits for child sexual abuse.

What is the Deadline to Sue for AI Generated Child Pornographic Material?

A lawsuit for the sexual exploitation of a minor must be filed within 22 years from when the victim turns 18 years old (the age of adulthood in California) – in other words, by the victim’s 40th birthday. While this is a generous amount of time to come forward with allegations of child sexual abuse, there is another rule that can provide you with even more time to sue for sexual abuse via AI-generated images.

As children, victims often keep the secret of sexual abuse to themselves, or they are unable to process what has happened to them, whether from trauma or a lack of knowledge about sexual relations. This results in the victim suppressing these awful memories and moving forward as if nothing happened. But they only move forward in a superficial sense, while the damage from sexual assault or abuse festers deep inside. Over time, the victim may develop mental health issues that take a significant toll on their life, yet be unable to see that this is the result of being sexually abused during their childhood.

Once a victim is able to make the connection, they can finally begin to deal with the psychological injuries they sustained from child sexual abuse. Under California’s discovery rule, those who were sexually assaulted or exploited during childhood have 5 years from the discovery of injury or illness to file a lawsuit against the responsible party.

Contact an AI Generated CSAM Lawsuit Attorney

Anger, shame, frustration, anxiety – these are just some of the emotions you may feel as someone who was exploited for the purpose of sexually explicit content. Victims can struggle with these emotions for the rest of their lives, and it’s unconscionable that the people who are responsible for their pain and suffering are left unpunished.

Our primary goal is to help you understand your rights and decide on the best way to move forward. Filing a lawsuit is completely up to you, but it’s essential to know about the legal actions that are available to you when you have been violated in a sexual manner.

Should you decide to go ahead with a lawsuit, you won’t have to worry about how much it will cost you to hire an AI-generated child sexual abuse material lawsuit lawyer. The cost of representing you is billed to the individual or entity that’s being sued, so they will cover our expenses as a part of your settlement award. On top of that, our law firm has a Zero Fee Guarantee policy, so you don’t pay any legal fees if we fail to win your lawsuit.

We look forward to speaking with you during a free case evaluation, so please contact us as soon as possible.


