California Deepfake Laws First in Country to Take Effect

Jan 20, 2020

Reading Time: 3 min

Deepfakes are images, audio recordings or videos that have been manipulated to yield fabricated sights and sounds that appear to be real. Some deepfakes are nearly indistinguishable from unedited media and may easily deceive viewers. Artificial intelligence is often used to create this content, which makes identification more difficult. Although some deepfakes are innocuous, many are specifically intended to harm or humiliate the person depicted and, in the wrong circumstances, may have lasting effects on public opinion (e.g., a deepfake of a politician released on the eve of an election).

Assembly Bill 602 – Deepfakes and Sexually Explicit Material

California Assembly Bill 602 (AB 602) creates a private cause of action against a person who either: (1) creates and intentionally discloses sexually explicit material where the person knows or reasonably should have known the depicted individual did not consent to the creation or disclosure; or (2) intentionally discloses sexually explicit material that the person did not create and the person knows that the depicted individual did not consent to the creation of the material. A “depicted individual” is an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in an altered depiction.

AB 602 exempts a person from liability if the material was disclosed in the course of a legal proceeding or in reporting unlawful activity, or if the material is a matter of legitimate public concern, a political or newsworthy work, or is otherwise protected by the California or U.S. Constitutions. A work is not newsworthy simply because the depicted individual is a public figure.

A successful plaintiff can recover: (1) either (a) economic and noneconomic damages proximately caused by the disclosure, including emotional distress, or (b) statutory damages of at least $1,500 but no more than $30,000, or, if the act was committed with malice, up to $150,000; (2) punitive damages; (3) attorney’s fees and costs; and (4) injunctive relief.

A plaintiff must bring suit within three years from the date the material was discovered or should have been discovered. The bill is not set to sunset.

Assembly Bill 730 – Deepfakes to Influence Political Campaigns

California Assembly Bill 730 (AB 730) amends the California Elections Code to prohibit a person, committee or other entity from distributing, with actual malice and within 60 days of an election, “materially deceptive audio or visual media” of a candidate on the ballot with the intent to injure the candidate’s reputation or to deceive a voter into voting for or against the candidate. “Materially deceptive audio or visual media” includes an intentionally manipulated image, audio or video recording of a candidate that would both (1) falsely appear to a reasonable person to be authentic and (2) cause a reasonable person to have a fundamentally different understanding or impression of the expressive content than that person would have if they were hearing or seeing the unaltered, original version of the content.

AB 730 exempts any media accompanied by a disclosure stating that the media has been manipulated. Media distributed for news-reporting purposes is similarly exempt, provided that those distributing the content acknowledge that its authenticity is in question or that it does not accurately represent the speech or conduct of the candidate. Media that constitutes satire or parody is also exempt.

An affected candidate may seek injunctive relief or general or special damages against the distributing party. The candidate bears the burden of establishing the violation by clear and convincing evidence. AB 730 is set to sunset on January 1, 2023.

Conclusion

It remains to be seen how plaintiffs will make use of their new rights. Both bills, but AB 730 in particular, have been criticized by free speech proponents and others concerned about the potential effects of enforcement. Critics contend that the failure to define several key terms, particularly in AB 730, could lead to misuse or, conversely, to a situation in which the exceptions swallow the rule. As the United States heads toward the 2020 election, what effect (if any) these laws, and AB 730 in particular, will have in helping the public identify deepfakes and in helping victims recover for the resulting harm remains an open question.
