The generative AI market is growing rapidly.
However, AI's expansion of creative expression conflicts with intellectual property rights and the right to control one's persona. For example, the AI-generated song "Heart on My Sleeve," posted by an anonymous TikTok user, featured voices that mimicked Drake and The Weeknd without their authorization. The video of the song was viewed millions of times, and the song was even submitted for Grammy Award consideration.
The Right of Publicity
As social media use has exploded and the power of influencers and celebrity endorsements has grown exponentially, protecting the right to control the use of one's persona is more important than ever.
Currently, the right of publicity generally protects individuals' rights to control the "commercial" use of their names, images, and likenesses, and in some states, other aspects of their persona, such as voices or signatures.
The right of publicity is governed by state law. Only 38 states recognize the right of publicity (some by statute, some by common law, and some by both), and only 20 states recognize a post-mortem right of publicity. These state laws often conflict; as a result, whether an individual's persona is protected at all, and the scope of any such protection, depend in large part on where the individual resides.
The right to control one's persona does not trump the First Amendment. Therefore, right of publicity cases often turn on whether the particular use was commercial in nature or not, and each state that recognizes the right of publicity has its own test to balance individuals' rights to control the use of their personas with the First Amendment right to free expression.
First Amendment defenses have been successful in certain cases where a third party used an individual's persona in connection with entertainment programs, news, public affairs, sports broadcasts, political campaigns, or in connection with parodies. For example, Cardtoons, L.C. v. Major League Baseball Players Ass'n held that trading cards featuring caricatures of Major League Baseball players were parodies protected by the First Amendment.
Moreover, First Amendment defenses have also been accepted where significant transformative components have been added to the use of the persona. Winter v. DC Comics determined that comic book characters named Johnny and Edgar Autumn, which evoked the musicians Johnny and Edgar Winter, did not violate the musicians' rights of publicity where the images were distorted cartoon characters that were half-human, half-worm. By contrast, No Doubt v. Activision Publ'g, Inc. rejected a video game manufacturer's First Amendment defense against the band No Doubt's right of publicity claim where the manufacturer used "computer-generated recreations of the real band members, painstakingly designed to mimic their likenesses," in its video game, and the game's creative elements were nothing more than conventional images of the band members.
No Fakes Act
Given that right of publicity laws vary from state to state (with some states not even recognizing such a right), there are obvious gaps in protection. These gaps will become more pronounced when applied to new technologies like AI.
In an effort to "protect the voice and visual likeness of all individuals from unauthorized recreations from generative AI," Senators Chris Coons, Marsha Blackburn, Amy Klobuchar, and Thom Tillis introduced a discussion draft of the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act of 2023.
Specifically, if enacted, the No Fakes Act would:
- Create a post-mortem right of publicity that would protect one's image, voice, and visual likeness for 70 years after death;
- Prohibit the production of digital replicas or computer-generated electronic representations that are nearly indistinguishable from the actual voice or visual likeness of individuals without their consent;
- Prohibit the distribution, publication, or transmission of digital replicas that one knows are unauthorized; and
- Create civil liability for violations equal to the greater of $5,000 per violation or the damages suffered by the injured party, with punitive damages and attorneys' fees available for willful violations.
Much like the affirmative defenses under state law based on First Amendment grounds, the No Fakes Act does not extend to digital replicas that are: (i) used as part of a news, public affairs, or sports report; (ii) used in a documentary or historical work, if the digital replica of the applicable individual is used as a depiction of that actual individual; (iii) used for purposes of comment, criticism, scholarship, satire, or parody; (iv) used in an advertisement or commercial announcement for the purposes in (i)-(iii) above; or (v) de minimis or incidental.
As more content is created that includes AI-generated images, voices, and/or likenesses (particularly of individuals who were not involved in, and did not consent to, such inclusion), a push for a more robust set of rules governing the right of publicity is inevitable.
The No Fakes Act would create a uniform law to protect individuals' rights in their persona from unauthorized use in digital replicas and would fill in some of the gaps where state laws fail to address the use of generative AI. Notably, unlike the various state laws, the Act does not specifically require that the unauthorized use be commercial in nature. As such, if the Act ultimately becomes law, there will likely be First Amendment considerations that will be litigated, and the constitutionality of the law may be challenged.
Originally published by
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
Tel: 973 597 2500
Fax: 973 597 2400
E-mail: rcolon@lowenstein.com
URL: www.lowenstein.com
© Mondaq Ltd, 2024 - Tel. +44 (0)20 8544 8300 - http://www.mondaq.com, source