Newly Proposed Federal Legislation to Curtail the Rise of Deep Fakes and What It Means for Businesses and Individuals
September 3, 2024

In response to the rise of deep fakes, the NO FAKES Act of 2024 has been proposed, aiming to establish a federal framework to protect against the unauthorized use of digital replicas. As deep fakes evolve, they pose significant risks to personal privacy and business reputation. In this article, we explore the implications of the NO FAKES Act for both businesses and individuals, highlighting its potential impact on the digital space.
What is a Deep Fake?
Deep fakes are fabricated videos or images generated by a form of artificial intelligence called “deep learning,” which uses specialized algorithms to create realistic fake imagery. The technique can portray a person or event so convincingly that viewers often cannot tell the content is computer-generated.
What’s the Harm?
Deep fakes may deceive the public by depicting a celebrity, politician, business, or individual doing or saying something unsavory; creating sexually explicit videos and images; facilitating fraudulent activities; and spreading misinformation that undermines the political system and news reporting. Deep fakes are becoming increasingly common on social media platforms, websites, and other electronic platforms.
Some recent examples of deep fakes include explicit fake images of Taylor Swift posted on the social media platform “X” that were viewed millions of times before being removed from the site, and reports of deep fake phone calls claiming to be from President Joe Biden, Florida Governor Ron DeSantis, or various private-sector CEOs. There have also been reports of cyber criminals creating deep fake videos of private citizens asking a loved one for money, or fake videos of a business’s CFO instructing staff to transfer company money. In one instance, a deep fake video of a CFO caused a finance worker to pay over $25 million to fraudsters in China.
Are There Currently Laws in Place to Address Deep Fakes?
Existing legal remedies are limited and provide only incomplete protection. State statutes and common law protect the rights of privacy and publicity and protect individuals against false, defamatory content. At last count, 14 states have passed laws addressing nonconsensual deep fakes of a sexual nature, and 10 states have passed laws limiting the use of deep fakes in political campaigns. Additionally, earlier this year Tennessee passed a broader deep fake law not limited to sexually or politically related deep fakes, commonly referred to as the ELVIS Act (TN HB 2091).
However, there is no comprehensive federal framework for dealing with deep fakes. While some limited protection might be available in certain circumstances under the Copyright Act, the Federal Trade Commission Act, the Lanham Act (the main federal law concerning trademarks), and the Communications Act, none of these laws was written to address deep fakes specifically or the unique issues they raise.
The Proposed Federal NO FAKES Act of 2024
The NO FAKES Act was initially introduced in Congress in 2023 and was reintroduced in July of this year, though it has not yet become law. The purpose of the proposed Act is to provide a single national framework for protecting intellectual property rights in the voice and visual likeness of all individuals. On the same day the NO FAKES Act of 2024 was reintroduced, the Copyright Office issued a report on the problems posed by digital replicas, along with recommendations for a new federal law specifically addressing deep fakes. The Copyright Office report recommended prompt legislative action and is significantly, though not entirely, aligned with the proposed NO FAKES Act.
- Digital Replication Rights
The NO FAKES Act would provide that all individuals have a descendible and transferable property right to authorize the use of the individual’s voice or visual likeness in a digital replica. This right cannot be assigned during the lifetime of the individual, but it may be licensed.
- Duration of Digital Replication Rights
In general, the proposed Act provides that digital replication rights are valid for the lifetime of the individual and survive the individual’s death. Upon the death of the individual, the rights transfer to the individual’s executors, heirs, assigns, licensees, or devisees. The rights remain valid for at least ten (10) years following the individual’s death and can be extended, in five (5) year increments, for up to seventy (70) years after the individual’s death if the voice or visual likeness of the portrayed individual is used during the two (2) year period preceding the expiration of the rights.
- Licensing Requirements
Digital replication rights could be licensed by both adults and minors, but the duration requirements would differ for licenses involving adults versus minors, and licenses involving minors would be subject to additional requirements.
- Infringing Acts
Under the proposed Act, the following would constitute infringing acts:
- Producing a digital replica without consent of the individual (or right holder); and
- Publishing, reproducing, displaying, transmitting, or otherwise making available to the public, a digital replica without consent of the individual (or right holder).
- Remedies
If a civil action is brought within three (3) years of when the infringing conduct was discovered, the potential remedies include:
- Monetary damages
- Statutory damages: $5,000 per work (unauthorized digital replica) if the violating entity is an individual; $5,000 per violation if the violating entity is an online service provider; and $25,000 per work for entities that are not an online service; OR
- Actual damages: any actual damages suffered by the injured party, plus profits from unauthorized use
- Injunctive or other equitable relief
- Punitive damages
- Attorney’s fees
The attorneys at Lewis Rice have extensive experience in artificial intelligence and intellectual property law. If you have any questions about how the NO FAKES Act, deep fakes, and the rising use of artificial intelligence might impact your business or intellectual property rights, please contact a member of Lewis Rice’s Intellectual Property practice group.