Provides an overview of intellectual property principles (patent, copyright, and trademark law) for Spanish-language artists and entrepreneurs. This presentation is an introduction to intellectual property and how it is protected legally through patents (for an invention, meaning a product, apparatus, or process that offers a technical or scientific solution), copyrights (for artistic, literary, or expressive works), and trademarks (symbols, names, phrases, or designs attached to a product or service that identify a specific business).

The ™️ and ©️ symbols are more than just formalities – used correctly, they can be a competitive edge for your brand!

Learn how to build business success and brand equity with Irwin IP attorneys Jason Keener and Suet Lee.

On April 29, 2025, the Federal Circuit adopted a test from the Trademark Trial and Appeal Board (“TTAB”) for determining whether a color mark is generic.  Under the test, the Federal Circuit affirmed that a color mark for medical examination gloves did not function as a source indicator, lacked sufficient evidence of acquired distinctiveness, and as such was not entitled to trademark protection. 

Medisafe is a medical glove manufacturer and distributor.  Medisafe applied for a color mark used in connection with its medical examination gloves, namely, “the color dark green (Pantone 3285 c) as applied to the entire surface of the goods which consist of chloroprene examination gloves.”  But the trademark examiner refused to register the mark, finding that the mark was not inherently distinctive and that, to be registered, the color mark required a showing of acquired distinctiveness.  Medisafe submitted a declaration along with color photographs and advertisements showing competitive goods in the industry, but this did not convince the examiner that the mark had acquired distinctiveness.  Medisafe responded to the rejection with additional declarations, but the examiner found that evidence insufficient.  The examiner relied on the two-step test set forth in H. Marvin Ginn Corp. v. International Ass’n of Fire Chiefs, Inc., 782 F.2d 987, 990 (Fed. Cir. 1986) in determining that Medisafe’s mark was generic.  Under that test, the examiner must (1) identify the genus of goods or services at issue and then (2) determine whether the term sought to be registered is understood by the relevant public to refer primarily to that genus of goods or services. 

Medisafe appealed to the TTAB, which applied a slight variation of the Marvin Ginn test tailored to the analysis of color marks, as set out in Milwaukee Electric Tool Corp. v. Freud America, Inc., 2019 WL 6522400 (T.T.A.B. Dec. 2, 2019).  Milwaukee considered (i) the genus of goods or services at issue, and (ii) whether the color sought to be registered or retained on the register is understood by the relevant public primarily as a category or type of trade dress for that genus of goods or services.  The TTAB rejected Medisafe’s proposed genus (gloves sold to authorized resellers) in favor of a broader genus encompassing all chloroprene medical examination gloves.  The TTAB agreed with the examiner that the relevant public includes “all such people or businesses who do or may purchase chloroprene medical examination gloves” and that the mark was generic because it “is so common in the chloroprene medical examination glove industry that it cannot identify a single source.”  The TTAB found all 25 screenshots of third parties selling gloves in the same or a nearly identical color to be probative of genericness because “[t]he relevant consumer – even including unspecified ‘authorized resellers’ – could be exposed to . . . gloves that appear under a large number of third-party marks without identifying [Medisafe] as the source or manufacturer.”  Medisafe appealed that decision. 

The Federal Circuit affirmed that the TTAB had correctly applied the Milwaukee test.  The Federal Circuit held that, although a color mark may serve as a source indicator, substantial evidence supported the TTAB’s finding that Medisafe’s proposed mark failed to function as a trademark.  As a case of first impression, the decision provides controlling precedent for evaluating the genericness of color marks. 

The Digital Replica Accountability and Identity Protection Act

Preamble

Purpose: To protect individuals against the unauthorized creation and distribution of realistic digital deepfakes – computer-generated replicas of a person’s name, image, voice, or likeness – while safeguarding free expression, artistic creativity, and technological innovation. This Act establishes clear personal Name, Image, Voice, and Likeness (NIVL) rights as a form of intellectual property, provides a balanced notice-and-takedown mechanism (with counter-notice) modeled on the DMCA to prevent abuse, and holds platforms and creators accountable for harmful misuses of generative AI. It applies broadly to commercial and non-commercial actors alike (including individuals, platforms, and nonprofits), and sets a federal floor of protection for personal likeness rights without unduly infringing on First Amendment freedoms.

Findings

Congress finds the following:

1. Rising Threat of Deepfakes: Advances in artificial intelligence have enabled the creation of “digital replicas” (deepfakes) that can realistically depict individuals doing or saying things they never did. While AI offers creative and educational benefits, its misuse poses serious risks to privacy, reputation, and security. Notably, a large portion of deepfake content has been exploitative or harmful (e.g. non-consensual intimate imagery and impersonation for fraud), causing emotional, reputational, and economic injury to victims.

2. Inadequacy of Existing Law: Existing laws (such as defamation, fraud, or copyright) address some harms but leave gaps regarding purely synthetic but false portrayals of individuals. The patchwork of state “right of publicity” laws is inconsistent, with some states providing strong protections and others none. This lack of a clear federal standard enables bad actors to exploit generative AI across state lines, undermining individual privacy and control over one’s own likeness.

3. Need for Personal NIVL Rights: Individuals deserve a personal property right in their name, image, voice, and likeness (NIVL) to prevent unauthorized digital impersonations. Establishing these as federally recognized intellectual property rights empowers individuals to control commercial and non-commercial exploitations of their identity. Congress has authority under the Commerce Clause to regulate these uses, which affect interstate commerce and consumer trust. However, such rights must be carefully balanced with free speech; unqualified or overly broad restrictions on deepfake creation risk chilling protected expression and would likely fail strict scrutiny.

4. First Amendment Safeguards: Lawful expressive uses of a person’s likeness – including parody, satire, commentary, criticism, news reporting, education, art, and other transformative or fictional works – have significant social value and are protected by the First Amendment. For example, AI-generated parodies of public figures (such as satirical “deepfake” videos of public officials or celebrities) and creative works (like a museum’s digital reenactment of a historical figure) are forms of expression. This Act therefore exempts such uses from liability, providing a clear list of protected categories and a constitutional “savings clause” to ensure no infringement of free speech rights.

5. Learnings from Prior Proposals: Recent legislative efforts (e.g. the NO FAKES Act of 2025) highlighted both the importance of protecting digital likeness and the pitfalls of an overly rigid approach. Notably, critics warned that the NO FAKES Act lacked a simple counter-notice mechanism and would force speakers to file suit within 14 days to contest takedowns, creating a “heckler’s veto” where platforms remove content to avoid liability. Additionally, allowing broad, long-term licensing or transfer of one’s digital likeness (for up to 10-year exclusive terms) could alienate individuals from their own identity. A 70-year post-mortem term in earlier proposals was criticized as unjustifiably long, burdening speech about the deceased and raising constitutional concerns. This Act addresses those critiques by implementing a DMCA-modeled notice-and-counter-notice process to prevent abuse of takedowns, by imposing strict limits on licensing (especially for minors), and by adopting a shorter, renewable post-mortem rights period in line with the diminishing interest after death.

6. Platform Responsibility and Safe Harbor: Online platforms and services play a key role in the dissemination of AI-generated content. While Section 230 of the Communications Act provides important protections for platforms hosting user content, it does not immunize intellectual property violations (47 U.S.C. §230(e)(2)). Consistent with that principle, this Act clarifies that claims under these personal likeness rights are not subject to Section 230 immunity. At the same time, the Act creates a safe harbor for platforms that expeditiously remove reported infringing deepfakes and honor counter-notices, similar to the DMCA’s balanced approach. This ensures platform accountability for NIVL violations while minimizing incentives to over-censor lawful speech.

7. Federal Floor, Not Ceiling: In establishing federal NIVL protections, Congress intends to set a minimum baseline that does not preempt stronger state laws. States that provide more expansive rights or remedies for misuse of one’s likeness (such as broader privacy torts or higher damages) may continue to enforce those laws; however, weaker state protections that fall below the standards of this Act are preempted. This approach creates uniform basic rights nationwide while allowing states to innovate further in combating persona-based harms.

Section 1. Short Title

This Act may be cited as the “Digital Replica Accountability and Identity Protection Act.”

Section 2. Definitions

For purposes of this Act:

“Individual” – A living natural person. For rights that continue after death, “individual” also refers to the deceased person (to the extent post-mortem rights are recognized by this Act) and to their rightful heir or estate representative as the right holder.

“Name, Image, Voice, or Likeness (NIVL)” – The personal identifying attributes of an individual, including their name, image, photograph, visual likeness, voice, or any other distinctive identifiable aspect of their persona. This includes a person’s signature, gestures, or mannerisms, to the extent they identify the person. These attributes are considered property rights of the individual for purposes of this Act.

“Digital Replica” – Any computer-generated or electronically simulated depiction of an individual’s likeness, image, voice, or other identifying characteristic that realistically appears or sounds like the individual. This includes any video, audio, image, or other recording that has been created, altered, or synthesized using AI or other digital means such that it imitates the real person in a lifelike manner, especially in scenarios where the real person did not perform or utter the content portrayed. A digital replica may be entirely fictional (e.g. an AI-generated video of the person) or a modified version of real footage or audio (e.g. dubbing or face-swapping). For avoidance of doubt, a simple reference to a person’s name or a caricature that is not a realistic depiction does not by itself constitute a “digital replica” under this Act.

“Authorize” – To grant permission in a written, signed agreement for a specific use of one’s NIVL or digital replica. “Unauthorized” means without the consent of the individual (or of the right holder, if applicable). A use is considered unauthorized if it falls outside the scope of an agreement or license that meets the requirements of this Act.

“Right Holder” – The person or entity who holds an individual’s rights under this Act. During a person’s life, the individual themself is the right holder. After an individual’s death, the heir, estate, or assign to whom the post-mortem rights have lawfully passed (via a valid will or through intestate succession) is the right holder, for the limited duration of post-mortem rights recognized by this Act.

“Interactive Computer Service” or “Service Provider” – An entity offering services that store, display, distribute, or transmit user-generated content on the internet, including but not limited to social media platforms, content-sharing websites, forums, web hosting providers, cloud storage services, and other online services that allow users to upload or share material (as defined in 47 U.S.C. §230). This term encompasses both commercial platforms and non-profit or educational services that facilitate user content.

“Takedown Notice” – A formal notification sent by a right holder (or their authorized agent) to a service provider or content publisher, alleging that a specific content item contains an unauthorized digital replica or misuse of the sender’s NIVL, and requesting its removal or disabling. To be valid under this Act, a takedown notice must substantially comply with the requirements in Section 5(a).

“Counter-Notice” – A formal response by a user (content provider) to a takedown notice, asserting that the targeted content was removed or disabled as a result of mistake or misidentification, or is not infringing (for example, because it is authorized or falls under an exemption). A valid counter-notice must meet the requirements of Section 5(b).

“Person” – An individual, corporation, company, partnership, firm, association, or any other legal entity. (In defining who may be liable or who may enforce rights, “person” can include entities such as media organizations or nonprofits as well as natural persons.)

Section 3. Rights in Name, Image, Voice, and Likeness

(a) Recognition of Exclusive Rights. Every individual has exclusive rights in their own name, image, voice, and likeness. These rights are herein recognized as a form of intellectual property (specifically, a right of publicity in digital replicas of one’s persona) belonging to the individual. Except as otherwise provided in this Act, no person may create, use, distribute, or profit from a digital replica of an individual without that individual’s authorization. Unauthorized use of another’s NIVL in a digital replica constitutes an infringement of the individual’s rights under this Act, subject to the remedies and exceptions outlined below. This restriction applies regardless of whether the use is commercial or non-commercial in nature – the focus is on the lack of consent and potential for harm or confusion, not merely on profit. An individual’s NIVL shall be considered their personal property, freely assignable and licensable (subject to subsections (c) and (d) below), and infringement of these rights is actionable as outlined in Section 6.

(b) Scope of Protection – Digital Replicas and Other Uses. The rights established by this Act protect individuals against: (1) Digital replicas: the unauthorized creation or dissemination of deepfake audiovisual or audio content that falsely depicts the individual. (2) Misappropriation of persona: other unauthorized uses of an individual’s name, voice, image, or likeness in any context (including still images or written impersonations) where such use is intended to mislead the public into believing the individual engaged in the depicted conduct or endorsed the content. However, works of fiction or satire that depict a person as a character are generally exempted under Section 4, as are uses that merely reference a person (such as name-dropping) without creating a realistic impression of their actual appearance or voice. In determining whether a use is infringing, the key question is whether the content in question would cause a reasonable observer to believe it is actually the real individual speaking or appearing, when it is not. If so – and the use is outside the exemptions of Section 4 – it falls within the protection of this Act.

(c) Transferability and Post-Mortem Rights. The rights in an individual’s NIVL are freely transferable or descendible in accordance with this subsection:

During Life: The individual (as the original right holder) may license or transfer their rights to others by contract, subject to the Licensing Requirements in subsection (d). Any transfer not meeting those requirements is voidable. The individual retains the right to terminate or renegotiate licenses as provided by law and contract.

Upon Death: The rights survive the individual’s death and pass to their estate or heirs. By default, the executor or heirs (as specified by will or by intestate succession) become the new right holder(s) of the decedent’s NIVL rights. However, these post-mortem rights are limited in duration as described below.

Duration of Post-Mortem Rights: Post-mortem rights in a deceased individual’s NIVL shall endure for 20 years from the date of death of the individual. Prior to the expiration of the 20-year term, the right holder (heir or estate) may apply to renew the rights once for an additional 10-year period, provided that the individual’s NIVL has been actively used in commerce or protected during the initial term. In particular, the estate must demonstrate that within the last 2 years of the initial 20-year term, there was authorized public use or commercial exploitation of the individual’s likeness or voice (e.g. licensed products, digital performances, etc.). If such use is shown and renewal is timely registered (with the Copyright Office or designated office), the rights shall extend an additional 10 years. No further renewals are permitted beyond this single extension – thus, in no case shall post-mortem rights extend beyond 30 years after death. After expiration of the post-mortem term (20 years if not renewed, or 30 years if renewed), the individual’s NIVL enters the public domain for purposes of this Act (meaning no exclusive rights are enforceable under this Act, though other laws like trademark or false endorsement may still apply in limited cases). (See the illustrative computation at the end of this subsection.)

Termination of Post-Mortem Rights: If the right holder does not secure a renewal (or does not wish to renew) after the initial 20-year period, the federal rights established by this Act terminate upon the 20-year mark. Even if renewal is obtained, all rights terminate at the 30-year mark post-mortem, regardless of continued use. Earliest Termination: If for any reason the right holder ceases to exist or the estate is dissolved before the term ends, and no successor or assign is available, the rights shall terminate earlier. Moreover, if a right holder affirmatively waives or releases the rights before the term ends, those rights shall likewise terminate (and fall into the public domain for such uses).

Rationale: Congress finds that lengthy post-mortem rights (such as 70 years in some proposals) are unnecessary and burden free expression about deceased persons. A 20-year term, extendable to at most 30 years with demonstrated use, strikes a balance between allowing heirs to commercially benefit from or protect their loved one’s legacy and ensuring that the identities of long-deceased individuals eventually become available for creative works, historical fiction, and public discourse without legal impediments.
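
The duration rules in subsection (c) reduce to a small computation: a 20-year base term, one optional 10-year renewal conditioned on qualifying use, and a 30-year absolute ceiling. The following Python sketch is illustrative commentary only, not statutory text; the function names and the boolean inputs for renewal registration and qualifying use are modeling assumptions.

```python
from datetime import date

POST_MORTEM_BASE_YEARS = 20   # initial post-mortem term under Section 3(c)
RENEWAL_YEARS = 10            # single optional renewal
# Absolute ceiling: 20 + 10 = 30 years; no further renewals are permitted.

def add_years(d: date, years: int) -> date:
    """Return the same calendar date `years` later (Feb 29 maps to Feb 28)."""
    try:
        return d.replace(year=d.year + years)
    except ValueError:
        return d.replace(year=d.year + years, day=28)

def post_mortem_expiration(date_of_death: date,
                           renewal_timely_registered: bool,
                           qualifying_use_in_final_two_years: bool) -> date:
    """Date on which federal post-mortem NIVL rights lapse for a decedent.

    The renewal counts only if it was timely registered AND the estate can show
    authorized public or commercial use within the last 2 years of the base term.
    """
    if renewal_timely_registered and qualifying_use_in_final_two_years:
        return add_years(date_of_death, POST_MORTEM_BASE_YEARS + RENEWAL_YEARS)
    return add_years(date_of_death, POST_MORTEM_BASE_YEARS)

# Example: death in 2030 -> rights lapse in 2050, or 2060 with a valid renewal.
print(post_mortem_expiration(date(2030, 6, 1), False, True))  # 2050-06-01
print(post_mortem_expiration(date(2030, 6, 1), True, True))   # 2060-06-01
```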

(d) Licensing and Consent Requirements. An individual may authorize the creation or use of their digital replica or other uses of their NIVL via a license or contract. To be valid and enforceable (against the individual or any party), any license of rights under this Act must meet the following conditions:

1. Written Agreement: The license or consent must be in a written contract or documented agreement signed by the individual (or by an authorized representative of the individual, such as an agent or attorney-in-fact). Electronic signatures are acceptable if legally valid. No oral licenses or implied consent (aside from the exemptions in Section 5) shall suffice for authorizing a digital replica.

2. Specificity of Scope: The agreement must include a reasonably specific description of the intended uses of the individual’s NIVL. Blanket or open-ended licenses (“any and all uses”) are disfavored; the uses authorized should be described with enough detail (e.g. the particular film, game, advertisement, or context in which the replica will appear, and the nature of the portrayal) to inform the individual of what they are agreeing to. Any use outside the scope of this description is not covered by the license and remains unauthorized.

3. Duration Limit: For any license granted by a living individual, the license term shall not exceed 10 years from the date of execution, unless earlier terminated by its own terms. Even if the contract purports to grant rights for longer, it will be construed to expire after 10 years by operation of law. The individual may renew or extend the license by a new agreement at or near the end of the term if they so choose, but no single agreement can lock up the individual’s rights for more than 10 years. This safeguard prevents individuals from inadvertently losing control of their persona for excessive durations via a one-time contract.

4. No Assignment Without Consent: The license cannot be transferred or assigned by the licensee to any third party (nor may any sublicense be granted) unless the original contract expressly permits it and the individual explicitly consents to the particular transfer. In other words, one cannot gain rights to someone’s likeness via a boilerplate license and then sell or give those rights to another entity without the person’s knowledge and consent. Any unauthorized transfer or sublicensing is void.

5. Right to Revocation for Material Breach or Misuse: The individual (or right holder) retains the right to revoke the license if the licensee materially breaches the agreement or materially misuses the individual’s likeness in a way that was not contemplated (for instance, using the digital replica to create intentionally misleading or harmful content not agreed upon). Contracts may include specific terms for revocation, but at minimum, if a licensee’s use of the likeness violates the law or the agreed restrictions, the individual may terminate the license and prohibit further use.

6. Special Protection for Minors: In the case of an individual who is a minor (under 18 years old), additional safeguards apply. A minor’s parent or legal guardian may consent on the minor’s behalf to a license only with court approval: any license of a minor’s NIVL rights must be approved by an appropriate state court (e.g. a probate or family court), consistent with state laws on minor contracts in entertainment, to ensure the terms are in the minor’s best interest. Furthermore, no license involving a minor may last beyond 5 years, and in any event it terminates automatically when the minor turns 18 (at which point the now-adult individual can choose to affirm or re-negotiate it on their own). These measures protect minors from being bound by onerous contracts that exploit their identity before they are adults – a direct response to concerns about minors potentially signing away rights they don’t fully understand.

7. Non-Waivability of Requirements: The above requirements (written form, specific scope, duration limit, etc.) are mandatory conditions for any license or assignment of rights under this Act. Parties cannot waive these protections by contract. Any contract term purporting to waive or circumvent these statutory requirements is against public policy and void. (For example, a contract clause stating the individual “waives all rights under the Digital Replica Act” or agrees that the license “shall be irrevocable and perpetual notwithstanding any law” would not be enforceable.)

Discussion: These licensing provisions ensure informed consent and prevent abuses. They incorporate and strengthen concepts from the NO FAKES Act (e.g. the requirement of a signed writing and a specific use description), closing loopholes that could allow a one-click online agreement to tie up someone’s likeness for a decade. By capping license length and requiring explicit approval (especially for minors), the Act aims to protect individuals’ autonomy over their identities while still permitting legitimate projects (films, games, etc.) to contract for the use of a person’s likeness in a controlled manner.
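
As a rough illustration of how the conditions in subsection (d) combine, the Python sketch below flags the purely mechanical defects: no signed writing, no specific scope, a term over 10 years (5 years for a minor), a minor's license without court approval, or a purported waiver of the statute. It is commentary only; the `License` fields and function names are assumptions made for modeling, and passing these checks does not make a license substantively valid.

```python
from dataclasses import dataclass

MAX_TERM_YEARS_ADULT = 10   # Section 3(d)(3)
MAX_TERM_YEARS_MINOR = 5    # Section 3(d)(6)

@dataclass
class License:
    signed_writing: bool          # written and signed, Section 3(d)(1)
    scope_description: str        # reasonably specific description of uses, 3(d)(2)
    term_years: float             # requested duration in years
    licensor_is_minor: bool = False
    court_approved: bool = False  # required for a minor's license, 3(d)(6)
    waives_statute: bool = False  # purported waiver of these requirements, 3(d)(7)

def license_defects(lic: License) -> list[str]:
    """Return mechanical defects that would void or cut short the license."""
    defects = []
    if not lic.signed_writing:
        defects.append("no signed written agreement (3(d)(1))")
    if not lic.scope_description.strip():
        defects.append("no reasonably specific scope of use (3(d)(2))")
    cap = MAX_TERM_YEARS_MINOR if lic.licensor_is_minor else MAX_TERM_YEARS_ADULT
    if lic.term_years > cap:
        defects.append(f"term exceeds the {cap}-year cap; construed to expire at {cap} years")
    if lic.licensor_is_minor and not lic.court_approved:
        defects.append("minor's license lacks required court approval (3(d)(6))")
    if lic.waives_statute:
        defects.append("waiver of statutory protections is void (3(d)(7))")
    return defects

# Example: a 12-year, court-unapproved license of a minor's likeness fails twice over.
print(license_defects(License(True, "voice replica in one video game title", 12,
                              licensor_is_minor=True)))
```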

(e) Relationship to Other Intellectual Property: The rights provided in this Section are intended to coexist with other intellectual property rights and privacy rights. They do not supersede copyright, trademark, or patent law, but an individual’s exercise of NIVL rights might overlap with those domains (for example, a photo of a person might be someone’s copyright, and also the person’s likeness – both rights may need to be cleared). Use of a digital replica could give rise to trademark-like claims (false endorsement) or copyright claims (if using protected footage); this Act provides an additional cause of action focused on the personal likeness aspect. Nothing in this Act shall be construed to limit any rights an individual may have under copyright (for example, if they authored a performance) or trademark (if their personal name or image functions as a mark). However, a plaintiff may not recover duplicate damages under this Act and another legal theory for the same conduct – they must choose a theory or the court will ensure no double recovery for the same injury.

Section 4. Permitted Uses and Exemptions

Not all uses of a person’s likeness are wrongful. To preserve freedom of expression and robust public discourse, the following exemptions delineate uses of an individual’s name, image, voice, or likeness that do not require consent and do not incur liability under this Act, provided that the use is not a disguised commercial advertisement and does not falsely claim to be authorized by or represent the actual person (as opposed to a portrayal or discussion of them). These exemptions shall be construed liberally in favor of free expression, and the burden is on the plaintiff to prove that a particular use is not exempt. Examples are given to illustrate the type of use intended to be protected:

1. News, Public Affairs, and Sports Reporting: Bona fide news reporting, journalism, or commentary on matters of public interest is exempt. This includes use of an individual’s likeness or voice in reporting on breaking news, political events, social issues, or sports. For example, a TV news program or documentary may use an AI-generated voice or reenactment to illustrate a quote or scenario in the news (such as recreating an interview if the original audio is unavailable), provided it is clearly part of a news report and not presented as an authentic recording. Similarly, a sports broadcast or news website could show a digitally recreated scene of a past sports play or interview for analysis. Uses in this category are protected so long as they are informative or journalistic in nature and not an attempt to commercialize the person’s likeness beyond the news context.

2. Commentary, Criticism, Satire, or Parody: Expressive works that comment on, critique, or parody an individual or societal phenomenon are exempt. This covers a wide range of creative content: comedy sketches, satire videos, memes, editorial cartoons, or social commentary that use a digital replica for humor or critique. For instance, an internet creator might produce a satirical deepfake of a public figure (such as a parody video of a tech CEO appearing to say humorous things about their company, or a fake movie trailer featuring a politician in absurd scenarios) – such parody is protected as long as a reasonable viewer would understand it is a joke or commentary and not a real event. Likewise, critical commentary might include an AI-generated impersonation to highlight the person’s stance or to critique the individual’s behavior. Even if done in a sharp or biting manner, such use is exempt as a form of criticism or parody. Example: The viral deepfake of Facebook’s CEO announcing absurd policies was a form of commentary on data privacy; similarly, an online parody of an actor “endorsing” a silly product can be exempt if it’s clearly a spoof.

3. Artistic and Creative Works (Expressive Uses): Any transformative, expressive, or creative work in which an individual’s likeness or voice is depicted, but the work as a whole is primarily artistic or literary in nature rather than an advertisement or mere trade exploitation, is exempt. This includes films, TV shows, books, music, theater, visual art, or videogames that depict real people or fictional characters based on real people, so long as the work is not simply a product endorsement. For example, a historical drama might use a digital replica of a past public figure to authentically portray events, or a biographical film could recreate a deceased actor’s likeness for certain scenes – these would be exempt as expressive works. Likewise, a painting or digital artwork that incorporates a real person’s image, or a song that samples a celebrity voice via AI for artistic effect, falls under this exemption if the use is part of an original expressive creation. A key consideration is whether the work is offering new insight, message, or meaning (thus protected), versus merely appropriating the identity for profit. Even fictional works and satire featuring a person (for instance, a novel or animation that includes a celebrity character in a story) are generally exempt, as they are creative expressions.

4. Educational or Scholarly Use: Use for bona fide scholarship, research, or education is exempt. This includes classroom uses, academic presentations, or research projects where a person’s likeness is used for illustrative or analytical purposes. For instance, a university lecture about speech synthesis might use an AI voice clone of a famous figure to demonstrate the technology. Or a scholarly publication on media could include a generated image of a historical figure to discuss deepfake techniques. As long as the use is non-commercial and aimed at knowledge dissemination or analysis, it is allowed. Additionally, libraries, museums, or archives that might create digital exhibits involving historical figures (e.g. an interactive display where a historical figure “speaks” via AI) are covered here, provided it’s educational and not a commercial advertisement.

5. Incidental or De Minimis Use: If an individual’s name or likeness is included incidentally or as a de minimis element of a work, it is not actionable. This means that a fleeting or background use that is not central to the work’s value is exempt. For example, if a documentary about New York City uses a few seconds of archival footage where a random bystander (who happens to be identifiable) is walking by, that is incidental. In the AI context, if a generative algorithm unintentionally produces a face that happens to resemble a real person in the background of a scene, that would be an incidental use (assuming it’s not deliberate portrayal). Only deliberate, meaningful portrayal of a person triggers the rights – purely incidental appearances do not.

6. Political, Satirical, or Critical Speech: Any use of an individual’s likeness that is part of a political or social discourse – for example, commentary on public officials or public figures – is strongly protected. Deepfakes used in an election context for satire or critique (such as impersonating a candidate to ridicule their position) are exempt, except if done with actual malice to defraud or incite imminent lawless action (which would fall outside First Amendment protection regardless). This category overlaps with commentary and parody, but emphasizes that public issue speech is at the core of protected expression. Note: Deliberate disinformation deepfakes intended to deceive voters (e.g. a fake video of a candidate making a false confession, presented as real) would not be protected by this clause because the intent is to deceive rather than comment – those would be actionable unless clearly marked as parody.

7. Any Other Expressive Work Protected by the First Amendment: The above categories are illustrative. In general, any use of a digital replica that is transformative, informational, or critical in nature, such that it constitutes protected speech under the First Amendment, is exempt from liability. This catch-all ensures that if a use doesn’t neatly fit the listed categories but is nonetheless legitimate expression (for example, a fictional podcast that uses an actor’s voice in a story, or an AI-generated hologram in a theater performance), it will be protected so long as it doesn’t mislead the public into thinking it’s the actual person outside of an expressive context. Courts shall interpret this exemption in line with First Amendment jurisprudence on artistic relevance, transformative use, and the distinction between commercial speech and expressive speech.

8. First Amendment Savings Clause: Nothing in this Act shall be construed to impede or chill speech protected under the U.S. Constitution. If the application of any provision of this Act to a particular use of an individual’s NIVL would violate the First Amendment, such use shall be exempt from liability. This Act is intended to punish false or misleading impersonations and commercial misappropriation, not to punish satire, criticism, or other protected expression. The exemptions in this Section 4 should be interpreted broadly to avoid any unconstitutional result. In any action brought under this Act, a defendant may raise as an affirmative defense that their use is protected expression under the First Amendment or falls within one of the above exempted categories; if so, the burden shifts to the plaintiff to prove that the use was a purely exploitative or misleading one not entitled to protection.

Illustrative Examples (non-exhaustive):

Parody Video: A comedian uses AI to create a video of a famous actor “reviewing” absurd products as a joke. This is parody/criticism and exempt, even if done without the actor’s consent, because it’s clearly not real and comments on the actor’s persona.

News Documentary: A documentary film uses AI to recreate the voice of a late historical figure reading their letters, to enhance the storytelling. This is an expressive historical work, exempt as artistic/educational use.

Deepfake Art Installation: An artist creates an installation where visitors can interact with a deepfake of a past president to explore historical what-ifs. Artistic and educational – exempt.

Unauthorized Endorsement (Not Exempt): In contrast, if a company or individual creates a deepfake of a celebrity seemingly endorsing a product or service without consent, that is not exempt – it’s a commercial use likely causing confusion as to endorsement, and the celebrity (or their estate) could sue under this Act (in addition to trademark law). Likewise, a deepfake created simply to humiliate or harass a private individual (such as a fake explicit video circulated without consent) has no expressive value or public interest – it would not fall under any exemption and would be fully actionable.

These exemptions align with those found in the Preventing Abuse of Digital Replicas Act (PADRA) and similar proposals, and reflect a consensus that bona fide news, commentary, satire, education, and art must remain free from undue legal burdens. By explicitly listing protected contexts, the Act provides clarity to content creators and platforms about what is allowed, thereby preventing the “chilling effect” on legitimate speech that an overbroad law might cause.

Section 5. Notice-and-Takedown Mechanism for Online Content

To facilitate the efficient enforcement of NIVL rights while minimizing court involvement and protecting lawful speech, the following notice-and-takedown procedure is established, modeled on the Digital Millennium Copyright Act’s 17 U.S.C. §512 safe harbor system. This mechanism allows individuals to seek rapid removal of infringing deepfake content from online platforms, and allows users who believe their content was wrongly removed to seek restoration, all without immediate litigation. Service providers that comply with this procedure in good faith will receive safe harbor protection from monetary liability (as detailed in Section 6), thereby encouraging cooperation and balanced outcomes.

(a) Takedown Notice Procedure:

1. Submitting a Takedown Notice: If an individual (or their authorized agent) discovers an online content item that uses their NIVL in an unauthorized digital replica in violation of this Act, they may submit a Takedown Notice to the service provider hosting or displaying that content. The notice should be sent to the provider’s designated agent for such complaints (if the provider has a registered agent similar to a DMCA agent) or, if no agent is designated, to a readily reachable contact (such as a published email or webform for content complaints).

2. Contents of Notice: To be effective, the Takedown Notice must be in writing (electronic or paper) and include: (A) Identification of the individual whose rights are at issue, and a statement that the sender is the individual or an authorized representative (with proof of authority if the representative is sending, such as a signed letter or power of attorney). (B) Identification of the infringing content sufficient to locate it on the service provider’s system – e.g., a URL of the specific video or post, or other information like the account name and description of the content. (C) Description of the violation: an explanation of how the content depicts a digital replica of the individual without consent. For example, “The video depicts a computer-generated version of my face and voice performing actions I never did, and I did not authorize this.” (D) Statement of unauthorized status: a clear statement that the use of the person’s name/image/voice was not authorized by the individual (and not otherwise permitted by law or any exception). (E) Contact information: for the complaining party – including name, address, phone, and email – so the service provider and the content uploader can contact them. (F) Good faith and accuracy certification: a statement by the notice sender that “I have a good faith belief that the use of the identified content is not authorized by me (or the law) and that the information in this notice is accurate, under penalty of perjury.” (Similar to the DMCA requirement, this affirms the truthfulness of the claim.) (G) Signature: a physical or electronic signature of the individual or their agent. A typed name at the end of an email can suffice as an electronic signature. (An illustrative schematic of these required fields appears at the end of this subsection.)

3. Provider’s Response to Notice: Upon receiving a conforming Takedown Notice, the service provider shall promptly (and in good faith) remove or disable access to the identified content. “Promptly” generally means as soon as reasonably possible, and no later than the timeline required under any applicable regulations – for example, removal within 48 hours is encouraged in most cases. The provider should also, without delay, notify the user/uploader who provided the content that it has been taken down pursuant to a notice under this Act. This notification to the user should include a copy of the takedown notice (or at least the core details, excluding the complainant’s sensitive info if appropriate) and inform the user of their right to submit a counter-notice if they believe the takedown was erroneous.

4. Designated Agent and Accessibility: Service providers covered by this Act are strongly encouraged (and may be required by regulation) to designate an agent to receive Takedown Notices, similar to the DMCA’s Section 512(c) requirement, and to publish this agent’s contact information on their website. The Copyright Office (or another designated federal office) may create a directory of such agents. The goal is to make it easy for individuals to locate where to send a notice. If a provider has not designated an agent, sending the notice to a publicly known contact point (corporate address, official email) shall not be grounds to claim improper notice.

5. Voluntary Take-down by Platforms: Nothing in this Act prevents a service provider from removing or disabling content that it independently discovers to be a likely violation of someone’s NIVL rights (for instance, through content moderation or user flagging) even before any formal notice, if done in good faith. However, such removal should still allow for the counter-notice process described below so that users have recourse if a mistake was made.
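
The required elements in paragraph (a)(2) above amount to a small structured record. The sketch below is illustrative only; the field names and the `is_facially_valid` helper are assumptions about how a provider's intake system might check that a notice is facially complete before removal is triggered, and it says nothing about whether the content is actually infringing or exempt under Section 4.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TakedownNotice:
    # Loosely modeled on Section 5(a)(2)(A)-(G)
    identified_individual: str         # (A) whose NIVL is at issue
    sender_is_agent: bool              # (A) sent by an agent rather than the individual
    proof_of_authority: Optional[str]  # (A) required if an agent is sending
    content_location: str              # (B) URL or other locator
    violation_description: str         # (C) how the content is an unauthorized replica
    statement_unauthorized: bool       # (D) use was not authorized or otherwise permitted
    contact_info: str                  # (E) name, address, phone, email
    good_faith_certification: bool     # (F) accuracy certified under penalty of perjury
    signature: str                     # (G) physical or electronic signature

def is_facially_valid(n: TakedownNotice) -> bool:
    """Check only facial completeness of a takedown notice."""
    if n.sender_is_agent and not n.proof_of_authority:
        return False
    required_text = [n.identified_individual, n.content_location,
                     n.violation_description, n.contact_info, n.signature]
    return (all(s.strip() for s in required_text)
            and n.statement_unauthorized
            and n.good_faith_certification)
```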

(b) Counter-Notice Procedure:

1. Eligibility to Counter-Notice: If a user (individual or entity) believes their content was removed or disabled in error – for example, they believe the content is not actually infringing (perhaps because the depicted person did consent, or the content is a parody or otherwise exempt under Section 4, or the likeness is coincidental or not realistic) – they may submit a Counter-Notice to the service provider.

2. Contents of Counter-Notice: A valid Counter-Notice must include: (A) Identification of the removed content (such as the URL or unique identifier that was taken down) and a short description of it. (B) Statement of good faith belief of mistake: e.g., “I swear under penalty of perjury that I have a good faith belief the content was removed due to mistake or misidentification.” The user should briefly explain why they believe the takedown was wrong – for instance, “This video is a parody and thus not actually impersonating the person,” or “I had a license/permission from the individual,” or “The image is not actually [the person] but just looks similar.” (C) Uploader’s contact information: name, address, telephone, and email. (D) Consent to jurisdiction: a statement that the user consents to the jurisdiction of the Federal District Court for the district of their address (or, if outside the U.S., they consent to an appropriate U.S. jurisdiction where the service provider is located), and will accept service of process from the complainant who submitted the takedown notice (this is borrowed from the DMCA’s requirement, to ensure the counter-notice sender can be sued in the U.S. if the matter proceeds to court). (E) Signature: physical or electronic signature of the user (again, a typed signature in an email can suffice).

3. Provider’s Response to Counter-Notice: Upon receiving a valid Counter-Notice, the service provider shall promptly forward a copy to the original complainant (the right holder who sent the takedown). Then, within 10 business days after receipt of the counter-notice (unless a different timeframe is specified by regulation), the provider must restore the content (re-enable access) unless the provider receives notice from the original complainant that they have filed an action seeking a court order to restrain the user from re-posting the content. In other words, the burden shifts to the complainant to initiate a lawsuit if they want the content to remain down after a counter-notice. This mirrors the DMCA’s approach: content comes back online unless the complainant goes to court within a short window. If the complainant does timely file a lawsuit and notifies the provider, the provider should keep the content down pending the outcome (or further order). If no such notice of a lawsuit is given within 10 business days (roughly 14 calendar days) of forwarding the counter-notice, then the provider should re-enable the content. The provider should inform the counter-notice sender when the content is restored. (The restore window is illustrated in the sketch at the end of this subsection.)

4. Misrepresentations and Abuse: To deter bad-faith use of this system, any person who knowingly and materially misrepresents in a takedown notice that content is infringing when it is not, or misrepresents in a counter-notice that content was removed by mistake when it actually was infringing, shall be liable for damages incurred by the other party as a result of the misrepresentation. This is akin to 17 U.S.C. §512(f). For example, if someone files a false takedown to harass a creator (knowing the content was actually authorized or exempt) and the creator incurs costs or loses revenue, the bad-faith complainant can be held liable for those damages and possibly attorney’s fees. Similarly, a user who submits a false counter-notice (claiming fair use or consent when they know none exists) that causes the right holder to have to file suit unnecessarily can be liable for those costs. This provision aims to penalize abuse of the takedown system and thereby encourage honest use.

5. No Waiver of Rights: A service provider’s compliance with or offering of this notice-and-takedown process does not by itself constitute an admission of liability for any infringing content. Likewise, a right holder utilizing this process is not waiving any right to sue; they are simply following the procedure as a first step. All parties preserve their right to pursue legal remedies in court if needed.

6. Additional Guidelines: The Register of Copyrights (or another authority designated by Congress, since the Copyright Office often oversees DMCA) may promulgate regulations to fine-tune the notice and counter-notice procedures under this Act, including standardized forms, time frames, and agent directory maintenance. These regulations should be consistent with the principles above and aim to make the process accessible and efficient for individuals (including those who may not be tech-savvy) and service providers.
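
The restore-or-litigate rule in paragraph (b)(3) above is essentially a clock. The sketch below is illustrative only; the 10-business-day window comes from the text, while the helper names (and the decision to ignore holidays) are assumptions.

```python
from datetime import date, timedelta

def add_business_days(start: date, business_days: int) -> date:
    """Advance the given number of weekdays from `start` (holidays ignored)."""
    d = start
    added = 0
    while added < business_days:
        d += timedelta(days=1)
        if d.weekday() < 5:   # Monday=0 .. Friday=4
            added += 1
    return d

def restore_deadline(counter_notice_forwarded: date) -> date:
    """Date by which the provider must restore the content under Section 5(b)(3),
    unless the complainant has given notice that a lawsuit was filed."""
    return add_business_days(counter_notice_forwarded, 10)

# Example: a counter-notice forwarded on Monday, March 2, 2026 must be honored
# by restoring the content on March 16, 2026 absent notice of a filed suit.
print(restore_deadline(date(2026, 3, 2)))   # 2026-03-16
```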

Effect of Compliance (Safe Harbor): A service provider that expeditiously removes content upon notice and properly restores it upon counter-notice (absent a court order), as described, shall be shielded from monetary liability for that content under this Act – i.e., they cannot be sued for damages for having hosted the content prior to removal, nor for taking it down or restoring it per the procedure. In essence, if the provider acts as a neutral intermediary and follows the rules in good faith, they enjoy a safe harbor similar to that in the DMCA. This incentivizes cooperation. However, if a provider fails to comply – for example, ignores a valid takedown notice and leaves clearly infringing content up – it can be exposed to liability (including potential contributory liability for the user’s infringement, once the provider had knowledge via the notice). Likewise, if a provider refuses to restore content despite a valid counter-notice and no lawsuit, a wrongfully removed user might have a cause of action against the provider. But as long as providers stick to the script, they remain largely shielded.

This notice-and-takedown framework directly addresses concerns raised about the lack of a counter-notice in previous legislation. By allowing speech to be quickly restored if a takedown was improper (without immediately forcing a user to go to court), it mitigates the speech-chilling effect that overzealous enforcement could create. At the same time, it gives victims of deepfake abuse a rapid means to remove harmful content and requires them to go to court only if a dispute truly needs adjudication – much like copyright holders under the DMCA. This balance will help ensure that platforms are not turned into “censorship tools” at the mere allegation of a violation, while still promptly addressing genuine violations.

Section 6. Enforcement and Remedies

(a) Civil Cause of Action: An individual (or right holder) whose rights under this Act are violated may bring a civil action in an appropriate United States District Court. The action may be against any person or entity who violated the rights (e.g., the creator of the unauthorized digital replica, the person who knowingly distributed or profited from it, or a platform that failed to comply with takedown obligations after notice). The plaintiff must prove by a preponderance of evidence that: (1) they own or represent the rights in question, (2) the defendant used the plaintiff’s name, image, voice, or likeness in a digital replica without consent, (3) the use is not exempt under Section 4, and (4) the defendant’s actions caused harm (or were likely to cause harm) to the plaintiff’s interests (including dignitary, reputational, or economic harm).

(b) Available Remedies: In a civil action under this Act, the court may award the following remedies:

1. Injunctive Relief: The court may issue an injunction to prevent or restrain the continued creation, distribution, or display of the infringing digital replica. This can include orders to remove or destroy infringing content, orders prohibiting the defendant from future similar violations, and in appropriate cases, mandatory injunctions requiring the defendant to take affirmative steps (such as issuing corrections or public clarifications that a certain deepfake was fake, if necessary to mitigate harm). Injunctive relief may also include ordering the take-down of the content from specific websites or online platforms (where feasible, consistent with First Amendment limits on prior restraint – typically, content determined by a court to be unprotected false impersonation can be ordered down).

2. Monetary Damages:

Actual Damages and Profits: The plaintiff may recover actual damages suffered as a result of the violation. Actual damages can include harm to reputation, emotional distress, and/or economic loss (such as lost endorsement opportunities or diminished commercial value of one’s likeness). In addition, the plaintiff can recover any profits unjustly earned by the defendant that are attributable to the unauthorized use, to the extent such profits are not already counted in actual damages. (For example, if a defendant sold a product using the plaintiff’s likeness, the plaintiff could claim the defendant’s profits from those sales.) The plaintiff must present proof of such damages or profits, and the defendant can attempt to prove any portion of profit not due to the use of plaintiff’s likeness.

Statutory Damages (Optional): In lieu of proving actual damages, a plaintiff may elect to receive statutory damages for each violation, if the court finds a violation occurred. Statutory damages are set at up to $5,000 per distinct unauthorized digital replica, per distribution or performance, with a maximum of $1,000,000 per prevailing party per proceeding. The court has discretion to determine the appropriate amount within these limits based on the severity of the violation, the reach of the content, and whether the defendant’s infringement was willful or merely negligent. For example, if a video was posted and accessed widely, each view is not separately counted; rather, the court might consider the number of copies or uploads. The $1,000,000 cap per case ensures that damages, while potentially significant, are not unbounded for a single claim – though multiple distinct works can each be subject to separate claims.

Aggravated Statutory Damages for Willful or Malicious Conduct: If the court finds that the defendant willfully or maliciously engaged in the unauthorized use – for instance, deliberately creating a deepfake to humiliate someone or with knowledge that it would cause substantial harm – the court may in its discretion award up to triple the standard statutory damages cap (i.e., up to $3,000,000) or consider this willfulness in awarding the higher end of actual damages. This is to strongly deter egregious conduct such as the spread of non-consensual sexual deepfakes or politically motivated deepfake disinformation. Conversely, if the defendant’s infringement was innocent or unknowing (perhaps they reasonably thought a satire was protected), the court can reduce damages, potentially even to zero or nominal damages, especially if a quick retraction was made. (An illustrative computation of these amounts appears after this list of remedies.)

3. Attorney’s Fees and Costs: The court may award reasonable attorney’s fees and full litigation costs to the prevailing party. Generally, a plaintiff who prevails and proves a willful violation should be awarded fees as a matter of course (to encourage rightful claims). A prevailing defendant may be awarded fees if the lawsuit is found to be frivolous or brought in bad faith (to discourage overreaching claims that try to suppress protected speech). This provision is to ensure that individuals who have been wronged can find legal representation (knowing fees might be recouped) and to deter meritless suits intended to silence legitimate expression. The awarding of fees is at the court’s discretion, taking into account the conduct of parties (e.g., whether either party refused a reasonable settlement, etc.).

4. Punitive Damages: In cases of particularly egregious, willful, and malicious violations – for example, a defendant who repeatedly makes harmful deepfakes of the plaintiff even after being enjoined – the court or jury may award punitive damages under applicable state law standards (if the case allows; note that some states or federal claims allow punitive damages to punish malicious conduct). Because this is a federal statutory tort/intellectual property claim, punitive damages would likely be determined by the court’s assessment of common law principles. Such damages are reserved for extreme cases and are intended to punish and deter outrageous conduct. This is separate from statutory damages (which have a cap); punitive damages could exceed those caps if justified, but would likely require a high bar of proof of malice or reckless disregard for the plaintiff’s rights.

5. Other Relief: The court may order any other equitable relief it deems just and proper, such as ordering the impounding or destruction of infringing artificial intelligence models or files (if, say, the defendant has trained a specific model entirely on the plaintiff’s voice, the court might order that model deleted to prevent future misuse). The court could also require credit or disclosure in some contexts as a remedy (e.g., if content is allowed to remain up because it’s borderline, the court might compel an addition of a disclaimer clarifying it’s not real – though courts must be careful with compelled speech, this could be an option if narrowly tailored and agreed upon).
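
Because subsection (b)(2) sets a per-replica amount and hard proceeding-wide caps, the arithmetic can be shown directly. The sketch below is illustrative only; treating willfulness as a simple tripling of the cap, and the list-of-awards input, are modeling assumptions drawn from the figures stated above.

```python
PER_REPLICA_MAX = 5_000       # up to $5,000 per distinct replica, per distribution
BASE_CAP = 1_000_000          # per prevailing party, per proceeding
WILLFUL_CAP = 3 * BASE_CAP    # court may award up to treble the cap for willful conduct

def statutory_damages(per_replica_awards: list[int], willful: bool = False) -> int:
    """Sum the court's per-replica awards and apply the proceeding-wide cap.

    `per_replica_awards` holds one court-determined amount (each <= $5,000)
    for every distinct unauthorized replica/distribution found in the case.
    """
    for amount in per_replica_awards:
        if amount > PER_REPLICA_MAX:
            raise ValueError("an individual award exceeds the $5,000 per-replica ceiling")
    cap = WILLFUL_CAP if willful else BASE_CAP
    return min(sum(per_replica_awards), cap)

# Example: 300 distinct uploads at $4,000 each total $1,200,000, capped at $1,000,000
# for ordinary infringement but payable in full if the conduct was willful.
print(statutory_damages([4_000] * 300))                # 1000000
print(statutory_damages([4_000] * 300, willful=True))  # 1200000
```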

(c) Limitation of Liability for Service Providers (Safe Harbor): As noted in Section 5, an online service provider that complies with the notice-and-takedown and counter-notice process in good faith shall not be liable for monetary relief for storing, transmitting, or linking to infringing content that was uploaded by a user. In any action under this Act, if the provider can show it has satisfied the conditions of the safe harbor (e.g., had no knowledge of infringement until notice, acted promptly to remove, and followed the counter-notice rules), the plaintiff cannot collect damages or attorney’s fees from the service provider for that content. The provider could still be subject to an injunction to remove the content (which it likely already did) or to remove identical infringing copies if they reappear (for instance, if users try to repost the exact same video, the court can order the platform to keep it down). But beyond such specific injunctive relief, the compliant provider is essentially immune from financial liability for user violations of this Act. Conversely, if a provider fails to comply with the obligations (e.g., refuses to take down notified content, or has a pattern of ignoring known violations), it loses this protection and can be held liable as a contributory or vicarious infringer to the same extent as the primary violator, subject to the damages outlined above.

(d) Relation to Section 230 (No Immunity for NIVL Violations): This Act clarifies that causes of action for violations of individuals’ rights in their name, image, voice, or likeness as established herein are “laws pertaining to intellectual property” and thus not subject to immunity under 47 U.S.C. §230. In other words, an interactive computer service cannot invoke Section 230(c)(1) to dodge liability for hosting or materially contributing to content that violates this Act (just as Section 230 doesn’t apply to copyright or trademark infringements). However, the safe harbor of subsection (c) provides the needed protection for passive hosts that respond to notices. The intent is to hold platforms accountable if they turn a blind eye to clear misuse of personas, without treating them as the guarantors of all user behavior. This is consistent with 47 U.S.C. §230(e)(2) which preserves intellectual property claims from the scope of Section 230 immunity. By explicitly defining the rights here as intellectual property rights, Congress ensures a uniform application: platforms are on notice that they must honor takedown requests for deepfake impersonations or potentially face liability, rather than hiding behind Section 230. (Note: This provision does not impose liability on a platform for user content automatically – it simply removes the special immunity. Plaintiffs would still need to prove the platform’s knowledge or contribution, etc., as required for secondary liability, unless the platform is the direct creator.)

(e) Enforcement by Government Agencies: While the primary enforcement mechanism is a private right of action, the Federal Trade Commission (FTC) is authorized to treat egregious violations of this Act (particularly in commercial contexts, such as false endorsements or deceptive trade practices using deepfakes) as unfair or deceptive acts in commerce, allowing the FTC to take action under its own governing laws. Similarly, state Attorneys General may bring civil actions in federal court on behalf of citizens of their state against parties who engage in a pattern of violations of this Act, seeking injunctive relief or damages on those citizens’ behalf (with any monetary recovery potentially going to affected individuals or to a state victim compensation fund). However, to avoid double recovery, no separate state AG suit should proceed if an individual is already pursuing a suit over the same act (unless, at the court’s discretion, that individual’s suit does not adequately represent state interests).

(f) Statute of Limitations: A civil action under this Act must be brought within 3 years from the date the plaintiff discovered or reasonably should have discovered the unauthorized use (the appearance of the content). Because some deepfakes might not be immediately known to the victim (they could be hidden on the internet), the “discovery rule” applies. However, in no event may an action be brought more than 7 years after the first publication of the offending content, even if discovered later; this provides eventual closure and certainty, akin to a statute of repose. The limitations period is tolled during any time when the content was fraudulently concealed by the defendant or the defendant took measures to conceal the infringer’s identity. Each new, separate unauthorized work (e.g., a new deepfake video) constitutes a new cause of action with its own limitations clock, but repeat distributions of the same work do not reset the clock unless the redistribution rises to the level of a new violation.

(g) Rebuttable Presumption in Commercial Endorsement Contexts: In any civil action under this Act (or under Lanham Act §43(a) if pled in the alternative) for the use of a digital replica in connection with advertising, promotions, or the sale of goods or services, the law shall recognize a rebuttable presumption that such use is likely to cause confusion or deception about sponsorship or endorsement (PADRA Bill.pdf). In practice, this assists the plaintiff in cases where, for example, their likeness was used in an ad without permission – the court can presume consumer confusion (that people would think the person endorsed it). The defendant can rebut by showing, say, a clear disclaimer or parody intent, but absent that, the presumption stands. This principle is borrowed from the PADRA proposal’s amendment to trademark law and is included here to reinforce that unauthorized commercial exploitation of someone’s persona is inherently misleading. This presumption complements the rights in this Act and does not override the need to also consider First Amendment defenses when applicable (for example, an artistic work with incidental product placement would not invoke this in the same way).

Section 7. Preemption and Relationship to State Laws

(a) Federal Floor for NIVL Rights: This Act establishes a minimum, nationwide level of protection for individuals’ rights in their name, image, voice, and likeness against digital replica misuse. It is the intent of Congress to preempt state laws only to the extent that they provide lesser protection or are in direct conflict with this Act. In particular, no state or local law may offer less stringent protections or shorter post-mortem durations than those provided here, as that would undermine the uniform floor of rights. For example, if a state law currently does not recognize any post-mortem publicity rights or only a 5-year term, this federal Act’s 20-year post-mortem right will govern as a minimum in that state.

(b) Preservation of Stronger State Protections: Conversely, nothing in this Act is intended to preempt or limit any State law that provides equal or greater protection to individuals. If a state has an existing right of publicity statute or common law that extends protection beyond what this Act covers (for instance, covering additional attributes like signature or catchphrases, or providing a longer statute of limitations, or offering higher damages, or covering non-digital uses as well), those state provisions remain in force. Similarly, if a state allows post-mortem rights to last longer than 30 years, or covers commercial uses beyond digital replicas (like simple merchandise uses of one’s image), those laws are not overridden – as long as compliance with the state law would not result in a violation of this Act or vice versa. In essence, this Act acts as a floor, not a ceiling. Individuals may still avail themselves of state law remedies for unauthorized use of their likeness, especially in scenarios outside the scope of this Act (e.g., a purely non-digital use, or use of a persona element not covered here). States are free to enforce stricter rules (such as requiring consent for any use of someone’s persona in advertising, even if not a realistic deepfake).

(c) Conflict Preemption: In the event of a direct conflict between this Act and a state law (i.e., it is impossible to comply with both, or the state law stands as an obstacle to the accomplishment of federal objectives), federal law shall prevail under the Supremacy Clause. For example, if a state law purported to allow an exception that federal law does not (and that exception undermines the federal minimum protection), the federal rule takes precedence. On the other hand, if a state law has an extra exception (say, for works of fine art) that is not specifically enumerated in Section 4 but would be covered by the broad First Amendment savings clause anyway, that is not a conflict – it is parallel. Courts should strive to harmonize the application of this Act with state laws, enforcing whichever rule (state or federal) provides the greater protection to the individual’s rights in a given circumstance, except where the federal exemptions or processes (like the notice-and-takedown scheme) are part of the balancing of interests, in which case those should generally govern for a claim under this Act.

(d) No Effect on Other Causes of Action: This Act does not supersede or limit any other cause of action an individual might have under federal or state law for the same conduct, such as: defamation (if the deepfake conveys a defamatory false statement about the person), false light invasion of privacy, intentional infliction of emotional distress, trademark infringement or dilution (if their name or image is used in a way that confuses consumers about a brand or endorsement), rights under state publicity law (if, say, a state law covers a broader range of unauthorized uses), or criminal impersonation or identity theft laws, etc. However, a plaintiff cannot receive double recovery. In practice, this means a person could plead multiple theories (e.g., this Act’s claim alongside defamation and state publicity claims), but if they win on more than one, any damages must be adjusted so they are not paid twice for the same harm. They would typically elect the theory that provides the most complete relief. Courts might, for instance, instruct juries to award damages under one theory or the other but not both for the same underlying harm.

(e) State Law Not Preempted on Procedural Matters: States may continue to enforce laws regarding contractual approval for minors (the requirement of court approval for minors’ contracts, which this Act explicitly nods to) and other procedural safeguards. This Act works in tandem with those – e.g., a license for a minor must get state court approval per our Act, and state procedures for that are acknowledged.

(f) No Preemption of Common Law Tort of Appropriation: Many states recognize a common law or statutory tort for misappropriation of likeness or right of publicity. Those are not preempted except to the extent they would allow recovery where this Act would find the defendant’s conduct exempt (for instance, if a state had an overly broad publicity right and enforced it in a way that punished First Amendment-protected parody, that enforcement would conflict with this Act’s purpose of protecting such speech). But because this Act explicitly sets a floor and includes free speech exemptions, states should likewise interpret their laws consistently with those free speech principles (which they generally do because of the First Amendment’s supremacy).

In summary, this Section ensures that the most protective regime for an individual will apply – whether that is federal or state – and that this federal law raises the baseline nationwide so that no one’s likeness is left unprotected by a lagging state law. It also makes clear that this Act does not wipe out state publicity rights or other IP regimes; indeed, someone could still, for example, sue under California’s publicity statute if that offers something additional, but they will also have this federal claim available. The only laws explicitly preempted are weaker or inconsistent ones that would frustrate the intent of creating strong and balanced protection for digital replicas.

Section 8. Severability

If any provision of this Act, or the application thereof to any person or circumstance, is held to be invalid or unconstitutional, the remainder of the Act and the application of its other provisions to other persons or circumstances shall not be affected. The provisions of this Act are declared to be severable. Congress intends that even if one part of this Act (for example, a particular exemption or a particular remedy) is struck down, the rest shall continue in force to the maximum extent possible. For instance, if the post-mortem term limit is held invalid by a court of competent jurisdiction, that shall not affect the enforcement of the Act’s provisions on living individuals’ rights, and vice versa. If the requirement for court approval of minor licenses is found to conflict with a specific state’s minor contract law, that specific application can be severed or adapted without voiding the entire licensing scheme. The overarching intent is that the core protections for individuals’ name, image, voice, and likeness in the digital realm remain operative even if adjustments must be made to certain details by judicial review.

Conclusion: This Act represents a comprehensive approach to combating the malicious misuse of generative AI to create unauthorized deepfakes, empowering individuals with enforceable rights and tools while preserving creative freedom and technological innovation. It draws on and improves prior legislative proposals (such as the NO FAKES Act of 2025 and PADRA) by integrating their positive elements (clear definitions, licensable rights, and strong exceptions) and addressing their shortcomings (overreach, lack of counter-notice, excessive post-mortem terms). Through careful tailoring, this Act seeks to promote accountability for those who would hijack someone’s identity, protect privacy and human dignity, and uphold First Amendment values in the digital age of AI.

Sources:

· Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act of 2025 (NO-FAKES-Concerns-Letter.pdf) (provisions and critiques thereof).

· Preventing Abuse of Digital Replicas Act (PADRA) (PADRA Bill.pdf) (proposed Lanham Act amendments and exemptions).

· Letter from Electronic Frontier Foundation and others re: concerns on NO FAKES Act (NO-FAKES-Concerns-Letter.pdf).

· Foundation for Individual Rights and Expression (FIRE) commentary on AI bills and the First Amendment (Wave of state-level AI bills raise First Amendment problems _ The Foundation for Individual Rights and Expression.pdf).

· International Trademark Association Board Resolution on Deep Fakes (Feb. 2025) (20250226_INTA-Board-Resolution-Legislation-On-Deep-Fakes-Digital-Replicas.pdf).

· 47 U.S.C. §230(e)(2) (Communications Decency Act §230 IP carve-out).

The estoppel provision of the America Invents Act (AIA), 35 U.S.C. § 315(e)(2), prevents a petitioner in an inter partes review (IPR) proceeding from later raising before the Patent Office, a district court, or the International Trade Commission any invalidity “ground that the petitioner raised or reasonably could have raised” during that IPR.  Estoppel attaches when the Patent Trial and Appeal Board (PTAB) issues a final written decision in the IPR.  Parties and courts have since debated what the term “grounds” means, and recently, in Contour IP Holdings, LLC v. GoPro, Inc., the Northern District of California determined that GoPro was estopped from raising invalidity defenses based on physical prior art cameras when a final written decision had already been issued rejecting GoPro’s IPR invalidity defenses based on the printed manuals for those cameras.

The dispute arose after GoPro successfully initiated IPR proceedings challenging the validity of two of Contour’s point-of-view digital video camera patents.  Contour snapped back by filing an infringement action in district court, and GoPro’s IPRs were ultimately unsuccessful.  In the district court action, GoPro raised invalidity defenses, including obviousness, based on physical Sony, Canon, and Nikon cameras (the cameras were excluded from consideration as prior art in the IPRs because only patents and printed publications qualify as prior art in those proceedings).  Because it was undisputed that the printed manuals for those cameras were available to GoPro during the earlier IPRs (though not used as actual grounds for invalidity), Contour argued that GoPro was estopped under 35 U.S.C. § 315(e)(2) from asserting the cameras as prior art.

The Court agreed with Contour that the cameras were materially identical to the manuals that were available to GoPro during the IPRs.  The Court then turned to whether § 315(e)(2) barred GoPro’s district court invalidity defenses based on the actual cameras.  This determination hinged on the interpretation of the term “grounds” in § 315(e)(2), which bars a petitioner from asserting invalidity on any “ground that the petitioner raised or reasonably could have raised” during an IPR.  The Court acknowledged that there is a split among district courts over the meaning of “grounds.”  Some district courts adopt a broad interpretation, applying estoppel to physical devices described by any patents or printed publications raised in the IPR.  Other districts have adopted a narrower view, limiting “grounds” to the specific patents and publications on which the IPR invalidity grounds were based. 

The Court adopted the broad interpretation of § 315(e)(2), finding that GoPro’s prior art invalidity theories were estopped because GoPro did not rely on any product-specific features of the physical cameras beyond those disclosed in the manuals that were available to GoPro in the IPRs.  In its decision, the Court emphasized that extending estoppel to bar litigants from asserting physical products as prior art that are materially identical to patents or printed publications available in IPR proceedings “better aligns with the purpose of IPR estoppel as it promotes efficient resolution of patent disputes by preventing repetitive challenges based on slightly rebranded evidence.”  The Federal Circuit has not yet weighed in on this precise issue, and it remains to be seen whether GoPro will appeal the Court’s decision and put the issue squarely before the Federal Circuit.  In the meantime, parties engaged in parallel IPR and district court proceedings should be aware of the district court split and familiarize themselves with any controlling precedent in their jurisdictions to understand whether IPR estoppel may extend to physical devices that are described in patents or printed publications that could have been raised in IPR proceedings.  Where a patent challenger finds itself in a jurisdiction adopting the broad interpretation of 35 U.S.C. § 315(e)(2), it is imperative that it identify limitations in the physical device that are not fully described in the printed publication prior art to avoid IPR estoppel.  Conversely, when defending against an invalidity challenge based on a physical device in district court, a patentee might identify printed publications available as of the IPR petition date that disclose the same limitations as the device to leverage estoppel based on a parallel IPR proceeding.

Irwin IP is honored to announce that the firm has been recognized at the 20th Annual Managing IP Awards, held on April 24, 2025, in New York City. The event brought together leading intellectual property professionals from across the Americas to celebrate exceptional achievements in the field. During the ceremony, Irwin IP received the following awards:

Litigator of the Year (Illinois) – Barry Irwin

Barry Irwin was named one of only 12 state-level winners across the United States—and the sole recipient of this honor in Illinois. This award reflects Barry’s exceptional advocacy, litigation strategy, and unwavering commitment to his clients in high-stakes intellectual property matters.

Impact Case of the Year – LKQ Corp. v. GM Global Tech. (Fed. Cir. 2024) (en banc)

The Managing IP Awards recognize impactful intellectual property cases annually. This year’s award recognizes Irwin IP’s landmark victory in LKQ Corp. v. GM Global Tech, argued before the en banc Federal Circuit. On May 21, 2024, the court unanimously agreed with LKQ’s position, overturning over 40 years of precedent and remanding the case for further proceedings.

The ruling replaces the long-standing Rosen-Durling test with a new obviousness framework aligned with Supreme Court precedent, marking a significant shift in design patent law. This decision not only secured a major win for our client, LKQ Corporation, but also sets a new standard for design patent litigation nationwide.

These recognitions highlight Irwin IP’s ongoing commitment to excellence in intellectual property litigation and its significant contributions to the advancement of IP law. Congratulations to Barry Irwin on this well-deserved honor, and sincere appreciation to the entire firm for the innovation, dedication, and expertise that made this achievement possible.

——–
Irwin IP specializes in mission-critical intellectual property and technology litigation, catering to a diverse client base, including Fortune 500 companies and innovative startups.  Our expertise extends to enforcing and protecting intellectual property portfolios, ensuring our clients’ product lines, worth hundreds of millions annually, remain secure.  Notably, we routinely and successfully litigate against the largest, most prestigious law firms representing the largest companies in the world on matters valued in the tens and hundreds of millions of dollars.

Be careful when selecting a name for your product, otherwise you might find yourself cooked at the United States Patent and Trademark Office (“PTO”)!  Enlisting an expert trademark attorney to oversee your trademark application, especially when there is a similar mark in a related field that is already registered, will increase the chances that your trademark will be granted registration.  The Federal Circuit (“CAFC”) recently affirmed the Trademark Trial and Appeal Board’s (“TTAB’s”) rejection of the mark “Chicken Scratch” (the “proposed mark”) for beer because the proposed mark resembled an identical, already-registered “Chicken Scratch” mark (the “cited mark”) for restaurant services.  Specifically, the CAFC upheld the TTAB’s determinations that the cited mark was not conceptually weak, that the proposed and cited marks conveyed similar commercial impressions, and that there was sufficient evidence to establish that beer and restaurant services are related.

In 2018, R. S. Lipman Brewing Company, LLC (“Lipman”) sought to register the mark “Chicken Scratch” for beer with the PTO.  An examining attorney at the PTO refused registration because the proposed mark was likely to be confused with the cited mark for restaurant services based on the similarity of the marks, “the related nature of beer and restaurant services, and the overlap of the relevant trade channels.”  The TTAB affirmed the examining attorney’s refusal.  Lipman appealed to the CAFC, arguing that the cited mark was conceptually weak, that the marks did not impart similar commercial impressions, and that the goods and services were not sufficiently related to support a finding that consumers would likely confuse the marks.

First, the CAFC held that the TTAB correctly determined that the cited mark was suggestive and not conceptually weak.  The TTAB properly considered the cited mark’s website indicating that its “chicken [was] made from scratch,” evidence that the proposed mark could coexist with a similar mark for distilled spirits, and dictionary definitions indicating that “chicken scratch” means “bad handwriting,” and permissibly concluded that the cited mark was entitled to a normal scope of protection.

Second, the CAFC held that there was sufficient evidence for the TTAB to find that the proposed and cited marks impart “similar commercial impressions.”  Lipman argued that the two marks must convey different commercial impressions because the proposed mark is shown with a “chicken pecking at the ground,” suggesting that the same chicken feed ingredients are used to brew Lipman’s beer, while the cited mark is used on a restaurant menu to feature chicken dishes.  The CAFC held that Lipman provided no evidence that consumers would perceive the marks as conveying different commercial impressions.  Additionally, the marks were identical in appearance and sound, which weighed in favor of finding a likelihood of confusion.

Finally, the CAFC rejected Lipman’s argument that, under the CAFC’s previous decision in In re Coors, which allowed a beer trademark to be registered alongside a similar mark for restaurant services, the TTAB needed “something more” than the fact that the marks are identical to find the goods and services related.  The CAFC held that, in contrast to the applicant in In re Coors, Lipman failed to provide evidence comparing the number of breweries providing restaurant services to the total number of United States restaurants.

In light of this decision, businesses should be careful when choosing a new mark for their goods and services.  It is important to first conduct a trademark search to determine whether there are any similar marks in related fields that could cause confusion in the minds of consumers.  Hiring a skilled trademark attorney to handle your trademark prosecution and to distinguish your mark from any registered marks will likely save your business the cost of having to rebrand later.  Lipman, for example, after seven years of attempting to register its mark, must now either rebrand or risk being sued for infringement.

In McGucken v. Valnet, the Supreme Court is being asked to reevaluate a controversial copyright rule known as the “Server Test,” which determines liability based on how an image is displayed online—not whether it actually appears on your screen. Dr. Elliot McGucken claims that embedding his Instagram photos without permission still violates his rights, even if the images weren’t stored on the defendant’s server. With big implications for photographers, publishers, and the future of online content, this case could shake up how we think about copyright in the digital age.

Download the full article to unpack the arguments, policy debates, and what’s at stake if the Court takes the case.

The Federal Circuit recently opined on whether a stipulation in litigation can overcome a disclaimer made during the prosecution history of a patent.  The Hatch-Waxman Act allows generic drug companies to rely on the clinical results of brand-name drugs in the FDA approval process; in exchange, brand-name drug companies receive a short-term monopoly before the FDA approves any generics.  Alkem filed an Abbreviated New Drug Application (“ANDA”) with the FDA, and Azurity alleged that the ANDA described a generic drug that infringed its patent, U.S. Patent No. 10,959,948 (the “’948 Patent”).  Alkem argued there was no infringement because its drug contained propylene glycol, an ingredient Azurity disclaimed during prosecution.  The District Court ruled in favor of Alkem, and the Federal Circuit affirmed.

The ’948 Patent claims a liquid formulation of vancomycin (an oral antibiotic) “consisting of” a number of ingredients, including a “flavoring agent.”  The ’948 Patent is a continuation of U.S. Patent Application No. 15/126,059 (the “’059 Application”), which disclosed a similar liquid formulation of vancomycin.  Although the ’948 Patent issued without rejection, the District Court and Federal Circuit relied heavily on the prosecution history of the ’059 Application, which applies to the ’948 Patent as a continuation.  During that prosecution, the application was rejected four times over the prior art reference Palepu, which discloses a liquid formulation of vancomycin for intravenous use and, in relevant part, includes “a polar solvent comprising propylene glycol….”  Azurity’s amendments and statements during prosecution of the ’059 Application included remarks that “[t]he absence of propylene glycol and polyethylene glycol in the claimed invention, in part, distinguish it from [Palepu].”  Notably, Azurity replaced “comprising” with “consisting of” in the preamble of the independent claims, limiting the scope of the claims to a liquid formulation having only the listed ingredients and nothing else.

Azurity argued on appeal that even if there was a disclaimer, it did not apply to the “flavoring agent” limitations of the ’059 Application and ’948 Patent, only to the “carrier” claims.  The Court disagreed and noted that “Azurity tried multiple routes to satisfy the examiner that unlike Palepu, its claimed invention lacked propylene glycol….  Azurity acquiesced by abandoning the ‘carrier’ distinction and adopting the ‘consisting of’ transition.”  Thus, the claims were allowed only because they excluded propylene glycol.  Azurity also argued that a pretrial stipulation with Alkem stating “[s]uitable flavoring agents for use in the Asserted Claims include flavoring agents with or without propylene glycol” meant flavoring agents with propylene glycol could infringe the asserted claims, and that Alkem’s stipulation surrendered any argument that propylene glycol was disclaimed.  Alkem argued that the stipulated fact did not waive any argument; it merely confirmed a flavoring agent as known in the industry need not have propylene glycol.  The Court agreed that Alkem’s interpretation was consistent with the rest of the case whereas Azurity’s interpretation conflicted with Alkem’s noninfringement arguments and the structure of other stipulated facts. 

The Federal Circuit noted the importance and binding nature of statements made during prosecution: holding a party to its statements protects the public and promotes the notice function of patents.  Thus, patentees should be wary of sweeping statements made during prosecution when distinguishing over prior art, and must recognize that they cannot stipulate their way around such positions in the event they become engaged in adverse proceedings once their patent issues. 

Navigating intellectual property issues as an entrepreneur or startup can be complex, and understanding your rights, obligations, and protections can make all the difference. We invite you to join us in our office, on May 9th from 10 AM to 12 PM, for our Spring Entrepreneurs’ Workshop where Irwin IP attorneys will explain the ins and outs of patents, trademarks, copyrights, and trade secrets—and discuss how each can add value to your business.

After the presentations, enjoy a complimentary lunch while connecting with fellow entrepreneurs. Attendees will also have the opportunity for one-on-one discussions with our attorneys to ask questions related to the topics covered.

Register for free today!