New Bill Aims to Regulate AI Cloning of Voices and Likenesses
A bipartisan group of U.S. House lawmakers unveiled a new bill on Wednesday (Jan. 10) that seeks to regulate the use of artificial intelligence (AI) for cloning voices and likenesses. The legislation, called the No Artificial Intelligence Fake Replicas And Unauthorized Duplications Act of 2024 (“No AI FRAUD” Act), aims to establish a federal framework to protect individuals’ voices and likenesses while also outlining First Amendment protections.
The proposed bill comes in response to the increasing prevalence of AI voice synthesis technology, which presents both opportunities and challenges for recording artists. While some view the technology as an innovative marketing tool or a way to engage fans, it also opens the door to impersonations that may confuse or deceive the public. The legislation therefore aims to safeguard artists from unauthorized use of their voice, image, or likeness.
Protecting Artists: The No AI FRAUD Act Explained
The No AI FRAUD Act was introduced by Rep. María Elvira Salazar (R-FL), with support from Reps. Madeleine Dean (D-PA), Nathaniel Moran (R-TX), Joe Morelle (D-NY), and Rob Wittman (R-VA). The bill draws inspiration from the Senate discussion draft Nurture Originals, Foster Art, and Keep Entertainment Safe Act (“NO FAKES” Act), which was announced last October.
Rep. Salazar emphasized the importance of addressing AI misuse, stating, “It’s time for bad actors using AI to face the music.” She believes that the bill will empower artists and U.S. citizens to protect their rights, creative work, and individuality online.
Although artists’ rights of publicity currently offer some protection against unauthorized use of their voice or likeness, these rights vary from state to state. The No AI FRAUD Act seeks to establish a standardized baseline of protection, ensuring that artists are shielded from exploitation nationwide. Residents of states with even stronger right-of-publicity laws could still rely on those state protections, which can be easier to enforce through legal channels.
State and Federal Legislation Address AI Voice and Likeness Cloning
The introduction of the No AI FRAUD Act is part of a broader movement toward state and federal legislation concerning AI voice and likeness cloning. In addition to this federal bill, Governor Bill Lee of Tennessee is expected to announce separate state legislation on the same issue. Gov. Lee has previously noted the importance of legal protections for artists and songwriters as technology evolves. These legislative actions reflect the growing recognition of the need for comprehensive safeguards in the AI domain.
The No AI FRAUD Act: A Step Towards Ethical AI
The Recording Industry Association of America (RIAA) strongly supports the No AI FRAUD Act. RIAA chairman Mitch Glazier lauded the bill as a “meaningful step towards building a safe, responsible and ethical AI ecosystem.” He acknowledged the value of AI in enhancing creativity but emphasized the need for guardrails to protect individual rights and preserve the integrity of generative AI.
The bill aligns with calls from music industry organizations, including Sony, ASCAP, and Universal Music Group, which have been advocating for regulations to address the misuse of AI. Recent incidents, such as the viral fake-Drake song “Heart on My Sleeve,” have highlighted the potential harm caused by deepfakes and unauthorized use of artists’ voices.
Industry Leaders Applaud the No AI FRAUD Act for Protecting Artists
The No AI FRAUD Act has garnered support from industry leaders, underscoring the importance of protecting artists’ rights. Lucian Grainge, chairman and CEO of Universal Music Group, praised the bill for its commitment to preventing the theft of someone’s image, likeness, or voice. Universal Music Group, while supportive of AI advancements, insists that authorization is essential for any AI use involving an artist’s identity.
With the support of industry leaders and lawmakers, the No AI FRAUD Act represents a significant step towards maintaining ethical standards in the use of AI and protecting artists’ creative rights. By establishing federal guidelines and upholding First Amendment protections, this legislation aims to create a safer and more responsible AI ecosystem.
Analyst comment
Positive news. The No AI FRAUD Act aims to regulate AI cloning of voices and likenesses, protecting artists from unauthorized use of their voice, image, or likeness. The bipartisan bill has support from lawmakers and industry leaders and aligns with the music industry’s calls for regulation. If enacted, it would establish federal guidelines while upholding First Amendment protections, marking a significant step toward maintaining ethical standards in AI use and creating a safer AI ecosystem.