Several states have enacted legislation addressing concerns about the misuse of biometric data, but more voice-specific protections are still needed to tackle unregulated AI-assisted voice cloning.
Voice cloning has emerged as a significant threat, raising concerns over privacy, security, and authenticity. In a Bloomberg Law report, Frost Brown Todd's William Morriss, an experienced Intellectual Property attorney, details how state privacy laws can serve as a crucial line of defense against the unauthorized use of voice cloning, providing legal frameworks to protect individuals and organizations from this sophisticated form of deception.
Voice cloning technology, which uses AI to create realistic replicas of a person's voice, has seen tremendous growth in recent years. According to a report by Grand View Research, the global AI voice cloning market is expected to expand significantly, driven by advancements in machine learning and the increasing demand for voice-based applications.
The Grand View Research data indicates that the global AI voice cloning market is projected to grow from $1.9 billion in 2023 to $9.8 billion by 2030, a compound annual growth rate of 26.1%.
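As a rough sanity check on those reported figures (the dollar amounts and growth rate are Grand View Research's; the arithmetic below is only an illustrative back-of-the-envelope calculation), compounding $1.9 billion at 26.1% annually over the seven years from 2023 to 2030 gives 1.9 × 1.261^7 ≈ $9.6 billion, in line with the $9.8 billion projection once rounding of the reported rate is accounted for.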
The Dangers of Unregulated Voice Cloning
While voice cloning can be used for positive applications such as aiding speech-impaired individuals or enhancing virtual assistants, it also poses severe risks when misused. Unregulated voice cloning can lead to identity theft, fraud, and other malicious activities.
Voice Cloning Scams
One of the most concerning aspects of voice cloning is its use in scams. Tech.co reports that AI voice cloning scams are on the rise, with criminals using cloned voices to deceive victims into transferring money or divulging sensitive information. According to a 2023 McAfee study of 7,000 people, one in four respondents said that they had experienced an AI voice cloning scam or knew someone who had.
These scams exploit the trust people place in the authenticity of a familiar voice, making them particularly effective and dangerous.
Inability to Detect Deepfake Audios
A study conducted by University College London found that humans are not always adept at recognizing deepfake speech: participants correctly identified deepfake audio only 73% of the time. This difficulty in distinguishing real voices from cloned ones exacerbates the threat posed by unauthorized voice cloning.
How State Privacy Laws Can Help
According to Morriss, state privacy laws are emerging as a vital tool in combating the misuse of voice cloning technology. Even where they do not target voice cloning specifically, these laws provide a framework for protecting individuals' biometric data, which includes voice prints, and establish penalties for unauthorized use.
Key Provisions
Biometric Data Protection: Many state privacy laws, such as the Illinois Biometric Information Privacy Act (BIPA), specifically address the protection of biometric data, including voice prints. These laws require companies to obtain explicit consent from individuals before collecting or using their biometric data. This consent-based approach ensures that individuals have control over their voice data and can prevent unauthorized cloning.
Transparency and Accountability: State privacy laws often mandate transparency in how companies collect, store, and use biometric data. Companies must provide clear information about their data practices and are held accountable for breaches or misuse. This transparency helps build trust and ensures that individuals are aware of how their voice data is being used.
Right to Sue: Some state privacy laws grant individuals the right to sue companies for violations. This legal recourse acts as a deterrent against unauthorized use of voice cloning technology. Companies are more likely to comply with regulations if they face significant legal and financial repercussions for non-compliance.
Relevant Legislation
Illinois Biometric Information Privacy Act (BIPA): Morriss examines BIPA in detail, explaining the two key principles that would be relevant to any attempt to use biometric protection to stop voice cloning:
Using AI to generate a voice with the same characteristics as the data subject's is actionable only if those characteristics are specific enough to identify the data subject.
The way an AI system processes information will determine if it uses a protected biometric identifier. Deciding if a voice cloning system uses data unique enough to qualify as a protected voiceprint would be left to a jury.
California Biometric Privacy Law SB 1189: Introduced in 2022 to complement the California Consumer Privacy Act, SB 1189 would significantly increase biometric privacy protection for consumers, prohibiting any private entity from selling, leasing, trading, using for advertising purposes, or otherwise profiting from a person’s biometric information, including their voice.
Washington’s My Health My Data Act: Introduced in 2023, the Act explicitly defines consumer health data as including biometric data and prohibits collecting that data without the data subject’s consent, unless doing so is necessary to provide a requested product or service.
Tennessee’s Ensuring Likeness, Voice, and Image Security Act: Enacted in March 2024, the ELVIS Act codifies a right to control the commercial exploitation of one’s own voice if it is recognizable and attributable to a particular person, allowing recording artists and others to file lawsuits based on unauthorized use of their voices.
Future Directions
By providing specific regulations on the collection and use of biometric data, these laws offer essential protections for individuals and organizations. As voice cloning technology continues to evolve, the implementation and enforcement of these privacy laws will be vital in safeguarding against the potential harms associated with this powerful AI capability.