Can You Trademark Your Voice Against AI? What Celebrity Moves Mean for Deepfakes (UK and US Law)

Learn if you can trademark your voice against AI in the UK and US, and what celebrity moves mean for deepfake law.

Written By

Joshua
Reading time
» 6 minute read 🤓
Matthew McConaughey “trademarks himself to prevent AI cloning” – what that really means for voice deepfakes in the UK and US

A Reddit post, sharing a LinkedIn headline, claims that Matthew McConaughey is moving to trademark his voice and likeness to deter AI cloning and deepfakes. Details beyond the headline are not disclosed, so treat this as an early signal rather than a confirmed legal strategy.

“McConaughey trademarks himself to prevent AI cloning.”

Even so, the idea raises a timely question: can you trademark a voice to stop AI deepfakes? And if you can, would it actually work? Here’s a clear look at trade marks, voice cloning, and the wider legal toolkit in both the UK and US, with a UK focus.

Can you trademark a voice? UK and US basics

A trade mark (UK spelling) identifies the source of goods or services. It’s typically a name, logo, or slogan used in commerce. But “non-traditional” marks are possible too, including sound marks (e.g. jingles). Both the UK Intellectual Property Office (UK IPO) and the US Patent and Trademark Office (USPTO) allow sound marks in principle.

Key points:

  • Trade marks must function as indicators of origin in specific classes of goods/services, not as a general ban on use.
  • A “voice” could be protected if it is a distinctive sound mark used commercially (think a signature catchphrase or unique vocal identifier tied to a brand).
  • Registration is class-specific and territorial: a UK trade mark protects in the UK; a US mark protects in the US.

Useful resources: UK IPO – trade mark basics, USPTO – trademark basics, and the USPTO’s TMEP on sound marks.

What trade marks can and cannot do against deepfakes

What a trade mark might help with

  • Stopping commercial use that causes confusion: if someone uses a celebrity’s name, likeness, or signature sound to sell a product or service, a registered mark can support a claim of infringement.
  • Platform takedowns: a registered mark can simplify notice-and-takedown on marketplaces and social platforms for unauthorised ads or endorsements.

What a trade mark won’t solve

  • Non-commercial deepfakes: many deepfakes circulate as non-commercial content. Trade mark law is primarily about confusion in commerce, not a blanket right to control identity.
  • Lookalike/soundalike content that avoids confusion: unless a use suggests endorsement or origin, trade mark claims may be weak.
  • Jurisdictional limits: a UK mark won’t automatically stop US-hosted content, and vice versa.

Bottom line: trade marks are a useful part of the toolkit, especially for endorsers and advertisers. They are not a complete solution to voice cloning or identity misuse.

Beyond trade marks – UK and US tools for identity and voice

Identity/likeness protection
  • UK: No standalone “right of publicity”. Common claims include passing off (misrepresentation of endorsement), defamation, malicious falsehood, and privacy-related claims.
  • US: Many states recognise a right of publicity (controlling commercial use of name, likeness, or voice). Scope varies by state.

False endorsement
  • UK: Passing off requires goodwill, misrepresentation, and damage. Useful when a deepfake suggests endorsement.
  • US: The Lanham Act (false endorsement/advertising) can apply if a deepfake implies sponsorship or approval.

Copyright
  • UK: Copyright protects recordings and performances, not the voice “itself”. Unauthorised use of copyrighted recordings can be actionable.
  • US: Similar: copyright subsists in recordings and performances; voice timbre itself is not copyrightable.

Data protection and biometrics
  • UK: Voice data can be personal data and potentially biometric, so processing may trigger UK GDPR and data protection law. See the ICO’s biometric data guidance.
  • US: Privacy and data protection are state- and sector-specific. There is no federal GDPR equivalent, and consent laws vary by state.

Platform obligations
  • UK: An evolving regime under the Online Safety Act plus platform policies, though specifics on deepfakes are still developing.
  • US: Platform policies and state laws increasingly address synthetic media; enforcement varies.

Practically, UK claimants often combine passing off (to tackle false endorsement), copyright (if recordings are used), and data protection arguments where voice data has been processed without a lawful basis.

Implications for UK readers: creators, brands, and developers

For creators and public figures

  • Audit your brand assets: names, logos, signatures, catchphrases, and distinctive sounds used in commerce. Consider registering trade marks in key classes and territories.
  • Contract for synthetic use: include clear clauses forbidding training, cloning, or synthesis of your voice/likeness without explicit written consent and separate fees.
  • Protect recordings: maintain clean rights in your audio/video, including performer’s rights and licensing terms that restrict generative use.
  • Monitoring and takedowns: set up search alerts, use platform tools, and prepare standard notices for trade mark, copyright, and passing off complaints.

For UK businesses using AI voices

  • Do not clone a living person’s voice without unequivocal consent, especially for ads or endorsements.
  • Use licensed synthetic voices with clear terms covering training, cloning, and redistribution. Keep documentation.
  • Mitigate confusion: disclosures such as “AI-generated voice actor” reduce false endorsement risk.
  • Data protection: if you process voice data, document a lawful basis, minimisation, and retention. Biometric data may require heightened safeguards.

For AI developers and platforms

  • Safety rails: block prompts and outputs that target a real person’s name/likeness/voice. Provide user attestations and audit logs.
  • Watermarking and provenance: adopt media provenance standards where practical; support rapid complaint workflows for impersonation.
  • Jurisdiction-aware enforcement: align policies to UK passing off and US publicity laws to handle cross-border content.
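To make the first of those bullets concrete, here is a minimal sketch of a “safety rail” that screens voice-synthesis requests against a denylist of protected identities and writes an audit-log entry for each decision. Everything here is a hypothetical illustration: the class names, the denylist, and the exact-match check are invented for this example, and a real system would need far more robust identity matching, consent records, and escalation paths.

```python
# Hypothetical sketch: screen text-to-speech requests against a denylist
# of protected identities, keeping an audit log of every decision.
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative denylist; a production system would maintain and match
# identities far more carefully than a lowercase string lookup.
PROTECTED_IDENTITIES = {"matthew mcconaughey", "example celebrity"}

@dataclass
class AuditEntry:
    timestamp: str        # UTC, ISO 8601
    requested_voice: str  # the voice the caller asked for
    allowed: bool         # whether the request passed the screen

@dataclass
class VoiceRequestScreen:
    audit_log: list = field(default_factory=list)

    def check(self, requested_voice: str) -> bool:
        """Return True if the request may proceed, False if blocked."""
        allowed = requested_voice.strip().lower() not in PROTECTED_IDENTITIES
        self.audit_log.append(AuditEntry(
            timestamp=datetime.now(timezone.utc).isoformat(),
            requested_voice=requested_voice,
            allowed=allowed,
        ))
        return allowed

screen = VoiceRequestScreen()
print(screen.check("Matthew McConaughey"))  # False – blocked
print(screen.check("Licensed Voice #7"))    # True – allowed
```

The audit log matters as much as the block itself: if a dispute arises, the platform can show when a request was made, what it asked for, and what the system decided.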

Will more celebrities follow, and will it work?

If the reported move signals a trend, expect more registrations of names, taglines, and signature sounds. That will strengthen enforcement against commercial deepfakes and misleading endorsements, especially on ad platforms and marketplaces. But a pure trade mark strategy is unlikely to deter non-commercial deepfakes at scale.

The likely future is a layered approach: trade marks for commercial misuse, passing off/false endorsement for implied sponsorship, copyright for unauthorised recordings, and platform policy plus provenance tools for speed. For UK readers, that means planning across legal, technical, and operational lines now.

None of the above is legal advice. If you’re facing a deepfake or planning a trademark strategy, speak to an IP lawyer experienced in AI and media.

Last Updated

January 18, 2026
