CAA talent agency partners with AI company Veritone to help celebrities secure their own AI likenesses


CAA is working to keep its talent from having their AI likenesses used without their consent.

The Creative Artists Agency, one of the most elite sports and entertainment talent agencies in Hollywood, is on the front lines of providing AI protection services to its top talent, according to TechCrunch.

The development comes after many famous people have had their digital likenesses used without legal permission. CAA has reportedly built a virtual media storage system for its A-list talent that would protect a client's name, image, digital scans, voice recordings, and other assets.


CAA has teamed up with AI tech company Veritone to create what is known as the "CAAvault," where actors record their bodies, faces, movements, and voices using scanning technology to generate AI clones.

Deadline Hollywood reported that Alexandra Shannon, CAA's head of strategic development, said: "Ethics-led and talent-friendly applications of emerging technologies, like AI, are a top priority for CAA, driving us to innovate new ways to support and protect artists."

“The launch of the CAAvault is one such innovation. By partnering with a trusted organization like Veritone, we can maintain the security of the artist’s assets, while working to ensure that AI is responsibly integrated into opportunities across the entertainment landscape," Shannon added.

The announcement comes after a slew of AI deepfakes of celebrities surfaced online, generated without the celebrities' consent. One of the more recent incidents involved Tom Hanks, whose AI-generated likeness was used to promote a dental plan without his permission.

"Over the last couple of years or so, there has been a vast misuse of our clients’ names, images, likenesses, and voices without consent, without credit, without proper compensation," Shannon said.

"It’s very clear that the law is not currently set up to be able to protect them, and so we see many open lawsuits out there right now."

TechCrunch reported that a substantial amount of personal information is required to create digital clones of people, raising questions about the potential misuse of sensitive information and whether it could pose safety risks to those being cloned.

The "CAAvault" can only be accessed by those authorized to do so, allowing those involved to legally share and monetize their content as they choose.

“This is giving the ability to start setting precedents for what consent-based use of AI looks like,” Shannon said.

“Frankly, our view has been that the law is going to take time to catch up, and so by the talent creating and owning their digital likeness with [the CAAvault] ... there is now a legitimate way for companies to work with one of our clients. If a third party chooses not to work with them in the right way, it’s much easier for legal cases to show there was an infringement of their rights and help protect clients over time.”
