Voice actor Cissy Jones started to notice something troubling around 2021: her voice, particularly from her work on the Disney series The Owl House and a few video games, was being used on various websites without her consent or compensation.
“I had no idea that was an option,” she says. “And it really freaked me out. It really freaked me out.”
In the last couple of years, the use of artificial intelligence to replicate actors’ voices has become even more frequent – and gotten much more attention, particularly as it became a major issue in both of the recent Hollywood writers’ and actors’ strikes. But it was back in 2021 that Jones, whose voice has appeared in games including Life Is Strange, Starfield, and Baldur’s Gate 3, really started to think about the issue, knowing that voice actors couldn’t avoid the evolving technology entirely.
“The genie is out of the bottle,” she tells IGN. “You can't stuff it back in. How do we make sure that we are part of the conversation so that we don't get completely shut out?”
With this in mind, Jones started working with other voice actors like Tim Friedlander and the National Association of Voice Actors (NAVA) to create a framework for voice actors to actually be involved in the ways their voices could be used with AI technology. It eventually led to a waiver posted on NAVA’s website, which any voice actor can use for free, and later, Morpheme.ai.
Jones is a co-founder and vice president of strategic partnerships at Morpheme, whose goal, as she describes, is the “ethical coexistence of voiceover and artificial intelligence.” The process, essentially, is to create a “digital double” of an actor with their consent.
First, they schedule a recording session with an actor to capture all-new data – this, Jones says, is unique to Morpheme, and ensures the data won’t become the subject of legal disputes down the line.
The second part comes in when a client wants to use an actor’s digital double. Here, the actor is informed of the project, and then gets to choose whether or not they’d like to be part of it. Then, when a client utilizes the voice, they pay a generation fee, part of which goes to the actor.
It’s all part of what Morpheme adviser Scott Mortman calls the three Cs: “compensation, consent – which includes control – and then clarity or transparency.”
“There's a paradigm shift that's been going on in a very short period of time with AI, which is that originally you had all of these AI companies coming out that were using – they're basically scraping the internet and developing synthetic voices that couldn't be authenticated,” Mortman tells IGN. “And what you're now seeing are actors and authors and voice actors specifically that are saying, ‘Wait a minute, when you created this voice, you took part of my product, my livelihood, used it without my consent, without compensating me.’”
In addition to the actor knowing how their voice is being used and having more control over it, it’s important for clients to know they won’t be sued by a voice actor or company over an AI model illegally scraping a voice. Jones says that, internally, they’ve joked about it essentially being “grass-fed voiceover” – the clients know where that voice came from, and that the actor has signed off on its use.
Getting Past the Fear
Morpheme still has a few steps to go before it officially launches, however, and working with this technology in this landscape isn’t always easy. Animators have voiced their concerns over companies using AI technology to replace and/or steal their work; SAG-AFTRA made waves when revealing studios’ pre-strike proposal to use background actors’ likenesses indefinitely; and the WGA, like SAG-AFTRA, made protections against AI a key concern in their contract negotiations.
Understandably, there’s a ton of trepidation, especially among creatives.
“Everybody's initial reaction is fear,” Jones admits.
“So we have worked long and hard internally at Morpheme to make sure that we have covered everything, that we've really thought about how to make the actors feel comfortable, how to make sure that they know we're not just out to take their voice and make a buck,” she adds. “So when I have a chance to sit down and talk with people and talk with them about it, and they get past the initial fear, it's actually been overwhelmingly positive. We haven't had a single person turn us down.”
Mortman notes that their goal at official launch is to open with a library of 50 voice actors. They’re not in a position to disclose those voice actors yet, but he says they should accomplish that goal “shortly.”
Morpheme is also working with SAG-AFTRA on a union-approved contract regarding the use of AI – but it’s not an overnight process, Jones says, especially as the guild was negotiating through a strike until very recently.
Still, she notes that they’ve been working with the union “very closely,” implementing its feedback in an admittedly lengthy process. Mortman, who’s been directly involved with some of the discussions with SAG-AFTRA, adds that there’s an “educational element to this” as well.
“One of the reasons that SAG-AFTRA went on strike is concern among its membership over AI issues,” he says. “And those concerns are legitimate concerns, but there's also a huge benefit that AI provides that sometimes is overlooked. And we're working with SAG to communicate that benefit to its actors, which is the ability to enter markets using AI that actors do not have readily available to them now.”
“But again,” he goes on, “it comes down to the three Cs, which is you have to ensure for the actors to be willing to become a part of this market, you have to ensure they get compensated, you have to ensure they consent, and there has to be clarity.”
The whole situation surrounding AI, Jones notes, is made more precarious by the fact that legislation is “woefully behind the technology.” That’s part of the reason why Jones and others aren’t running away from the technology, but learning about it and seeing how it can work for them.
“People are just like, ‘Wow, you actually have coherent statements and you're getting out there. You're talking to lawmakers, you're making statements left, right, and center about how to do this the ethical way,’” she says. “People are excited to see that because nobody has done it yet.”
Thumbnail credit: Getty Images
Alex Stedman is a Senior News Editor with IGN, overseeing entertainment reporting. When she's not writing or editing, you can find her reading fantasy novels or playing Dungeons & Dragons.