Scarlett Johansson’s row with OpenAI reminds us identity is a slippery yet important subject. AI puts everyone’s identity at risk

OpenAI will be removing access to one of its ChatGPT voices, following objections by actor Scarlett Johansson that it sounds “eerily similar” to her own.

Earlier this week, the company said it was “working to pause” the voice of Sky, which is one of a few options users can choose when conversing with the app.

Johansson said OpenAI’s CEO, Sam Altman, had approached her in September and again in May, asking if she would allow her voice to be used in the system.

She declined, only to hear a voice assistant that sounded uncannily like her just days after the second request. She was, in her own words, “shocked, angered and in disbelief”. OpenAI replied by saying:

AI voices should not deliberately mimic a celebrity’s distinctive voice – Sky’s voice is not an imitation of Scarlett Johansson but belongs to a different professional actress using her own natural speaking voice.

Johansson is known to have voiced a fictional AI in the past, in the 2013 film Her – which Altman has declared himself a fan of. He also recently tweeted the word “her” without much further explanation.

Johansson said she had to hire legal counsel to demand the removal of Sky’s voice and information on how the company created it.

This dispute provides a timely warning of the identity harms AI enables – harms that could reach any of us at any time.

Identity is a slippery subject

Artificial intelligence is developing at an incredible pace, with OpenAI’s ChatGPT being a game-changer. It’s very likely AI assistants will soon be able to meaningfully converse with users, and even form all sorts of “relationships” with them. This may be why Johansson is concerned.

One thing has become unassailably clear: we can’t out-legislate AI. Instead, we need a right to identity, and with it a right to request the removal or deletion of content that causes identity harm.

But what exactly is “identity”? It’s a complex idea. Our identity may say nothing about our specific personal traits or qualities, yet it is fundamental to who we are: something we build through a lifetime’s worth of choices.

But it’s also about more than how we see ourselves, as celebrities demonstrate. It’s linked to our image. It is collaborative – cultivated and shaped by how others see us. And in this way it can be tied to our personal traits, such as our voice, facial features, or the way we dress.

Minor attacks against our identity may have limited impacts, but they can add up like death by a thousand cuts.

Legal defences

As AI democratises access to technologies that can manipulate images, audio and video, our identities are becoming increasingly vulnerable to harms not captured by legal protections.

In Australia, school students are already using generative AI to create sexually explicit deepfakes to bully other students.

But unlike deepfakes, most identity harms won’t breach criminal law or draw the ire of the eSafety Commissioner. Most legal avenues afford ill-fitted and piecemeal remedies that can’t heal the damage done. And in many Western democracies, these remedies require legal action that’s more expensive than most can afford.

AI can be used to manipulate or create content that shows “you” doing things you haven’t done (or would never do). It could make you appear less competent, or otherwise undermine your reputation.

It could, for example, make you appear drunk in a professional setting, as with former US House Speaker Nancy Pelosi. It could show “you” vaping, when doing so would disqualify you from your sports team, or place you inside a pornographic deepfake video.

Australian law lags behind

The United States Congress recently proposed an actionable right to privacy. But even without this, US protections exceed those offered in Australia.

In the US, privacy is defended through a combination of legal claims, including the publication of private facts, presenting a subject in a false light, or the misappropriation of likeness (as in, co-opting some part of another’s identity and using it for your own purposes).

Based on the limited facts available, US case law suggests Johansson could succeed in an action for misappropriation of likeness.

One pivotal case from 1988 featured American singer Bette Midler and the Ford Motor Company. Ford wanted to feature the singer’s voice in an ad campaign. When Midler declined, Ford hired a “sound alike” to sing one of Midler’s most famous songs in a way that sounded “as much as possible” like her.

Midler won, with one court likening Ford’s conduct to that of “the average thief” who simply takes what they can’t buy.

Australian public figures have no equivalent action. In Australia and the UK, the law will intervene where one party seeks to profit by passing off lesser quality look-alikes or sound-alikes as “the real thing”. But this applies only if consumers are misled or if the original suffers a loss.

Misrepresentation might also apply, but only where consumers believe a connection or endorsement exists.

Australia needs a rights-based approach akin to that in the European Union, which has a very specific goal: dignity.

Identity or “personality” rights empower those affected and impose an obligation on those publishing digital content. Subjects may receive damages or may seek injunctions to limit the display or distribution of material that undermines their dignity, privacy or self-determination.

Johansson herself has successfully sued a writer in France on the basis of these protections (although this win was ultimately more symbolic than lucrative).

With AI, it’s now child’s play to impersonate another’s identity. Identity rights are immensely important. Even where these rights co-exist with free speech protections, their very presence enables people to protect their image, name and privacy.

Elizabeth Englezos, Lecturer, Griffith Law School, Griffith University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
