23 February 2023 | Features | Copyright | Sarah Speight

Face off: Deepfakes, celebrity ‘appearances’ and the law

Technology has advanced considerably since the late Steve McQueen ‘appeared’ posthumously in a Ford Puma ad in 2004, his face digitally superimposed onto the body of another actor.

Fast forward to October 2021, and Bruce Willis’ deepfake ‘appearance’ in a Russian telecoms advert—shortly before his retirement—is a more sophisticated transposition. Instead of superimposing old footage (in McQueen’s case, from the film Bullitt), it uses artificial intelligence (AI) to create an entirely new work from Willis’ likeness.

The use of AI has exploded in recent years, with applications ranging from automated audiobooks and voice assistants to deepfake videos and text-to-speech tools.

Deepfake—a portmanteau of “deep learning” and “fake”—uses machine learning to recreate a person’s likeness in video and other digital media.

The tech is ripe for abuse, known perhaps most notoriously for its use in pornography, but also for fake news and bank fraud.

And it has the capacity to damage reputations. In January this year, British start-up ElevenLabs vowed to introduce safeguards after deepfake audio recordings of the actress Emma Watson reading Adolf Hitler’s Mein Kampf, and the broadcaster Sir David Attenborough making racist remarks, were released.

But deepfake has its positive uses, from entertainment to education to medicine. For example, it was used in a video featuring David Beckham speaking in nine languages to raise awareness of malaria.

It also promises to revolutionise the film and audio publishing industries, making it possible to create media on a lower budget without the need for artists to be present. Plus, it can extend the life and legacy of artists, such as in the cases of McQueen and Willis.

Similarly, James Earl Jones, the voice behind Darth Vader, was reported to have sold the rights to his voice to Ukrainian start-up Respeecher after stepping down from the role at age 91. In a bid to “keep Vader alive”, Respeecher worked with Lucasfilm to recreate Jones’s voice in the 2022 Obi-Wan Kenobi series.

Performers’ rights

But what of the rights of those being impersonated without their consent? AI company Deepcake, which created Willis’ ‘digital twin’, says its technology enables A-list actors to virtually include their likeness in marketing campaigns without the need to physically appear in front of the camera.

However, not all A-listers are in favour. In an interview with Wired, actor Keanu Reeves said of deepfakes: “What’s frustrating about that is you lose your agency. When you give a performance in a film, you know you’re going to be edited, but you’re participating in that. If you go into deepfake land, it has none of your points of view.”

In the UK, Equity, the performing arts workers’ union, launched its Stop AI Stealing the Show campaign in April 2022. The union found that 65% of performers (and 93% of audio artists) think the development of AI technology poses a threat to employment opportunities in the performing arts sector.

It argues that IP law in the UK has failed to keep pace with the rapid rise of AI, which it suggests is leading to performers being exploited and potentially becoming devalued.

With that in mind, a new TV show broadcast in the UK is facing criticism after using deepfake technology to replicate the likeness of various celebrities—without obtaining their consent or offering compensation.

But ITV’s Deep Fake Neighbour Wars, described by the broadcaster as a “high-tech impressions show”, offers a disclaimer at the start of each episode:

“The powers that be made us put this at the start otherwise we may not have been able to get it on the telly.

“This is all fake.

“The stories are all made up.

“The real celebs have had nothing to do with this.

“Our celebs are all played by actors.

“Their faces are all DEEP FAKED.”

The limits of parody

So if a celebrity is the subject of an unauthorised deepfake, what legal recourse do they have? In the UK, there are no image rights per se, as there are in the US. The current best option is the common law tort of passing off.

Should the subjects of Deep Fake Neighbour Wars (launched in January this year) wish to take action, they would have to consider whether passing off applies.

But ITV’s disclaimer states clearly that the celebrities “had nothing to do with” the programme, making any claims of false endorsement under passing off difficult, as Matt Hervey, partner and head of artificial intelligence law at Gowling, notes.

He suggests that the programme is an extension of traditions of impressionists (such as Rory Bremner) and caricature (such as Spitting Image, a British satirical puppet show from the 1980s which parodied famous people).

“It's just using technology to make a highly convincing likeness as opposed to a caricature,” he says.

And, if the programme is only broadcast in the UK, the celebrities would probably not have any cause of action outside the UK, explains Hervey.

Rebecca O’Kelly-Gillard, a partner at Bird & Bird, argues that parody doesn’t necessarily protect against further misuse down the line.

If Deep Fake Neighbour Wars is meant as a parody, “is ITV liable for any subsequent misuse of that parody?” she asks.

“Or the person who's misusing it down the road—are they liable? Because they're not only infringing the celebrity’s rights, they're also potentially infringing ITV's rights in the broadcast by taking an element of that broadcast and adapting it for a further purpose. And that purpose would be infringing.”

Right to privacy

Hervey considers that there may be scope for persuading the UK courts to follow the route that other countries take on privacy rights.

“But that would require convincing a court to expand the current law, to hold that you have rights in your likeness under the heading of your right to private life,” he notes.

Overall, O’Kelly-Gillard suggests that individuals would need to rely on the array of existing IP rights, depending on the circumstances.

“People are then going to have to try and jigsaw them together to try to get the most protection they can,” she says.

“Each of these IP rights has lots of mini legal tests within them. And it's going to be very much on a case-by-case basis as to which of the rights apply and whether the tests are met in each given case.”

Does the law need to change?

Jani Ihalainen, an associate in the IP & Technology team at law firm RPC, believes there is a gap in regulation.

“The law is very much playing catch-up with new technologies, and deepfakes are no exception,” he says.

He points to the UK government’s recent AI policy, which states: “There is … concern that AI will amplify wider systemic and societal risks, for instance AI’s impact on public debate and democracy, with its ability to create synthetic media such as deepfakes.”

“Despite this, the government has not made great strides in addressing the potential issues of deepfakes,” argues Ihalainen, “and it seems unlikely that any legislation will be passed on this anytime soon.”

He believes that current legislation is “nowhere near adequate” to address the potential IP issues.

“Legislators will have to try and be forward-thinking, but current steps in this direction seem small, and undoubtedly the issue will have to come to the fore more for legislators to address it—by which point it will be too late.”

Chris Fotheringham, a solicitor in the commercial and intellectual property team at Ashfords, believes that there’s “definitely a gap” in the law, but is doubtful that intellectual property law should change.

Because there are no image rights in the UK, there is no route via IP law, except perhaps defamation or harassment, “which is quite a high bar”, he says.

He refers to the privacy rights within the Human Rights Act 1998, and the balance of the freedom of the press with the privacy rights of celebrities.

“You expect that by putting yourself out there, you expect people to comment on you. It's just how far that commentary goes before it becomes slander and libel.”

A social issue

Hervey agrees that the law doesn’t necessarily need to change. “I think it's a social and political issue to decide whether it's right or wrong. And I don't think there's a clear answer.”

Rather, it’s an economic and social question whether people should be allowed to replicate real people's likenesses, he adds. “I don't think it's a legal question. It's something that governments have to decide, in consultation with stakeholders across society.

“There's no reason why you can't have a law saying you can't fake someone's likeness or voice. The question is, should we?

“And where do you draw the line culturally between deepfakes and our currently accepted practices of entertaining based on impressions and caricature?”

If, however, it becomes apparent that existing law is failing to protect people, “there might very well need to be a change in the law”, argues O’Kelly-Gillard. “One would hope, in extreme circumstances, that the courts would find a legal route through.”

She goes on to say that just because the technology exists doesn't mean it is all right to use it. “The people who use deepfakes for pernicious purposes are going to do it irrespective of the law, in the same way piracy exists,” she notes. “It's not permitted, but it's still going to exist.”

She also raises the issue of ethics within the technology. “Even the datasets that you're putting into an AI system are important: if they're inherently biased, does it make your AI system unethical?” asks O’Kelly-Gillard.

“It's those kinds of questions that we're going to have to grapple with; it's whether we want to legislate around those ethical issues, or whether we just want the current laws to suffice.

“So I think it will almost become more of a social debate around the issue that will drive any legislative change.”


