UK fails in bid to create AI voluntary code as talks collapse

7 February 2024 | Copyright | Sarah Speight

Stakeholders were unable to agree a code, handing the issue back to the government (Image: A Stock Studio / Shutterstock)

Working group fails to reach agreement between rights holders and AI developers | Culture, Media and Sport committee slams government’s “woolly words” | Government shifts to providing a ‘mechanism’ for creators to discover if AI has been trained on their work.

Talks between rights holders and AI developers to establish a UK voluntary copyright and AI code of practice have collapsed, according to a government announcement made yesterday.

This comes just weeks after the government shifted its focus from creating copyright exemptions for AI developers, to agreeing a code of conduct that would instead protect and benefit both rights holders and AI developers.

It also follows on the heels of a statement by OpenAI that it would be “impossible” to train AI models without using copyrighted materials, in a submission of written evidence to the UK House of Lords Communications and Digital Select Committee’s inquiry into large language models.

The working group, convened by the UK Intellectual Property Office (UKIPO) to discuss the “interaction between copyright and AI”, failed to reach an agreement—which, according to some commentators, leaves the future of IP protection for creators and rights holders uncertain.

‘Period of engagement’

In the announcement, the Rt Hon Michelle Donelan, Secretary of State for Science, Innovation and Technology, said: “The working group has provided a valuable forum for stakeholders to share their views.

“Unfortunately, it is now clear that the working group will not be able to agree an effective voluntary code.”

She went on to say that the Department for Science, Innovation and Technology (DSIT) and the Department for Culture, Media and Sport (DCMS), “will now lead a period of engagement with the AI and rights holder sectors, seeking to ensure the workability and effectiveness of an approach that allows the AI and creative sectors to grow together in partnership.”

Donelan stated that “we recognise the importance of ensuring AI development supports, rather than undermines, human creativity, innovation, and the provision of trustworthy information”.

Trust and transparency would be needed between parties, she added, with “greater transparency from AI developers in relation to data inputs and the attribution of outputs having an important role to play.”

In a statement emailed to WIPR, a government spokesperson said: “This government is committed to supporting the AI and creative industries sectors so that they continue to flourish and are able to compete internationally.

“We are continuing to engage with stakeholders to work towards a shared approach which allows our AI and creative sectors to grow together. We will set out further proposals on the way forward soon.”

‘The IPO hasn’t found answers’

The news has been met with concern among lawyers and the creative industries.

Nina O'Sullivan, head knowledge lawyer at Mishcon de Reya, said: "Identifying a middle ground as a baseline for developing a code of practice was always going to be extremely difficult.

“Now, it remains to be seen whether the further engagement to come led by [the] DSIT and DCMS will likewise struggle to reach a mutually acceptable solution.

“In the meantime, we await the government's further proposals on the way forward on this complex issue (and the risk that this issue may simply be left for a future government to resolve)."

Dame Caroline Dinenage MP, chair of the Culture, Media and Sport (CMS) Committee, said the committee is “disappointed” with the announcement.

“The creative industries have long been raising concerns that their IP is being unfairly used to train AI systems without consent and without compensation. The lack of even a voluntary code will not allay these concerns.”

She pointed to the committee’s report on the impact of AI on the creative industries, urging the government to take steps to regain the trust of creators and abandon the AI copyright exemption.

“Woolly words from ministers will not provide rights holders with certainty that their rights will be respected. The government must urgently reconsider its approach and bring forward a way of compensating creators for the use of their works in AI development, to ensure mutual benefits for both sectors.”

Reema Selhi, head of policy at the Design and Artists Copyright Society, was part of the group tasked with devising the code. She told the Financial Times: “The industry is asking for transparency on what models have and haven’t been trained on, and what works are being used. The IPO hasn’t found answers to those questions.”

However, optimism was expressed by one commentator, who highlighted the government’s statement that it would explore “mechanisms” (if not establish a code) “for providing greater transparency so that rights holders can better understand whether content they produce is used as an input into AI models.”

Writing on LinkedIn, Richard Mollett, head of European government affairs at data analytics firm RELX, said: “Call it what you will, this is decisive support for rights holders, whilst also clearly promoting the idea that AI developers and creative sectors should work in partnership”.

Responding to this post, Dominic Young, founder at Axate, which enables publishers to monetise their content, said: “Of course the best mechanism already exists: copyright. We get clarity because our content is only used to train AI models if we have given permission for it.

“Any AI developer who has used content without permission can expect sanctions. This whole idea that it's OK to hoover stuff up and exclude only by exception is just unworkable.”

Broader consultations and debates

Yesterday’s announcement forms part of the government’s broader consultation outcome, titled A pro-innovation approach to AI regulation: government response.

This document encompasses the government’s broader response to the AI White Paper Consultation, to which it received 409 written responses from organisations and individuals.

The UK government held an AI Summit in November 2023, at which members of the House of Lords urged it to address the rapid advancements and associated risks of AI.

This followed a debate on the risks of AI, in which members of the House of Lords questioned AI and IP minister Viscount Camrose over the lack of AI regulation.

The UKIPO has consulted on AI previously, including twice in 2021 (March and October).
