2 November 2023 | Features | Copyright Channel | Marisa Woutersen

UK’s House of Lords is hungry for new AI rules

As global leaders, tech CEOs, and policymakers convene for the first Global AI Safety Summit in the UK, members of the House of Lords (HoL) are urging the government to address the rapid advancements and associated risks of AI.

The summit, which runs November 1-2 at Bletchley Park, could potentially feed into new AI regulations that Prime Minister Rishi Sunak will need to pass through the UK’s Second Chamber for scrutiny.

For its part, the HoL is primed to take a close look at how Sunak plans to deal with content creators and has already voiced the need for urgency.

In July this year, members debated AI regulation and the risks associated with the technology, in a motion put forward by Lord Ravensdale, full name Daniel Mosley.

WIPR asked the HoL's key AI activists what they want the rules to address.

Baroness Stowell of Beeston, full name Tina Stowell, is chair of the Communications and Digital Committee, which considers the media, digital, and creative industries.

She notes that: “AI is often characterised either by tech panacea perspectives suggesting it will solve all the world’s problems, or doomsday scenarios predicting the end of civilisation.

“It is unlikely either of those extremes will emerge—but that doesn’t mean we should underestimate the impact developments in AI will have right across our economy—good and bad.”

Lord Clement-Jones, full name Timothy Clement-Jones, says that at the time of the July debate, AI and IP minister, Viscount Camrose, “didn't give much indication about what the AI summit might contain, although now we know that it's going to be about rather bland things like international cooperation”.

Clement-Jones is on the Draft Online Safety Bill joint committee, the AI in Weapon Systems committee, and the Industry and Regulators committee.

The government published its White Paper, A pro-innovation approach to AI regulation, in March, which Mosley describes as a “good attempt to try and balance AI regulation”, but he adds that the pace of change has left it behind the curve.

"I strongly believe it's something that Parliament needs to get on top of and start taking much more seriously," Clement-Jones emphasises.

Can AI be governed by existing rules?

Currently, the government relies on sector regulators to oversee AI use within their specific industries.

Stowell is putting together an inquiry into large language models (LLMs) and whether existing regulations can address concerns, with a report expected in the new year.

And while Mosley stresses the importance of engaging with the creators of the LLMs to understand whether the way they work aligns with existing regulations, he also believes that AI presents significant risks that new AI regulations will need to address.

“We have an issue where LLMs are scraping vast amounts of data from the internet and using that to train their models, which has to be taken into account in terms of how IP frameworks and regulation are developed,” says Mosley.

Clement-Jones says the fact that power is held by a small group of companies is a problem. “It's only big businesses that can afford to develop these LLMs; it needs huge datasets, huge computing power, semiconductors, and highly skilled workforces.

“There are only four or five businesses in the world that can do that, and yet, they're the ones ripping people off,” he adds.

Impact on the creative industries

Regarding words, images and other creative works that generative AI relies on, the Earl of Devon, Charles Courtenay, says the onus is on the technology companies to show they are willing to pay to protect rights owners.

Courtenay is a member of the House of Lords and a partner at Michelmores specialising in IP and commercial disputes.

“Technology companies need to engage proactively with the creative sector to develop an effective licensing model that allows for protection of rights holders while also enabling the training of AI models,” says Courtenay.

Stowell highlights that AI is gradually assuming responsibilities in the creative industries, such as voiceover acting, writing, and composing background music, which were previously performed by people.

And as AI evolves and becomes more capable, it has the potential to disrupt the traditional creative industry even further.

“If TV or film studios can get high quality scripts or music for free from a computer why would they pay a writer if there were little difference in the product delivered? And how will we ensure we can deliver new cohorts of world-leading figures in the creative sector if the ‘training ground’ in lower-level jobs is being taken by AI?” questions Stowell.

She insists the government has a role to play to ensure creators receive remuneration.

“It is important for the government to work with the industry to create a fair deal for publishers, writers, artists, musicians and everyone else whose works are being used by tech firms to develop LLMs and wider generative AI tools,” says Stowell.

She believes the recent writers’ strike in Hollywood is “not likely to be the last time we see conflict between creatives and the technology sector.”

The development of these technologies is “straining existing legal concepts” of what copyright and IP law actually involves and the extent to which foundation AI models are legally compliant, she adds.

Risk-based regulations

Clement-Jones advocates for more risk-based regulation, where the degree of regulation depends on the risk posed by a particular AI system.

Additionally, he wants to see “ethical principles embedded in standards in terms of testing, risk assessment and management”, which mirrors elements of President Biden's Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence, signed on October 30.

“One thing that we have in the UK, which others don't, is the ability to claim copyright if a work is created by AI under the supervision of a human being,” says Clement-Jones.

He describes this as “useful” but suggests “we need to go further”.

Furthermore, Stowell points out that lower-level creative jobs, often critical for talented young people to gain experience and earn a living, could be at risk as AI takes on more responsibilities.

This could affect the ability to nurture and develop the next generation of creative talents, she explains.

The conversation is still going on and “we can’t take our eye off the ball,” says Stowell.


