Experts warn that an 'opt-out' AI copyright model could leave UK artists and creatives vulnerable to exploitation, calling for stronger legal safeguards to protect intellectual property and ensure fair compensation.
Report: AI, copyright, and productivity in the creative industries.
The UK government should resist allowing AI companies to scrape all copyrighted works unless the holder has actively "opted out", as this would place an unfair burden on up-and-coming creative talents who lack the skills and resources to meet the legal requirements.
That is the conclusion of a new report from economics, policy, and machine learning experts at the University of Cambridge. The experts also argue that the UK government should clearly state that only a human author can hold copyright, even when AI has been heavily involved.
The report, a collaboration between three Cambridge initiatives—the Minderoo Centre for Technology and Democracy, the Bennett Institute for Public Policy, and ai@cam—argues that unregulated use of generative AI will not guarantee economic growth and risks damaging the UK's thriving creative sector.
Researchers say that if the UK adopts the proposed 'rights reservation' for AI data mining rather than maintaining the legal foundation that automatically safeguards copyright, it will compromise the livelihoods of many in the sector, particularly those just starting out.
They argue it risks allowing offshore companies to scrape artistic content produced in the UK for endless reuse.
"Going the way of an opt-out model is telling Britain's artists, musicians, and writers that tech industry profitability is more valuable than their creations," said Prof Gina Neff, Executive Director at the Minderoo Centre for Technology and Democracy at the University of Cambridge.
"Ambitions to strengthen the creative sector, bolster the British economy and spark innovation using GenAI in the UK can be achieved – but we will only get results that benefit all of us if we put people's needs before tech companies."
Creative industries contribute around £124.6 billion, or 5.7%, to the UK's economy and are deeply connected to the tech industry. For example, the UK video games industry is the largest in Europe and contributed £5.12 billion to the UK economy in 2019.
While AI could lead to a new generation of creative companies and products, the researchers say little is currently known about how AI is being adopted within these industries and where the skills gaps lie.
"The Government ought to commission research that engages directly with creatives, understanding where and how AI is benefiting and harming them, and use it to inform policies for supporting the sector's workforce," said Neil Lawrence, DeepMind Professor of Machine Learning at the University of Cambridge and Chair of ai@cam.
"Uncertainty about copyright infringement is hindering the development of Generative AI for public benefit in the UK. For AI to be trusted and widely deployed, it should not make creative work more difficult."
In the UK, copyright vests in the creator automatically, provided the work meets the legal criteria. Some AI companies have tried to exploit "fair dealing" – a loophole based around use for research or reporting – but this is undermined by the commercial nature of most AI.
Now, some AI companies are brokering licensing agreements with publishers, and the report argues this is a potential way to ensure creative industries are compensated.
While performers' rights, from singers to actors, currently cover reproductions of live performances, AI uses composites harvested from across a performer's oeuvre, so researchers say rights relating to specific performances are unlikely to apply.
Furthermore, clauses in older contracts mean performers are having their work "ingested" by technologies that didn't exist when they signed on the dotted line.
The researchers call on the government to fully implement the Beijing Treaty on Audiovisual Performances, which the UK signed over a decade ago but has yet to put into effect. This treaty gives performers economic rights over reproduction, distribution, and rental.
"We propose mandatory transparency requirements for AI training data and standardised licensing agreements that properly value creative works. Without these guardrails, we risk undermining our valuable creative sector in the pursuit of uncertain benefits from AI," said Prof Diane Coyle, the Bennett Professor of Public Policy at the University of Cambridge.
The Cambridge experts also examine copyright issues for AI-generated work and the extent to which "prompting" an AI can confer ownership of the output. They conclude that AI cannot hold copyright itself, and that the UK government should develop guidelines on compensation for artists whose work and name appear in prompts instructing AI.
Regarding the proposed 'opt-out' solution, the experts say it is not "in the spirit of copyright law" and is difficult to enforce. Even if creators opt out, it is unclear how that data will be identified, labeled, compensated, or even erased.
It may be seen as giving "carte blanche" to foreign-owned and managed AI companies to benefit from British copyrighted works without a precise mechanism for creators to receive fair compensation.
"Asking copyright reform to solve structural problems with AI is not the solution," said Dr Ann Kristin Glenster, Senior Policy Advisor at the Minderoo Centre for Technology and Democracy and lead author of the report.
"Our research shows that the business case has yet to be made for an opt-out regime that will promote growth and innovation of the UK creative industries.
"Devising policies that enable the UK creative industries to benefit from AI should be the Government's priority if it wants to see the growth of both its creative and tech industries," Glenster said.