UK working on rules for training AI models on creative work


UK ministers are working on plans to increase transparency over how tech companies train their artificial intelligence models after the creative industries voiced concerns over work being copied and used without permission or a fee.

Culture secretary Lucy Frazer told the Financial Times that the government would bring forward its first attempt to create rules governing AI groups’ use of material such as TV programmes, books and music.

Frazer said ministers would initially focus on ensuring greater transparency over what content AI developers were using to train their models, which would in effect allow the industry to see whether its work was being ripped off.

Rishi Sunak’s government is caught between the competing objectives of boosting the UK’s position as a global centre for AI and protecting the country’s world-leading creative industries. A general election expected this year, with Sunak’s Conservatives trailing in opinion polls, is also likely to limit the work ministers and officials can do.

Frazer said in an interview that she recognised AI represented a “massive problem not just for journalism, but for the creative industries”. 

“The first step is just to be transparent about what they [AI companies] are using. [Then] there are other issues people are very concerned about,” she added. “There’s questions about opt in and opt out [for content to be used], remuneration. I’m working with industry on all those things.”

Frazer declined to say what mechanisms would be needed to bring greater transparency so that rights holders could understand whether content they produced was being used as an input to AI models.

Better transparency around the fast-evolving technology will mean that rights holders can more easily track intellectual property infringement.

People close to the work said the government would try to bring forward proposals ahead of the election, which is expected in the autumn. Asked about timing, Frazer said she was “working with industry on all those things”.

Executives and artists in music, film and publishing are concerned that their work is being unfairly used to train AI models under development by tech groups. 

The EU is already preparing to introduce similar rules under its AI Act, which will require developers of general-purpose AI models to publish a “sufficiently detailed” summary of the content used for training and to implement a policy to respect the bloc’s copyright law.

By contrast, the UK has been slow to draw up similar rules. Officials have admitted to a conflict between ministerial ambitions to attract fast-growing AI companies to the UK with a more benign regulatory environment and the need to ensure that companies in the creative industries are not exploited.

An attempt to agree a voluntary set of rules between rights holders and AI developers failed last year, forcing officials to rethink their next steps.

Frazer said the government wanted to create a “framework or policy” around transparency but noted that there were “very complex international problems that are fast moving”. She said the UK needed to ensure it had “a very dynamic regulatory environment”.
