On Monday morning, a couple of hours after the UK government’s AI opportunities action plan was published, I started getting messages from artists. “We are not going to continue drawing so that the founders of [AI companies] can get rich. I am quitting this job now,” said one. Another told me: “I have given up at this point.” I’ve heard similar messages from people for months – they are abandoning their creative jobs because AI companies are taking their work without asking, and using it to train models that compete with them.
There is actually a lot I agree with in the action plan, which seeks to make the UK a global leader in artificial intelligence. Written by the venture capital investor Matt Clifford, it proposes making it easier for British AI companies to access “compute” (essentially the servers needed to train AI models), updating visa regulations to bring more AI experts to our shores, and much more besides. These are sensible ideas that could boost our economy.
But the recommendation that has created despondency among artists concerns copyright. Clifford proposes that copyright law be reformed to favour AI companies. He seems to align himself with the government’s recent proposal that we hand our creators’ copyrighted works – their art, their music, their books – to AI companies, for them to train their technology on, free of charge, unless creators proactively opt out. In essence, this means flipping copyright on its head, so that every work in British creative history will become usable by AI companies unless its creators go through some as-yet-undefined process to say they’d rather that didn’t happen.
It is easy to see why this has been received poorly by British creators. If you train a large language model on short stories, it can write new short stories; if you train AI on pop music, it can write new pop music. The proposal lets AI companies take people’s work and use it to build highly scalable AI models that will outcompete them. We already know that generative AI is reducing the demand for human creative work. This proposal will dramatically accelerate that process.
Let’s say I want to set up a new company. We’ll call it Great British AI. Our mission is simple: to replace the country’s creators with AI. Doing so is easy. We will scrape the internet for any and all creative work by British creators. We’ll respect opt-outs, but hardly anyone ever opts out when given the chance. We will use the work we scrape to train cutting-edge AI models. And we will use these models to generate and sell vast amounts of new work in similar styles for a fraction of the cost.
Running a business such as this is currently, and rightly, illegal in the UK. It is clearly hugely exploitative. Under the new proposals, though, it becomes legal. I don’t think the government intends this to be the outcome. I don’t think it wants to put large swathes of the creative industries out of work by legalising theft. But this is what will happen if its proposals are enacted.
The core issue is that opt-outs of the kind outlined here are unfair and unworkable. Here’s an example. I write choral compositions, which are distributed by a music publisher. A choir buys the sheet music and records one of my pieces. That recording is broadcast on the radio. Can I opt out of the recording being used for AI training? Of course not. From the moment I publish the piece, I have no control whatsoever over who uses it, for what purpose. And this applies to a million pieces of media, across the creative industries. You can only opt out where you control your work – but, most of the time, you don’t.
The US has a much fairer solution: “fair use”. Some unlicensed uses of copyrighted works are allowed, and some aren’t. Whether a given use is permitted is determined by a number of factors – and, critically, one of these is the competitive effect that use has on the original. The proposal in the UK totally misses this nuance, instead opting for a blunt copyright exception that would do an incredible amount of damage.
But what is most frustrating is that there is no real need to change copyright law at all. The UK can be a global leader in AI by adopting Clifford’s other recommendations, which are ambitious and laudable. Lots of AI – the type the government cares about, such as in healthcare, science and defence – is not trained on the work of the world’s creators. The work that won Demis Hassabis the Nobel prize last year, AlphaFold, wasn’t – it was trained on a database of protein structures. The only upside to upending copyright law in the manner proposed is attracting a few large, foreign AI companies to set up offices here. These are the companies that have been lobbying the government heavily for this change. They are also, incidentally, some of the companies that have ecstatically welcomed the action plan.
Disappointingly, it seems the government has “committed to implement” Clifford’s recommendation to reform copyright, along with his other recommendations. This throws the government’s consultation on AI and copyright, which still has six weeks to run, into disarray. Why engage with a consultation if the government has already made up its mind?
I think that Clifford has the country’s best interests at heart with this action plan. But his is not the only voice that should be heard. The government should listen to Paul McCartney, Kate Mosse, Kate Bush and the countless other musicians, actors, writers, artists and other creators who – quite understandably – reject the notion that AI companies should be able to use their works, unlicensed, to build their competitors.
I hope the government will reconsider. It would surely not be a ruinous U-turn to implement 49 of the 50 recommendations. Doing so is the only way to ensure that the UK’s AI industry and creative industries can both prosper.
Ed Newton-Rex is the founder of Fairly Trained, a non-profit that certifies generative AI companies that respect creators’ rights, and a visiting scholar at Stanford University