The Creative Industry’s Struggle Against AI Exploitation
“No one should have to opt out of being robbed,” says author Jeanette Winterson, voicing her outrage at government proposals that would allow artificial intelligence companies to use creators’ works without compensation. The sentiment resonates with a diverse group of creatives, including television presenter Richard Osman and music legend Sir Paul McCartney. In a striking act of defiance, a collective of 1,000 musicians has produced a protest album capturing the eerie silence of vacant recording studios, a poignant reminder of the fate that may await one of the UK’s most cherished export sectors.
How did Sir Keir Starmer’s administration find itself at odds with the creative industries? Historically, Labour has been a friend to the arts, often welcoming stars like Noel Gallagher, Eddie Izzard, and Lenny Henry to gatherings at No 10. Arts luminaries such as Cameron Mackintosh and Jeremy Irons have been notable financial supporters of the party, while authors like Ken Follett have rallied behind Labour’s creative agenda.
However, in its eagerness to embrace the AI revolution, the government has alienated many creatives who fear the technology is being used to strip them of their work and, with it, their livelihoods. Labour views AI as a crucial driver of economic growth and aims to position the UK as a leading international AI hub, underpinned by minimal regulation. Yet in pursuing this goal it risks jeopardizing the flourishing creative sectors that contribute over £125 billion annually to the economy and provide 2.4 million jobs.
The crux of the issue lies in the tech companies’ insatiable need for extensive online data to train their generative AI models, such as OpenAI’s popular ChatGPT. The government is contemplating a contentious exemption to UK copyright law that would permit tech firms to harvest images, films, music, and text from online sources without charge, unless the original creators have explicitly opted out of this process.
This potential arrangement would please major tech companies, which, despite fiercely guarding their own copyright, appear to believe they should have unrestricted access to train AI’s large language models (LLMs) without compensating content creators. If big tech were granted the ability to scrape UK online material under a new text and data mining (TDM) exemption, the burden would fall on creators to monitor whether their work had been used unlawfully. This scenario has already unfolded in the European Union, where creators have found it nearly impossible to ascertain whether tech firms have complied with their requests to opt out.
The previous Conservative government, sensing an AI gold rush, had also considered a similar exemption last year but retracted its plans due to backlash from the arts and media sectors, alongside a critical report by the House of Lords Communications and Digital Committee titled At Risk: Our Creative Future. Following last summer’s election and after lobbying from Silicon Valley, the creative industries were dismayed to find that the TDM exemption was once again on the table. Science minister Lord Vallance and the UK’s AI tsar Matt Clifford are among the government figures advocating for the UK to emerge as a leader in AI.
In December, the government initiated a public consultation aimed at bridging the gap between AI developers and the creative sectors. However, Dan Guthrie, director general of the Alliance for Intellectual Property, contends that there is “no market failure” in UK copyright law and asserts that it simply needs to be upheld. Many creatives believe that by strengthening its already stellar reputation as a global creative hub, the UK can ensure that AI thrives alongside its rich content.
Big tech undoubtedly possesses the capacity to track the material it has used to train its models and the financial resources to compensate creators accordingly. Notably, Microsoft has forged agreements with publishers including the Financial Times and Reuters for the training of its Copilot AI model, while OpenAI has struck partnerships with Condé Nast and Rupert Murdoch’s News Corp. Yet large quantities of content, including works by independent freelancers, are still being harvested for free by big tech.
In a display of unity, UK news brands have recently launched the Make It Fair campaign, advocating for the principle that content should only be collected with explicit permission and appropriate payment.
Labour now faces a critical decision: whether to submit to the pressures of big tech, including figures such as Elon Musk, who is developing his AI chatbot Grok 3 while denouncing Starmer’s Britain as a place where free speech is stifled. That portrayal is, of course, misleading. Yet the Creators’ Rights Alliance, which represents over 500,000 creators, warns that “we are in danger of falling into a world of mimicry.” If copyright law is weakened, we may find ourselves teetering on the edge of a dystopian future.