The music industry has been here before. When Napster disrupted music distribution in the early 2000s, the legal and contractual frameworks in place were completely unequipped to handle it. Then came streaming in the 2010s, and Eminem’s publisher sued Spotify over whether streams of his catalog were properly licensed under existing terms. Each technological wave arrives faster than the last, and each time, music industry contracts are left scrambling to catch up. AI is the latest disruption, and it may be the most consequential one yet.
A Regulatory Gap That Contracts Have to Fill
Governments are taking their time, watching how AI evolves before legislating. AI, however, moves fast by nature. The UK government’s hesitation to take a firm position on AI and copyright is a textbook example of institutional caution colliding with technological urgency. Policy is being set piecemeal, often by individual music industry organizations, with little consistency across the board.
The result is a vacuum, and music companies can’t afford to wait for legislation to fill it. They’re already in the middle of the problem. When Universal Music Group and other publishers sued Anthropic over the alleged use of copyrighted lyrics to train its AI model, it wasn’t a hypothetical legal scenario playing out in a law school classroom. It was a live demonstration of what happens when music industry contracts simply don’t account for a technology that didn’t exist when they were written. In one survey by a British music copyright collective, 93% of creators said they believe they should be compensated when their work is used in AI training. The gap between that expectation and current contractual reality is significant.
What’s Missing from Current Agreements
This is where the real problem lives. Most existing music contracts were built for a world where the primary concerns were physical distribution, digital downloads, and streaming royalties. AI introduces a different set of questions entirely, and current agreements are largely silent on all of them. Who holds the right to license a catalog for AI model training, and under what terms? If a model generates revenue using material it was trained on, how is that revenue accounted for? Who owns derivative works shaped by protected material?
These cases will keep surfacing as AI becomes more embedded in the creative and commercial music landscape. The uncomfortable reality is that most catalogs have probably already been fed into training models somewhere along the way. That makes now the starting point for figuring out what needs to change going forward.
The Operational Cost
These gaps in contractual language create more work for organizations. Legal and rights management teams are now being asked to audit AI usage, verify licensing compliance, and flag potential infringement without clear frameworks to work with.
The less defined the contractual language, the more manual the oversight. And manual oversight at scale, across large catalogs and complex licensing relationships, is expensive and error-prone. Without the right infrastructure to monitor and analyze these relationships, and the contracts attached to them, rights management becomes a game of cat and mouse: reacting to problems rather than preventing them. As in other industries, the lack of proper contract infrastructure costs money, compliance standing, and organizational bandwidth that should be going toward higher-value work.
What Modern Agreements Need to Account For
The path forward starts with building AI-aware agreements now. That means being explicit about AI licensing terms: covering training data rights, voice cloning, AI-assisted production, and any other scenario where AI interacts with the work. It means defining ownership provisions for AI-generated derivative content, building in remuneration structures tied to AI commercialization, and including audit rights and renegotiation clauses for when the landscape shifts again.
Companies have to leave no stone unturned. Every point of interaction between AI and a piece of music, whether it’s being used for training, generation, or distribution, is a point where contractual clarity matters.
The music industry has navigated technological disruption before and emerged stronger on the other side. The organizations that do it without a lawsuit forcing their hand will be better positioned for whatever comes next. Since 2019, Zeal has been helping entertainment organizations manage complex music industry contracts and their challenges before they become expensive ones. If you’re ready to build a more resilient process, schedule a demo here.