
GFF 2025: Copyright & Contracts in the Age of AI

I recently attended the Glasgow Film Festival 2025 to present this year's opening keynote for the incredible Industry Focus programme. It was a fantastic opportunity with an engaged, thoughtful audience. I'm always humbled to be around creative and tenacious film folk, but Scottish talent is truly something else - I left feeling so inspired.


Here, you'll find an abridged version of my speaker’s notes from the session. It's been refined for clarity and readability, but I've kept the writing style just as I presented it on the day — colloquial, conversational, and hopefully not too boring! Following my introduction, it's roughly divided into three sections: (1) copyright basics; (2) why AI complicates the scene; and (3) the role of a good contract. I then offer a few concluding thoughts about why this all matters (and why I'm so passionate about my work).


A huge thank you to the Glasgow Film Festival team for inviting me, and to everyone who attended for such an insightful and lively discussion.

 

AI sparks strong opinions. Some people are enthusiastically optimistic; some are cautiously curious. But many — perhaps most in the film industry — might be feeling something closer to existential dread.


My normative judgement on AI is not important: I won't say it's good or bad. I won’t try to convince you that it’s the key to slashing production costs or writing the next Oscar-winning script. My job is to help creatives meet the reality and protect their rights and the integrity of their work, so they can have a bit more control.

Now, lawyers are not often invited to film festivals for good reason. We don’t push creative boundaries. We don’t write compelling dialogue, shoot atmospheric visuals, or evoke deep emotion.


We do words. Lots of them. And they're often words no one wants to read.

So, how did I find myself giving the keynote presentation at the Glasgow Film Festival 2025?


Scarlett Johansson, actually. In 2018, deepfakes emerged as a technological force. At the time, the immediate concern was non-consensual deepfakes—image-based abuse. But beyond that, a larger question loomed: if studios can convincingly replicate an actor’s likeness or voice with AI, who needs human artists anymore?


By 2019, pre-COVID, pre-ChatGPT, this problem had become an obsession of mine. Every spare moment went into researching, writing, and trying to solve it, and I became one of the first practising lawyers to be peer-reviewed on the subject of AI and publicity laws. The issue wasn’t just copyright: it spanned trade marks, reputation, data protection, privacy and beyond.


And since then, it’s been my privilege to advise governments, think tanks, tech companies and creatives on the opportunities, risks, and implications of using AI.

COPYRIGHT BASICS


I love copyright. There’s something reassuring and solid about it— an anchor in a world where ideas and creativity are intangible. But copyright wasn’t always there. It had to be created, invented as a reaction to evolving technologies and socio-economic factors.


For centuries, if you wanted to share an idea, you had to write it out, word by painstaking word. Books were hand-copied, fragile things: laborious, rare, and expensive. Then came Johannes Gutenberg, and in 1440, everything changed.


His printing press wasn’t just a machine—it was a rupture in how we value the expression of human intellect and creativity. Those expressions are worth more than the paper and ink used to create the manuscript by scribes in candlelit monasteries. But what happens when words, once bound to the hand of the writer, become incredibly easy to copy?


By the early 1500s, books were beginning to flood Europe. Printers weren’t just publishing new works; they were reprinting old ones—without paying the original authors a single coin.


A crisis emerged.


In 1557, Queen Mary I granted a royal charter to The Stationers’ Company, giving them exclusive control over England’s printing industry. The charter allowed the Stationers to police unauthorised printing, ensuring that books aligned with state and religious doctrine. But it was also a lucrative commercial monopoly: membership in the Company became the only way to legally operate as a printer, consolidating both power and profits within a closed group. This was reinforced under the Licensing Act 1662.


Fast forward to 1710, and we get something radical: the Statute of Anne, the world’s first formal copyright law. This time, the power shifted—for the first time, the rights belonged to the author, not the printer. Copyright was no longer about controlling presses. It was about protecting creativity itself.


Today, thanks in large part to the Berne Convention of 1886, copyright can be thought of as a bundle of rights that protect creative expression — provided that certain basic criteria are met. To be copyrightable, work must be:

  • Human-made – copyright requires human authorship (see the famous Monkey Selfie case).

  • Original – the work must be independently created and not a direct copy of someone else’s.

  • Creative – copyright does not protect facts, which are discovered rather than created, nor does it extend to scientific formulas or equations, though these may be protected by patents or trade secrets.

  • Expressed – the idea must be fixed in a tangible form and capable of being perceived with the senses: it cannot remain abstract. Copyright protects the specific way an idea is expressed and crystallised, whether in writing, visual art, music, or recorded performance.


Once these conditions are met, the work automatically attracts copyright protection and the creator (the 'rightsholder') is entitled to certain exclusive rights, including copying, adapting, translating, displaying or performing the work in public, as well as creating derivative works (more on this below!). They are also free to license or otherwise sell some or all of their rights to another party, who in turn becomes the new rightsholder.


Of course, we’ve come a long way since the printing press. Every major technological leap since then has tested copyright. Photography; cinema; television; CGI —each raised questions about ownership, originality, and artistic control.


To illustrate this point, I often mention one of my favourite copyright cases: Burrow-Giles Lithographic Co. v. Sarony, 111 U.S. 53 (1884).


Napoleon Sarony, a photographer, had taken a striking portrait of Oscar Wilde—moody, contemplative, perfectly framed. A lithography company copied it without permission, and used it for an advertisement.


So, Sarony sued.


The lithographic company's defense? “Sarony didn’t create this. His camera did. It's not capable of copyright protection.” The U.S. Supreme Court ultimately disagreed, ruling that photographs are capable of copyright because a camera is just a tool. The creative vision—the choice of light, angle, composition—belongs to the artist, and this should be reflected in the rights granted to them.

HOW DOES AI COMPLICATE THE SCENE?


So here’s a big question: If photography, CGI, and digital tools are just extensions of human creativity, what makes AI different?


Unlike these other "tools", AI can do more than just assist the creative process — it can generate new work, often with minimal human intervention.


This creates two urgent legal questions:


  1. Scraping & Training – If an AI is trained on copyrighted works without permission, is that fair use, or is it infringement?

  2. AI-Generated Output – If an AI-produced script, song, or visual is too close to an existing work, where is the line between influence and infringement?

(A third question — whether AI-generated works can themselves be copyright protected — is a whole other mess for another day.)

A recent investigation from The Atlantic found that some 138,000 films and TV episodes appeared in AI training datasets, in a process of unlicensed copy-paste-creativity. Award-winning films, indie projects, classic cinema—all scraped, all ingested, all used. No permission. No payment. No framework.


And yet, in the example shown here, the image of me was clearly inspired by Rachel from Friends.


AI firms say this is all fair use. Their argument is that their algorithms are just reading. "Our pet web crawlers", I can imagine OpenAI's Sam Altman saying, petting a tarantula resting by his keyboard, "are simply out there, learning. You never told us we couldn’t..."


The next question concerns whether new AI-generated content - the output itself - crosses into infringement.


Copyright doesn’t just protect against direct copying. Rather, courts move beyond the question of whether a work is identical, and consider just how much was 'borrowed' or 'remixed'. Naturally, art (and culture itself) is iterative, and necessarily requires some flexibility in how ideas and expressions of them evolve across years and borders.


But this is precisely where AI tests the boundaries of copyright law in ways we’ve never seen before. If an AI algorithm is trained on a work and spits out a close (but not identical) copy, is that an infringement? AI developers claim this is transformative—that it takes existing works and produces something new.

But is it?


AI-generated works almost always fall into the grey areas of these tests and existing legal doctrines. Courts will have to determine whether that kind of replication is legally problematic, and this is going to take a long time.


If copyright were a magic solution, AI companies wouldn’t have been able to scrape shows for training without permission... but they did.



For example, copyright protects specific creative expression, not ideas, themes, or concepts: generally speaking, you can’t copyright a filmmaking style, or a gritty, handheld aesthetic. (On this point, see my earlier blog post about AI and artistic styles.) Copyrighting characters is an entirely different legal issue, too.


We also know that copyright must be balanced against freedom of expression (free speech) and other public interests. Thanks to concepts such as fair use (fair dealing), journalists and film critics can quote from scripts or film reviews, and documentary filmmakers can show copyrighted material for historical context.


If we apply blanket prohibitions on this type of activity, we risk chilling (limiting) freedom of expression for everyone more broadly – not just AI companies.


Another key point is that copyright on its own typically exists quietly in the background until something goes wrong: it's often seen as a defensive fall-back to counter infringement. Enforcement can easily escalate and cost hundreds of thousands of pounds to resolve, not to mention time and mental (or indeed emotional) energy.


AI companies know this. They know that most individual rightsholders don’t have the time or resources to challenge them in court – unless they’re the likes of Getty Images, Reuters, or the New York Times.


So, what do the AI companies do? They scrape, train, and wait for someone to try and stop them.


Contracts, on the other hand, allow for much more control from the outset.


If copyright has gaps, contracts can help fill them.



THE ROLE OF A GOOD CONTRACT

Contracts shape the boundaries of ownership, accountability, and trust. In industries like film and AI, where technology evolves faster than regulation, they aren’t just formalities—they’re strategy. They determine who profits from creative work, who takes on risk, and whether a business is set up to thrive or fail. A well-drafted contract bridges the gaps between legal principles and the real world.


But contracts only work if they are done right. In today’s landscape, I think a few key principles are non-negotiable:


📌 1. If your contract is silent on AI, assume it’s not in your favour.

In the context of contracts covering audiovisual content, AI clauses are simply essential. Even just a few sentences defining what AI can and cannot do with creative work can make all the difference. Without clear terms, there’s a risk that creative material could be used to train AI models, that AI could replace human contributions, or that the scope of AI use is left dangerously undefined.

📌 2. ELI5: Explain Like I’m Five.

Legal jargon isn’t a sign of sophistication; it’s often a power move. Dense, ambiguous clauses create an imbalance, where one party fully understands the risks while the other is left nodding along. A contract should be direct, transparent, and impossible to misinterpret. If you don’t fully understand a contract, you can’t negotiate it. So, ask questions; challenge complexity.

📌 3. Use your commercial leverage to negotiate your legal terms.

Filmmakers often think contracts are rigid, but they’re not. The deal you negotiate should drive the contract, not the other way around. I like to think of contracts as mixing boards in a recording studio—each element can be adjusted during the negotiation phase, to achieve the right balance. If one party is pushing for a high revenue share, the other might insist on stronger creative control. Remember that it’s a negotiation, not a fixed script, and every adjustment shifts the overall balance of the deal.



By now, it should be clear that contracts are one of the most effective tools for safeguarding assets and creative businesses. That said, they are not the only solution—industry-wide action, legislative developments, and evolving business practices will all play a role.


As an important first example, regulatory shifts are already underway. The Labour government in the UK is reassessing the “innovator-friendly” approach of previous years, potentially introducing stronger protections for consumers and, hopefully, creatives. One of the most significant changes on the horizon is the question of consent for AI training data. Will opt-out remain the default, requiring rightsholders to take active steps to remove their work from AI datasets? Or will stricter opt-in requirements emerge, forcing AI developers to seek explicit permission before using copyrighted material? The answer will determine how much control creators truly have over their work.


Beyond the UK, international frameworks are taking shape. The EU's landmark AI Act is already in force, imposing obligations on companies both within and outside Europe. Meanwhile, U.S. states like California are introducing their own AI-specific laws, expanding rights and responsibilities in ways that will ripple across the industry. Transparency is a recurring theme in these efforts, with growing pressure for clear labelling of AI-generated content and ethical guidelines to ensure accountability. As regulatory activity in this area intensifies, creatives will need to stay informed and proactive—because the future of AI in filmmaking won’t just be shaped by technology, but by the legal and business frameworks built around it.


But contracts remain fundamental, and this is where good legal guidance becomes indispensable.


Look: any lawyer can draft a contract—that is the bare minimum for surviving law school. But a truly effective lawyer does much more than that. The right legal counsel:

  • helps clients anticipate challenges before they escalate into disputes.

  • supports clients in negotiations, ensuring contractual terms align with both creative and commercial realities.

  • assesses risk pragmatically, balancing protection with practical implementation.

  • empowers clients to make informed decisions that serve not only legal compliance, but also long-term artistic and financial goals.



WHY CARE?


Because at its core, this is not just a conversation about contracts, copyright, or AI. It is about something far more significant. It is about stories - about how we make sense of the world; about how we understand the past, imagine the future, and connect with each other in the present.


And I think today, more than ever, stories matter.


Things feel so uncertain and quite frankly frightening these days, not just in places like Ukraine or Gaza but also in the States, where narratives are being rewritten or silenced, where truth is contested.


Stories like the ones filmmakers create build empathy, bridge differences, and hold a mirror up to society—sometimes reflecting beauty, sometimes confronting us with things we’d rather not see. The stories we tell today will shape how future generations understand this moment in time.


And that’s why the rise of AI in filmmaking raises such profound questions. And if we don’t set the terms, if we don’t define what’s acceptable and what’s not, we risk a future where stories are reduced to just content—something endlessly scraped, remixed, and reassembled by systems that have no understanding of what they mean. As someone who loves cinema, who believes in the power of human creativity, who cares about the world my son is growing up in—this is about more than just legal protections.


AI is not going away. The technology will keep evolving. But that doesn’t mean we don’t have a say in what happens next.


© 2024 Kelsey Farish
