In the world of illustration, AI has very quickly transformed from a future existential risk into a set of products with a tangible impact on the industry’s present.
Its effects range from the legal landscape of image licensing and usage, to the copyright implications inherent in new “illustrator-friendly” AI tools, to the fees that illustrators are able to charge for their work.
And while many practitioners are now embracing the technology in their process in some form or another, others are more concerned than ever that AI poses a significant threat to the future of their craft.
In 2024, a survey of members of the Society of Authors found that 26% of its illustrators had already lost work to AI, and 37% said that illustrated work had decreased in value due to the prevalence of generative AI products.
In early 2025, in response to the UK Government’s open consultation on Copyright and Artificial Intelligence, the Association of Illustrators (AOI) surveyed its membership and found that over 32% of respondents had lost work to AI at an average of £9,262 per affected artist.
A frustrating lack of transparency
It is, says Rachel Hill, CEO of the AOI, difficult to have a nuanced conversation about AI at the moment. But these figures make it impossible to argue that the tech isn’t having a negative impact on the industry.
That doesn’t mean the AOI is inherently anti-AI – you can read its detailed stance on the issue here. It has, though, no plans to allow AI-generated work to be submitted to the World Illustration Awards until there’s a legal framework preventing the exploitation of illustrators’ work in AI training.
Rather, Hill and her team are frustrated by the persistent lack of transparency shown by AI product providers, national governments and legislators around the world, which leaves clients, illustrators and agents all exposed to knotty legal issues.
For a body that seeks to represent, protect and advocate for the rights of the nation’s illustrators, this is making the AOI’s job more important than ever.
Karina Cao’s piece for Digital Frontier
New questions for agents
The same is true for agents representing commercial illustrators, whose whole approach to creating contracts and licensing agreements has been thrown into disarray by ambiguity around AI.
“Adobe has been building AI into a lot of their products,” says Jeremy Wortsman, founder of the Jacky Winter Group, an international illustration agency headquartered in Melbourne with offices in the UK and US.
As a result, “We’ve had clients try to extend illustrated images or add extra animation frames to finished work. These things weren’t within the licenses, and artists are getting really upset by clients mangling their work.”
On the flipside, Wortsman has seen clients respond negatively to artists who have clearly used AI to create preliminary sketches for projects.
“At the same time, there’s a huge amount of interest from clients wanting to commission artists to make custom models for them, just like clients wanted entire asset libraries 10 to 15 years ago.
“But now they want it on the scale of AI, which is essentially saying, ‘We’ll pay you a tonne of money so we can replicate your style as much as possible’.”
This is the same promise on offer from “illustrator-empowering” off-the-shelf AI apps like Tess and Exactly.AI, which allow illustrators to train their own model exclusively on their own work.
In theory, this allows illustrators to generate unlimited images without any ethical concerns, but in fact, the legal issues are more complex.
“Some people think this kind of work is the Holy Grail of branding projects right now,” says Wortsman. “But the legal implications are so dicey, and we’re starting to see lawsuits, like the Anthropic one,” in which the AI company was forced to pay $1.5 billion to settle authors’ claims that it had pirated their work to train chatbots.
For Wortsman, it’s a headache because his role is to make money for his artists.
“We represent artists with different approaches and opinions on AI,” he says. “The challenge of running an agency is that you can’t impose your own morals, and I really want to leave it to our artists to decide. We don’t want to shut down any opportunities.”
One of Charlotte Cripps’ pieces for Digital Frontier
A clear AI policy
To this end, Wortsman has spent the past few weeks co-authoring an agency-wide AI policy to clarify the group’s position to illustrators and clients alike.
It guarantees that all final work will be human-made.
If an illustrator uses AI during any part of the process, it will be discussed with the client beforehand, and the specifics of that usage clearly disclosed.
The agency will consider all AI usage on a case-by-case basis, so as not to close down any work with clients interested in exploring its use.
In return, the agency expects its clients:
To respect the rights of its artists, as outlined in their contracts, and not to use built-in AI functionality in tools to alter or amend work.
Not to use final images for any unauthorised training of AI models, or upload them to platforms where they’re likely to be scraped.
To notify both agency and artist before using any form of AI in the project workflow.
Wortsman expects to amend this policy on a quarterly basis to allow for the rapid pace of change in AI tech.
By the time he makes the next update, the UK government may have finally decided on a course of action following its Copyright and Artificial Intelligence consultation. But Hill doesn’t believe legislators are in a rush to make a decision.
Following on from its survey, the AOI has joined the Creators’ Rights Alliance and the British Copyright Council in pushing back forcefully against the UK government’s proposal to exempt AI development companies from existing copyright law. This would allow them to scrape any online visuals unless creatives opt their work out on an image-by-image basis.
Over 92% of UK illustrators say that opting out of training datasets would seriously harm their business by reducing their discoverability online.
Currently, illustrators rely on publishing large numbers of their images to the web to attract business, making it a logistical impossibility to opt each image out of training.
“The government’s closeness to a few big names in tech is influencing their decision-making,” says Hill. “So creators have really come together in force to oppose that.”
Not all of them, of course.
AI enthusiasm
“I don’t think it’s like the big juggernaut trying to stamp on people’s creative abilities,” says designer, illustrator and art director, Charlotte Cripps. “I think it’s important to embrace it, otherwise you’re just going to be left behind.”
Until recently, Cripps was design director of Digital Frontier, a UK-based platform that explored the intersection between technology, business and society.
Before she landed the role, she had no experience at all with AI, but over the last two years she has honed her skills across Midjourney, Runway, Flora and others to produce illustrations and animations that won the publication a loyal following and various industry awards for its rich visuals.
During that time, Cripps says that AI allowed her to produce imagery in a way that kept pace with a relentless online publishing schedule, while allowing her to be more experimental and tangential with her creative thinking.
One of Charlotte Cripps’ pieces for Digital Frontier
She is, however, keen to stress that none of the illustrations she produced at Digital Frontier were made with AI alone. Rather, AI formed part of her process, with final images developed and finished with human input – and she always commissioned actual illustrators for the print magazine.
“You’ve got to see it as something that can aid you to propel yourself further and set yourself apart from everyone else doing the same trade,” she says. “If you don’t, you’ll find people will be able to generate work much faster than you.”
Now that she has left Digital Frontier to go freelance, she says clients are approaching her specifically because she can amplify her design and illustration work with new AI skills.
“Reassuringly shit”
This has not been the case for another UK-based illustrator, Matt Blease, whose hand-drawn work has recently been used to promote IBM’s latest AI tools.
“I’ve just finished a big campaign for IBM where they used a real illustrator and a real frame-by-frame animation studio to create beautiful animations to advertise their AI products. And that did all feel a bit odd.”
Unlike Cripps, Blease is unconvinced by the value of AI for image-makers. “It’s reassuringly shit,” he says, smiling. “When I’ve been struggling for a concept or something and I’ve gone to ChatGPT, the ideas have just felt so basic.”
As a result, he neither feels threatened by AI – “I’m still insanely busy” – nor compelled to use it to stay competitive. But he is frustrated by the way it’s exacerbating an already messy problem in the industry – plagiarism.
Blease reports seeing his work plagiarised on an increasingly regular basis since the arrival of AI-powered image generation, compounding an already significant industry trend towards designers expecting illustration to be free.
In fact, he readily admits to having been part of the problem in the early days of his career.
“When I was a graphic designer commissioning illustrators, if there wasn’t a budget for imagery, then my senior designers would come to me because they knew that I could draw. I’d adapt my drawing to someone’s signature style and it would go on a billboard somewhere. I was basically AI.”
Wortsman agrees that AI itself is not the issue, but as a tool it amplifies many existing negative practices across the creative industries. And it’s making life harder for many people, who are forced to balance competing pressures.
“If I use ChatGPT now for anything, there’s an environmental cost, there is an ethical cost, but I’m also able to serve my artists better, because I’m able to work faster or communicate better,” he says.
“But the original sin of AI is that there’s no way to use it ethically right now, because any data set that you intersect with is using compromised information. You have no idea where these images have been scraped from, and there’s just no way that you can use it commercially.
“Life now just requires endless trolley problems in every aspect of our life. We always have to ask, ‘What’s the least-worst thing we can do’?”