Models gear up for an AI legal battle

Models depend on their appearances to make money, but as AI tools make photo manipulation ever easier, the need for laws and guidelines is coming into sharp focus.
Model Jade McSorley
Photo: Brian Rolfe

This article on models and AI is part of our Vogue Business membership package. To enjoy unlimited access to our weekly Technology Edit, which contains Member-only reporting and analysis and our NFT Tracker, sign up for membership here.

When Taiwanese American model Shereen Wu was cast last minute in a runway show in Los Angeles in October, she agreed to walk for free in exchange for exposure — a compensation arrangement often dangled in front of hopeful talent hungry for experience. The next day, she was surprised to find that designer Michael Costello shared a picture of her on the runway to Instagram, in which her face had been altered past the point of recognition, she says. “It clearly had this eerie quality that is reminiscent of AI-generated images.”

More models are encountering AI-generated or altered images of themselves. Nassia Matsa recognised an uncanny version of herself in an ad for an insurance company she had never worked with. Jade McSorley (pictured at top) was 3D scanned, only realising afterwards that she didn't own the output.

The field of modelling is already overdue for further oversight, says Sara Ziff, founder and executive director of New York-based advocacy group The Model Alliance. Photoshop has been around for a long time, Ziff points out, but the new technologies are “compounding concerns about the exploitation of a workforce that’s extremely vulnerable because it’s lacked baseline protections for a very long time”.

With AI, anyone can now easily create and manipulate model imagery, including generating entirely fictional models (based on images of human ones), dramatically editing existing images and creating 3D avatar versions of human models. Proponents hope that these tools can streamline workflows, decrease waste and foster creativity, but modelling is one of the first industries to sound the alarm on the technology's more sinister side: models' likenesses are being used without their consent and without compensation.

“Models and actors are some of the first to be impacted, because their image is their livelihood,” McSorley says.

Models say that there is currently little recourse if they feel that an image has been inappropriately used or altered. “As of today, there is no legal framework that specifically protects models from being ‘AI-ed’,” says Elizabeth Peyton-Jones, CEO and founder of advocacy organisation Models Trust.

In response, models, agencies and advocacy groups have begun developing guidelines and legislative frameworks to help the industry navigate the arrival of AI. The Model Alliance has added AI-specific amendments to its Fashion Workers Act, which is moving through the New York legislature and includes protections for models. The British Fashion Model Agents Association (BFMA) is also in the process of developing a 10-part code of conduct around the development and use of model avatars. Beyond fashion, AI regulations are in various stages of development in more than 90 countries.

New guidelines for transparency and consent

New York’s Fashion Workers Act, if passed this year, will have national implications. The bill was recently amended to incorporate language that defines a “digital replica” and requires that management companies and brands obtain written consent to create or use a model’s digital replica while detailing the scope, purpose, rate of pay and duration of use. It prohibits them from creating, altering or manipulating a model’s digital replica via AI without clear written consent, and prevents management companies from holding power of attorney over the model’s digital replica.

State senator Brad Hoylman-Sigal, who sponsored the bill, called models “the essential workers fuelling the industry” during a Model Alliance press conference held this week to introduce the changes. “Everyone should have the right to control their own image, especially when their livelihood depends on it.”

Currently, the norm is for modelling agencies to hold blanket power of attorney over models, says model Alex Shanklin; images were being altered even before these new AI tools became available. “As models, we don’t sign off on these things.” It’s not just odd; it can also hurt their careers, as models build portfolios to gain work.

A 3D scan of model Jade McSorley

Models are being increasingly 3D-scanned, with the client or brand ultimately owning the resulting content.

Photo: Callum Toy

In the UK, McSorley is helping shape the code of conduct that is meant to serve as a resource for when a model’s digital scan is captured. It’s being developed by the BFMA, which works with the British Fashion Council as well as many local modelling agencies. The code aims to address transparency around how a digital model is created and used, control over how the model’s image is represented, the need for secure storage of personal data, and the inherent value that digital models provide.

Broader-scale AI regulations reference topics that could in practice apply to the use of AI to create modelling imagery. The EU AI Act — expected to be passed in the coming months and go into effect in 2026 — is likely to take a “light touch approach” to AI regulation, but will acknowledge deepfakes, says Jamie Rowlands, partner at intellectual property lawyers and patent attorneys Haseltine Lake Kempner. Similarly, US President Joe Biden’s AI executive order, which directs federal agencies rather than enacting binding law, calls for new safety standards that prevent deception.

“Creative rights must be top of mind when thinking through adopting generative AI systems,” advises Joy Buolamwini, author of Unmasking AI: My Mission to Protect What is Human in a World of Machines. “Creative rights should uphold the four Cs: consent, compensation, control and credit. Artists should be paid fairly for their valuable content and control whether or how their work is used from the beginning — not as a post-hoc opt-out.”

What models can do

In the UK, there isn’t a protection mechanism for image rights as there is in some jurisdictions, Rowlands says, so protecting image rights can require “an imperfect patchwork of rights to try to get a claim off the ground”, including privacy rights, trademarks, “passing off” (meaning passing off one thing as another) and more. Peyton-Jones adds that data privacy and protection laws such as the General Data Protection Regulation in the EU and the California Consumer Privacy Act may apply, as they “emphasise the importance of consent and right to use when processing the data”. Still, she adds, since tools such as text-to-image generation service DALL-E process data from diverse sources, there’s no way to know the data that was used to produce an image.

Model Shereen Wu

Model Shereen Wu, left, says that when she saw an image of her runway look shared on social media, the face had been edited beyond recognition, seemingly using an AI tool.

Photos: Tommy Lu, courtesy Shereen Wu

Wu says that after contacting Costello, who shared the altered runway image, as well as the photographer and the organisation that produced the show, she is still unclear who altered it; the designer initially offered to compensate her, but has yet to do so, she says. “As an independent model who’s just starting out their career, I was walking for exposure that I ultimately did not get,” which she attributes to the alterations made to her face. (Representatives for Costello and the Art Hearts Foundation, which produced the event, did not respond to requests for comment.)

For now, experts advise awareness and proactive measures, because it can be especially intimidating for models to speak up in the moment on set if, for example, a brand asks them to be 3D scanned, McSorley says. McSorley’s work has primarily been for e-commerce brands, and a number of peers have told her that they have arrived on set to see a 3D scanner without knowing the details about what they are giving away or how it will be used. “I feel like models don’t really have a choice at the moment,” says Robyn Lawley, who is used to seeing her image digitally altered. “You can speak to the company, but you lose jobs as a model, because a model is supposed to be open to looking a different way.”

Buolamwini says that within the last year she has been working alongside her agents to update her contracts to include biometric and AI-derivative protections. “I would highly encourage all fashion models and agencies to adapt contractual language to account for generative AI capabilities.”

Why brands are exposed to risk

The legal recourse for misuse of AI-generated imagery is not yet clear, as legislators call for new rules on how, or if, AI companies license protected IP. But bad press is enough to send a warning shot to brands.

“All of this is inextricably linked to brand value and trust. If you are using an AI-generated model and not paying any fees to the actual model in real life, if that model goes viral or public, companies have to think how this will impact their brand,” says Dominique Shelton Leipzig, a privacy and cybersecurity partner at law firm Mayer Brown. She says a key theme across proposed legislation is that individuals have a right to their own image. She advises that companies whose business models are not deeply reliant on using this type of material or technology tread lightly until laws, lawsuits and new norms play out.

Models acknowledge that there are some potential positives of digitising their likenesses, including the ability to continue to work without being physically present. Already, Eva Herzigová and Kendall Jenner have digitised their appearances, and agencies such as LA-based Photogenics are promoting avatar services for their rosters. “Models are definitely interested in the opportunities,” McSorley says. “If they can work for multiple brands and be on holiday? Absolutely.”

The difference, though, is having a choice. “What we’re looking for is transparency from all parties involved,” Wu says. “What we’re asking for isn’t too difficult.”
