The European Parliament has approved the world’s first comprehensive regulations for the development and use of artificial intelligence. The AI Act marks a turning point for how the technology is used by all industries in the European Union going forward, including fashion.
The AI Act serves to “protect fundamental rights, democracy, the rule of law, and environmental sustainability from high-risk AI”, all while ensuring innovation remains rife in Europe, the EU Parliament said in a statement on Wednesday. “The AI Act is a starting point for a new model of governance built around technology,” said co-rapporteur Dragos Tudorache.
The AI Act was initially proposed by European lawmakers in 2021 but gained traction over the past year due to the rapid advancement of powerful large language models, such as OpenAI’s ChatGPT and Google’s Gemini chatbot. In December 2023, European lawmakers reached a provisional agreement on the bill, which the Internal Market and Civil Liberties committees voted to endorse in February 2024.
Some tech industry leaders have voiced concerns about stifling innovation. In June 2023, EU leaders were presented with an open letter signed by over 160 executives from tech companies around the world, warning that the proposed rules amounted to heavy-handed regulation and would create both liability risks and high compliance costs for companies developing the technology. In November, a coalition of businesses and tech companies signed another letter stressing similar concerns that overly stringent regulation could drive innovation out of the region across a range of industries.
In an attempt to assuage some of these concerns, the European Commission in January launched a package of measures to support EU startups and SMEs in the development of “trustworthy” AI. This includes the formation of an AI Office within the commission to foster uptake and innovation. (Other governments, including the US and the UK, have included provisions for fostering innovation under the guidance, and with the participation, of regulators.)
The EU Parliament argues that the AI Act’s risk-based approach offers a framework for the responsible development of the tech. Businesses can leverage this framework to build trustworthy AI tools, lawmakers say, with clear communication about data usage and consumer consent at the forefront.
While this legislation applies only to companies operating in the EU, its implications are likely to create a broader ripple effect across global brands and retailers, similar to how the EU’s General Data Protection Regulation (GDPR) prompted a wider change in strategy for brand data policies. While the US has not enacted similar AI legislation, President Biden’s administration has issued an executive order that calls on the federal government to address concerns while fostering US-led innovation.
The AI Act could “potentially raise awareness and instil confidence in the responsible use of AI technology across a broad spectrum of participants within the fashion industry; from designers to manufacturers, retailers, e-commerce platforms, fashion influencers, celebrities, and consumers”, says Hong Shi, counsel at law firm Haynes Boone Hong Kong and co-chair of the firm’s AI practice. “As AI becomes increasingly integrated into various aspects of their operations, the Act is likely to stimulate discussions about ethical AI usage across the various players.”
What brands need to know
The AI Act measures risk in four distinct tiers: unacceptable, high, limited and minimal. Anything at the level of “unacceptable” risk is effectively banned, according to the EU. This includes “all AI systems considered a clear threat to the safety, livelihoods and rights of people”, such as biometric surveillance tools; emotion recognition in the workplace and schools; social scoring; and predictive policing. Bans will start coming into effect from November.
High-risk AI systems include technology used in employment and the general management of workers. “Hiring decisions that involve the use of AI tech will be subject to strict obligations,” says James Brown, partner at Haynes Boone UK. This will also include CV-sorting systems and other software used for similar purposes, which Brown says will need to be risk assessed with “mitigation systems in place with high levels of robustness, security, and accuracy”.
“Limited risk” includes using AI for marketing, design, manufacturing, retailing processes, trend analysis and personalisation. “Minimal risk” includes AI-enabled video games and spam filters. The more general-purpose AI rules will apply from March 2025.
There are several implications for fashion, experts say. Brands and retailers that use AI to provide personalised recommendations or styling advice will need to review how that is disclosed to the customer. Clear labels will also need to be applied to any AI-generated content. “It is likely we will see websites, emails, advertisements and other material created containing a disclaimer specifying that the content observed, or platform being utilised, has been developed with AI becoming the norm,” says Brown.
This might encourage fashion brands to “thoroughly understand” the AI tools they are using and document how they are being deployed, says Agatha Liu, partner at law firm Duane Morris, who focuses on intellectual property. “Fashion brands need to strive to offer hard facts and sufficient details to customers, while still appealing to the senses.” She recommends a customer-centric approach: considering what customers would want to know about how recommendations are being made.
Data transparency and eradicating bias
The Act also mandates disclosure of the training data used to develop AI models. For example, if a brand leverages AI to predict upcoming colour trends, the legislation compels it to disclose to regulators the data used to train the AI, including the historical data sets, consumer preferences and social media trends that inform its predictions. This transparency not only informs users, but could also shed light on biases within the data that might skew design choices or marketing strategies. (The Act defines “artificial intelligence” as “a machine-based system that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments”.)
As many brands use external software-as-a-service (SaaS) providers for some of these capabilities, the burden of following these guidelines might be shared with external tech providers.
Brands will need to translate the steps they are already taking on inclusion into their AI models, says Dominique Leipzig, privacy and cyber partner at law firm Mayer Brown. This would give brands the ability to actively address biases, ensuring their AI is inclusive and reflects diverse consumer preferences. “This can be done in the form of guard rails (for example, code) added to the AI tool itself that provides the brand’s values for inclusion and fairness,” Leipzig says. “The only way for the company to ensure that its values are included is if it continuously tests for bias in output and inputs every minute of every second of every day by dropping the code into the tool.”
The path to responsible AI adoption won’t be a solitary journey for fashion brands; collaboration with technology providers will be crucial, experts say. As AI becomes a mainstay in industry practices, brands will need to partner with AI developers who prioritise responsible data sourcing, transparent algorithms and ethical practices, which should ultimately lead to industry-wide discussions establishing best practice.
