As fashion’s reliance on AI deepens, from design and forecasting to digital clienteling and chatbots, new risks are emerging. The same systems that promise efficiency and personalisation are also expanding the industry’s exposure to cyber attacks and to costly, misguided inventory forecasting.
McKinsey estimates that generative AI could add up to $275 billion in operating profits to apparel, fashion and luxury within the next three years, but that potential comes at a cost. Nearly three-quarters of organisations say they experienced a confirmed AI-related breach in 2024, up from 67 per cent the year prior, underscoring the urgency of securing AI systems, according to HiddenLayer’s second annual ‘AI Threat Landscape’ report. Nearly half (45 per cent) of those affected chose not to disclose the incident, citing fears of reputational damage, highlighting the growing tension between transparency and trust in the age of intelligent systems.
Fashion is no exception: its embrace of AI means innovation may be moving faster than governance. In the rush to stay ahead, brands often treat emerging technology like a trend. But this mindset risks obscuring both the real potential of AI tools and the stakes involved, says Dan Gardner, co-founder of technology-first creative agency Code and Theory. Vulnerabilities once limited to payment systems or e-commerce platforms now extend into algorithms, data sets and consumer interfaces, making cybersecurity a strategic priority for fashion rather than just an IT issue.
“Fashion brands can fall into the trap of treating technology the same way they treat seasonal trends, like it’s something to make a splash with instead of building lasting value,” says Gardner.
Rather, brands should approach AI initiatives with the same strategic rigour and long-term investment mindset they would apply to a new product line or market expansion. Real innovation in next-gen fashion tech demands secure infrastructure, commitment and patience — not just a headline-making moment.
Stakes and vulnerabilities
Fashion trades on intangible capital — image, trust, cultural relevance. Small misjudgements, whether in forecasting or merchandising, can unravel millions in accrued equity. Simultaneously, scrutiny from regulators, investors and consumers is tightening.
“Fashion companies are particularly exposed because they’re not banks — they’re not staffed or regulated for this level of digital risk,” says Anthony Lupo, chair of ArentFox Schiff, a law firm that focuses on fashion, entertainment and tech. “They often don’t have the budget or the technical infrastructure to protect themselves the way financial institutions do.”
Gina Bibby, corporate partner and head of the global fashion tech practice at international law firm Withers, points to the opacity of third-party “black-box” systems as a legal headache: brands often cannot see how a vendor’s AI models operate, which complicates compliance. “Vendor AI systems may store data on servers that are not maintained or controlled by the brand, which means the data may be vulnerable to hacking beyond the brand’s control or visibility,” she explains.
AI vendor relationships also introduce dependency, with fashion companies locking themselves into AI systems that may not be interoperable, says Lupo. “Some vendors are offering deep discounts in the first year, but once you’re in, you’re stuck — and the costs skyrocket later.”
Though traditional threats like ransomware and third-party breaches persist, as evidenced by recent incidents affecting Victoria’s Secret, Marks & Spencer, Dior and Pandora, AI introduces unique vulnerabilities: adversarial attacks on models, data poisoning, exposure of customer information via chatbots or augmented reality tools, and potential theft of proprietary algorithms.
“AI will take traditional attacks — like ransomware, domain hijacking, and supply chain disruptions — and elevate them to a much more sophisticated level,” says Lupo. “I’m very nervous for companies that are still in what I call ‘stage one’, the early awareness stage of AI experimentation.”
Adversarial attacks subtly alter input data to trick systems into producing incorrect output — manipulating sizing or recommendations, for example. Data poisoning injects flawed or misleading data into training sets, degrading model performance or biasing recommendations.
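To make those two mechanisms concrete, the sketch below uses a deliberately toy linear sizing model in Python; the features, weights, thresholds and data are illustrative assumptions, not any brand’s actual system.

```python
# Minimal, hypothetical sketch of the two attack classes described above, using a
# toy linear sizing model. Features, weights and thresholds are illustrative
# assumptions, not any brand's real system.
import numpy as np

rng = np.random.default_rng(0)

# Toy model: score a customer from (height cm, weight kg); a threshold picks the size.
weights = np.array([0.04, 0.06])
bias = -10.72

def size_score(x: np.ndarray) -> float:
    return float(x @ weights + bias)

def recommend(x: np.ndarray) -> str:
    return "L" if size_score(x) >= 0 else "M"

customer = np.array([170.0, 65.0])
print("clean recommendation:", recommend(customer))       # "M" (score just below 0)

# Adversarial attack: a small nudge along the model's weight vector flips the output.
epsilon = 0.5
perturbed = customer + epsilon * np.sign(weights)
print("perturbed recommendation:", recommend(perturbed))  # "L", from a 0.5-unit change

# Data poisoning: corrupt a slice of training labels, refit, and compare predictions.
X = rng.normal([170.0, 65.0], [10.0, 12.0], size=(200, 2))
y = X @ weights + bias + rng.normal(0.0, 0.05, 200)        # "true" scores plus noise

poisoned = y.copy()
poisoned[rng.choice(200, size=20, replace=False)] += 2.0   # push 10% of labels upward

A = np.c_[X, np.ones(200)]                                 # fit [w1, w2, bias] by least squares
clean_fit, *_ = np.linalg.lstsq(A, y, rcond=None)
dirty_fit, *_ = np.linalg.lstsq(A, poisoned, rcond=None)
x1 = np.r_[customer, 1.0]
print("clean model score:", round(x1 @ clean_fit, 2))      # ~ -0.02
print("poisoned model score:", round(x1 @ dirty_fit, 2))   # shifted upward
```

The point of the sketch is the pattern, not the numbers: a perturbation too small for a shopper to notice flips the recommendation, and a modest slice of corrupted labels nudges the refitted model’s predictions far enough to change what it recommends.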
“As amazing as these AI systems are, they’re also going to make it easier for bad actors to perpetrate fraud and gain access to systems in ways they never could before,” says Lupo. “The only real chance companies have to protect themselves is to use AI to combat AI.”
Brands must embed AI-specific safeguards, rigorous governance and adversarial testing, or risk compromising both operational efficiency and consumer trust. Explainable AI tools provide transparency into model decisions; secure infrastructure protects proprietary algorithms and customer information; and strict oversight of continuous learning keeps harmful inputs out of training data. Robust incident response plans tailored to AI breaches support swift recovery and help preserve consumer confidence.
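As one way to picture what oversight of continuous learning can look like in practice, here is a hypothetical Python sketch that screens crowd-sourced fit feedback before it enters a retraining batch; the record fields and thresholds are assumptions for illustration, not an industry standard.

```python
# Hypothetical sketch of one safeguard named above, oversight of continuous learning:
# screen user-submitted fit feedback before it joins the next retraining batch.
# Field names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class FitFeedback:
    item_id: str
    fit_rating: float   # -2 (far too small) .. +2 (far too large)

def screen_batch(records, hist_mean=0.0, hist_std=0.8,
                 z_threshold=3.0, max_share_per_item=0.05):
    """Split incoming feedback into records accepted for retraining
    and records quarantined for human review."""
    accepted, quarantined = [], []
    per_item = {}
    total = max(len(records), 1)
    for rec in records:
        per_item[rec.item_id] = per_item.get(rec.item_id, 0) + 1
        z = abs(rec.fit_rating - hist_mean) / hist_std
        flooded = per_item[rec.item_id] / total > max_share_per_item
        # Extreme ratings and floods of feedback on a single item are two common
        # signatures of a poisoning attempt; hold them back rather than train on them.
        if z > z_threshold or flooded:
            quarantined.append(rec)
        else:
            accepted.append(rec)
    return accepted, quarantined

batch = [FitFeedback("coat-01", 0.5)] + [FitFeedback("dress-07", 2.0)] * 30
ok, held = screen_batch(batch)
print(f"{len(ok)} accepted, {len(held)} quarantined for review")
```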
ArentFox Schiff’s Lupo stresses planning for defence as seriously as offence. “Every brand needs a SWAT-style plan for what happens in a breach — not just IT, but finance, HR and legal,” he notes. “You can’t only focus on AI’s creative or cost-saving potential; the defensive side deserves equal attention.”
Managing AI responsibly
Human oversight remains the simplest, most effective guardrail against AI mismanagement. Designers, merchandisers and curators must remain the final arbiters of taste. Matthew Drinkwater, head of the Fashion Innovation Agency at London College of Fashion, warns that algorithms can reinforce the status quo and miss real trend shifts unless fed diverse cultural signals. “Culture is messy, fluid and subversive; algorithms can struggle to catch that,” he says. Human gatekeepers will be indispensable to preventing echo chambers and preserving the idiosyncrasies that make one brand distinguishable from the next.
For any responsible fashion company, data hygiene and governance are non-negotiables. Brands must invest in representative data sets, provenance tracking, and clear policies on intellectual property and model usage. Shirlene Tsang, co-founder of jewellery brand Objkts, suggests treating an AI roadmap like a collections calendar: clear objectives, staged tests, kill switches and post-mortems.
Thoughtful deployment is crucial. Automation should be applied only where it adds value, and brands must be candid with consumers when AI is involved. Cybersecurity safeguards must include encryption, segmentation of sensitive assets and careful oversight of third-party systems. Given how quickly AI tools and models are evolving, legal counsel is no longer optional. The twin tools of disclosure and generous returns policies will be brands’ best short-term friends while consumers adjust to AI-augmented fashion retail experiences, says Michelle Mancino Marsh, co-leader of ArentFox Schiff’s consumer products group. As regulations evolve, brands must be ready to demonstrate not only that their systems are effective, but that they are fair, explainable and safe.
“I do think we’ll see fashion companies appoint chief AI officers,” says Lupo. “Part of their responsibility will be brand and data protection, but most CEOs will expect them to focus primarily on growth and efficiency — and that imbalance creates risk.”
The central imperative for executives overseeing AI initiatives is simple but urgent: move fast, but deliberately. “I’d argue the hidden cost for executives is moving too slowly and doing only incremental experiments,” says Adam Behrens, co-founder of agentic commerce start-up New Gen. The industry does not need theatre — flashy pilots that produce headlines but no learning. It needs focused investment, rigorous measurement and governance that preserves creative distinction.
“Pandora’s box is open — you can’t avoid using AI,” adds Lupo. “But you can control how you use it. Keep financial and customer data in a sandbox, separate from generative tools, and make sure your systems are insured and auditable.”
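Lupo’s sandbox principle can also be sketched in code: in the hypothetical example below, customer identifiers are tokenised inside the brand’s own boundary before any text is passed to an external generative tool, and the mapping back to real values never leaves it. The regex patterns and the workflow are illustrative assumptions, not a specific vendor’s product.

```python
# Hypothetical sketch of a data sandbox: tokenise customer identifiers before a
# prompt leaves the brand's own systems. Patterns and flow are illustrative only.
import re
import uuid

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
CARD = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def redact(text: str, vault: dict) -> str:
    """Replace emails and card-like numbers with tokens; keep originals in an internal vault."""
    def _swap(match: re.Match) -> str:
        token = f"<redacted:{uuid.uuid4().hex[:8]}>"
        vault[token] = match.group(0)   # the real value stays inside the sandbox
        return token
    return CARD.sub(_swap, EMAIL.sub(_swap, text))

vault: dict[str, str] = {}
prompt = "Draft a reply to jane@example.com about the order she paid with 4111 1111 1111 1111."
safe_prompt = redact(prompt, vault)
print(safe_prompt)   # identifiers are replaced before the prompt reaches any external model
```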
Ali Furman, consumer markets industry leader at PwC, reinforces that managerial mandate. “We advise fashion executives to move quickly and responsibly — with the right governance and risk protocols in place — treating AI as a growth and transformation engine with prioritised initiatives across the value chain,” she says, “rather than isolated experiments or siloed pilots.”