This weekend at its annual conference, Adobe debuted Project Primrose, a sparkly, silver interactive dress that uses generative AI to change its fabric pattern, colours and style while on the wearer. Videos of a model (actually one of the engineers on the project) wearing the dress as its rows of modular tiles shift to reflect new designs in real time went viral on social media after the presentation. It was the most notable display in recent memory of how technology could change the way we dress. So why does “smart” fashion feel so stuck?
“It just seems gimmicky, where all the budget has gone to the launch instead of the actual technology,” says Samantha Taylor, founder of more sustainable sportswear company The Good Factory, who first posted her reaction to the dress on LinkedIn. “Fashion is supposed to inspire, akin to art and self-expression. But this was a sequin dress in grey with basic patterns.”
For the Adobe team that made the dress — Gavin Miller, VP of Adobe research, research scientist Christine Dierk and emerging devices group lead TJ Rhodes — the focus was on the launch, and, admittedly, the dress is certainly not poised to scale at the moment. “I’ll be honest, we don’t have a plan,” Miller says. “We had a plan to show it at [Adobe] Max and to listen to the reaction — and then to make a plan. We have a plan for a plan.”
“We’re certainly not there yet,” he adds. “This was a very first low-hanging fruit,” designed to display what the technology could be.
Smart fashion appears to be stuck in a cyclical loop of small-scale experiments that sputter out, failing to launch beyond initial demos that draw attention online, whether at tech conferences, on red carpets or on runways. In 2016, Levi’s partnered with Google’s smart fabric initiative, Project Jacquard, on interactive denim jackets woven with conductive fibres. In 2019, Saint Laurent partnered with the same programme on its Cit-e Backpack, which let wearers control music, drop pins on the go and take photos via gestures. Also in 2019, Loomia debuted the Loomia H1, a heated wool jacket powered by a two-hour battery and washable by hand. Buzz around smart textiles has been relatively quiet since. In March 2023, 9to5Google reported that Google was quietly shutting down the Jacquard app. By April, users were no longer able to use their Jacquard accessories.
According to the release, the premise of Adobe’s Project Primrose is to create a “wearable, adaptable, and flexible dress” that content creators and designers can use in the design process. The team emphasises that it’s a prototype, a first edition. Project Primrose uses non-emissive textiles that can display content created with Adobe applications. Designs can be static or animated, controlled by a button, and embedded sensors let the dress respond to the wearer’s movements. Designers might, for example, use Adobe Firefly to generate imagery from text prompts using generative AI.
In the works since before the pandemic, the dress is made up of what the team calls “petals”: panels of PDLC (polymer-dispersed liquid crystal) smart glass that can switch between transparent and frosted, Miller explains, giving the appearance of a colour change, though it’s limited to two shades of silver. More colours are top of mind. “It’s one of those future directions that, given the success of this, we’d love to pursue,” Miller says, flagging that, while technologically possible, it’s likely a multi-year effort.
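In rough terms, the control problem resembles driving a very coarse two-tone display: each petal can only be frosted or transparent, so a design has to be reduced to on/off states and stepped through frame by frame as the button or sensors trigger changes. The sketch below is a purely hypothetical illustration of that idea in Python; the petal layout, frame format and trigger logic are assumptions for explanation only, not Adobe’s actual implementation.

```python
# Hypothetical illustration only: driving a grid of two-state PDLC "petals"
# from an animated design. Not based on Adobe's Project Primrose code.
from typing import List

DesignFrame = List[List[float]]  # greyscale design values, 0.0-1.0
PetalStates = List[List[int]]    # 1 = frosted (reflective), 0 = transparent


def frame_to_petal_states(frame: DesignFrame, threshold: float = 0.5) -> PetalStates:
    # Each petal has only two states, so a greyscale design frame is
    # thresholded into frosted-or-transparent before being displayed.
    return [[1 if px >= threshold else 0 for px in row] for row in frame]


def advance(frame_count: int, current: int, motion_detected: bool) -> int:
    # Step to the next animation frame when a (simulated) movement sensor
    # fires; otherwise hold the current frame.
    return (current + 1) % frame_count if motion_detected else current


if __name__ == "__main__":
    # Two toy 2x3 designs: a checkerboard pattern and its inverse.
    designs: List[DesignFrame] = [
        [[0.9, 0.1, 0.9], [0.1, 0.9, 0.1]],
        [[0.1, 0.9, 0.1], [0.9, 0.1, 0.9]],
    ]
    frames = [frame_to_petal_states(d) for d in designs]
    current = 0
    for motion in (False, True, True):  # simulated sensor readings
        current = advance(len(frames), current, motion)
        print(f"motion={motion} -> petal states: {frames[current]}")
```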
Underlying its foray into fashion is a bid to introduce what Miller calls a “new canvas” for Adobe software. “Because it didn’t exist, it was necessary to invent it,” he says. “But now we have, we’re hoping that the industry embraces this at scale so that we can do our main task, which is building great design tools and software and content.” Will the industry buy in?
Beyond marketing?
Some online commenters were impressed with the tech but called the dress “ugly” or pointed out that the shifting petals only appear on the front, while the back is plain. One wrote: “It’s giving Hunger Games”, in a nod to the outlandish outfits worn by those in the Capitol. Others defended the dress as a proof of concept.
Madison Maxey, founder and chief innovation officer of New York-based Loomia, was impressed with the technology. “I think the dress is beautifully done,” she says. “I’m excited to see an electrical engineer and computer scientist at the reins of this prototype. I was impressed with how natural and smooth the animations were.”
Albert Ayal, who runs the account @UpNextDesigner on Instagram that spotlights emerging designers, was also impressed. “I think for bigger fashion companies, it would be a good — and smart — investment,” he says. “Designers will be able to cut down the design process, which consists of multiple drafts and a lot of back and forth. It will help with the creative process and also give more freedom to experiment with different designs and ideas.”
Taylor, however, is unconvinced that it’ll be good for designers. “The sequins are large, which makes it difficult to create patterns of interest and detail,” she says. “The fashion industry is already struggling with price and getting people to pay for something. Are we really likely to see designers being paid fairly for each new pattern they make? Like most commercialised inventions, it will probably dilute the income of the designer pretty quickly.”
She also flags sustainability and usability concerns. “If they’d done something using conductive yarns, that would have been interesting, but they decided to go with something that will catch on jewellery, loose threads, zips on over garments, etc.,” she says. “It’s lacking a repair element, the longevity of it looking new. All the things we know fashion needs to do in order to reduce its impact.”
Whether or not the dress can go beyond a marketing moment comes down to economics, Maxey says: can it be mass-produced at an affordable price? Can it be used reliably? Will consumers adopt it? It’s too soon to say, she says.
For the Adobe team, the strength of the reaction has pushed them to pursue two streams of work: figuring out a practical version that people can actually use, and developing colour capabilities. But from the sound of it, both are a way off. “It’s very, very early days,” Miller flags.
And if it can’t scale? “Projects like this are crucial for inspiring people and driving vision,” Maxey says. “I think it’s also OK if it’s just a social media moment that reminds everyone how cool these materials can be.”