When Aishwarya Lahariya studied fiber chemistry at university, she learned there was a standardized way to process cotton. But when she co-founded artisanal fashion brand Jiwya and started working with artisans across India, the drawbacks of this singular approach, which relies on significant amounts of water and chemical inputs, quickly became apparent.
The method used by the artisans was both quicker and less resource-intensive than what she had been taught. They skipped scouring (the stage at which waxes and oils are removed with hot water and cleaning agents) and bleaching (chemical whitening) — both considered unnecessary when working with natural dyes, as Jiwya does. “A lot of Jiwya’s water savings happen because we do not do those two steps,” she says. And yet, throughout Lahariya’s formal education, this alternative process was never mentioned.
Much Indigenous knowledge is undocumented within prescribed Western channels and requires meaningful outreach and enquiry to discover. Now, with the rise of AI and its increasing prominence in research, information gathering, and decision-making, the bias toward Western ideals is being magnified, and fashion faces the risk of further detaching from crucial Indigenous and traditional knowledge as it outsources more processes to the technology.
“AI cannot replace the lived human experience,” says Virginia Keesee, senior director of global fashion and nature initiatives at Conservation International. “Indigenous peoples and local communities are a huge part of the fashion value chain as stewards of nature, biodiversity, and climate. Partnership with them and support of them is critically important, not only for people, but for the future of our planet.”
Fashion has a history of sidelining Indigenous peoples. In 2022, Textile Exchange reported that only 5% of 252 fashion and textile companies surveyed said they were consulting with Indigenous peoples and local communities as part of developing their sustainability strategies. To combat this lack of engagement, Conservation International, Textile Exchange, and Kering collaborated on a guide to Indigenous partnership principles in 2024. The idea was to protect Indigenous communities from exploitation at the hands of fashion, which in the past has included encroaching on their land, depleting the biodiversity they rely on, and copying their traditional textiles and designs without due credit or compensation. It was also intended to promote the incorporation of Indigenous knowledge systems into sustainability strategies, which can range from wild rubber tapping that maintains tree health over time, to natural dyes derived from cassava bark, to land preservation.
The enmeshment of AI in fashion production and design — from personalized ads, virtual fittings and campaign imagery, to demand planning, supply chain transparency and nature-based solutions — only further complicates matters: fashion queries are much more likely to be informed by data from US or European research bodies, industry standards or brands, than Indigenous knowledge. Trained on data produced by people, AI absorbs — and amplifies — the human biases present in said data, heavily favoring dominant Western perspectives.
When I asked ChatGPT for a list of experts in cotton and water stewardship, the result consisted entirely of Western academics and climate NGOs. Another prompt, asking where its water savings data was sourced, returned the response: “The training data is not evenly distributed globally. Indigenous, local, or unpublished farmer knowledge is under-represented.” (OpenAI, the US developer of ChatGPT, did not provide comment in time for publication.)
It’s not as simple as inviting traditional communities and Indigenous peoples to the table, either. Many don’t want their knowledge exploited by AI even if asked to participate. But if underlying biases are left unaddressed, they have the potential to undercut progress toward sustainability as well as diversity and inclusion.
Who does AI benefit?
Taylor Sparklingeyes is a senior data sovereignty specialist (which relates to the collection, ownership, and use of data) for environmental and community development consulting firm Shared Value Solutions, and a member of Goodfish Lake First Nation, part of Treaty 6 territory in Canada. After members of the Indigenous communities she works with began asking what AI is and whether they should be using it, Sparklingeyes enrolled in the Indigenous Pathfinders in AI program run by Montreal AI research institute Mila, designed to empower First Nations, Inuit, and Métis participants to learn Indigenous-centered approaches to engaging with AI.
Sparklingeyes warns that the speed at which the technology is moving (it is the fastest-spreading technology in human history) risks safety, security, and privacy being overlooked in Indigenous communities. “That’s one thing working with Indigenous communities, if you want to be a true ally: sometimes you have to not worry about time and expectations. It takes a long time to build those trusted relationships, and they should be the foundation of this work, whether it’s around the co-design of governance, of data, or what impact these systems will have on communities,” she says.
Some experts worry that AI’s bias is not only present, but intentional. Deepak Varuvel Dennison, an AI researcher and PhD student at Cornell University, argues that AI platforms have a direct economic incentive to prioritize knowledge that reflects the majority of their paying user base rather than niche or underrepresented subjects. Reaffirming user biases is more likely to keep people on the platform, because their beliefs go unchallenged, and a user base concentrated in the Global North further fuels the “silicon gaze” and marginalizes Indigenous knowledge. “What’s economically valuable to the people in power gets promoted and [what isn’t] gets delegitimized,” Dennison says.
Reckoning with access
Complicating Indigenous representation in AI is a bigger question of whether or not traditional communities want the technology to have access to their data and insights. For many creators in the Global North, this is the first time they’ve reckoned with how their data is used and how to claw back ownership. For Indigenous communities, however, the fight for data sovereignty is nothing new.
“[Indigenous communities] all have unique experiences when it comes to the historical harms of knowledge and data extraction,” says Sparklingeyes, noting that many communities don’t even have access to data about them, because it is often extracted by force or under unfair and misleading terms. This data can span from maps to art pieces, some of which may have been used to train AI if it is present online, in scientific journals, or on government databases, all of which are mined for training. It is likely, however, to be removed from its original context and presented in Western papers and research materials, as freely accessible English language research from high-income countries is over-represented, according to a ChatGPT context note on source materials.
To ensure any attempted rebalancing of the AI landscape is not rooted in the inequitable extraction of data, Indigenous-led non-profit Earth Daughters advocates for the establishment of safeguards. “When we refer to safeguards, we mean community‑defined protections such as free, prior, and informed consent, Indigenous governance over data and knowledge, fair compensation, and the genuine right to refuse participation,” the Earth Daughters team tells Vogue Business over email. “These safeguards must exist before engagement begins, and cannot be reduced to technical or checklist‑based solutions.”
In practice, this might mean Indigenous communities deny fashion or technology companies access to their data. Intellectual property lawyer Monica Boţa Moisin founded the Cultural Intellectual Property Rights Initiative (CIPRI) in 2018 to support the recognition of cultural intellectual property rights related to traditional garments, designs, and manufacturing techniques.
In 2019, the Oma — an ethnic minority group located in northern Laos — accused a high-end Italian fashion brand of selling clothes featuring facsimiles of their traditional designs. CIPRI, in partnership with the Traditional Arts and Ethnology Center, supported them in creating a digital database to protect their traditional knowledge and cultural expressions, while taking control of how they are accessed and commercialized. When a researcher approached them about using the dataset to train an AI system their institution was developing to avoid the risk of cultural misappropriation in fashion design, the Oma and their support team were able to thoroughly assess the request.
Ultimately, the Oma declined the request, concluding that the benefits to the community were insufficient to warrant access. Once the data was used for training, it could also dissuade the fashion industry from further direct outreach and engagement. “Yes, you cannot escape the use of technology. But it’s also important [to ask], is this something that can be beneficial to the Oma? Do they have the necessary infrastructure to be able to benefit from this as well? And how?” says Boţa Moisin.
“Free, prior, and informed consent is iterative. It’s not just a singular engagement with Indigenous peoples. You have to be constantly in partnership with these communities, updating them and informing them,” says Quinn Manson Buchwald, director of the Indigenous and Traditional Peoples program at Conservation International, and an enrolled citizen of the Little Shell Tribe of Chippewa Indians of Montana and the Manitoba Métis Federation. One-time access to data simply doesn’t meet these terms.
A refusal to participate in AI training at any stage shouldn’t be seen as a barrier to progress, says the Earth Daughters team, but an expression of sovereignty and care. “Rather than asking whether AI is inherently good or bad, we focus on who controls it, who benefits, and who is exposed to harm.” Likewise, Sparklingeyes is cautious about simply feeding Indigenous knowledge into centralized tools. “Being approached [by an institution] saying, ‘We have this system, help us by uploading your data,’ that’s where the imbalance will always remain,” she explains. “They need to dial it back to the co-design phase to really understand if it’s what the communities want.”
An Indigenous-centered approach
Indigenous erasure on AI platforms, models, and tools is an extension of Indigenous erasure throughout society, so education programs are often needed to bring Indigenous communities into the fold. “So often within Indigenous cases, we’re playing catch-up for years and years of being excluded from spaces,” says Lynnsey Chartrand, head of Indigenous initiatives at Mila, which runs the Pathfinders program, first launched in 2024. “What’s exciting about AI is that, for once, there’s an opportunity for Indigenous voices to get in from the ground up, and to really influence and shape the space as it evolves.”
One of the projects created by Pathfinders is Green Circle, an AI-based tool integrating traditional agricultural knowledge with climate and soil data to offer tailored guidance for crop selection, planting, and trading. This could be useful for brands working with natural fibers, says Chartrand, who is also a citizen of the Manitoba Métis Federation, a federally recognized government for Indigenous Red River Métis in Canada. “One of the big things that struck me after year one, and continues to surprise me, is the power of giving Indigenous talent the time, resources, tools, and creative freedom to think about how AI might benefit their communities, as well as showcasing the power of having this technology developed by us and not just for us. The projects are undertaken with a level of care that I think would be impossible to replicate if it was being done by someone non-Indigenous.”
Although there is a risk that the responsibility for leveling the playing field could fall on the shoulders of Indigenous peoples, Chartrand has hope. “I think there are good actors out there who are non-Indigenous, who are stepping up to be allies,” she says.
The possibilities for more balanced and equitable AI will continue to increase as the capacity to store data safely on locally governed systems builds; as grassroots Indigenous participation and advocacy grows; as Indigenous-led frameworks emerge; and as underrepresented cultures and voices challenge systemic biases. But realizing these possibilities will require continuous and evolving efforts from the fashion industry, alongside some challenging introspection on rights to access, benefits, and purpose.
“Whenever I ask myself how we can make these AI models more representative, the immediate question that comes to me is, what’s the point of it? Is it so a corporation in the US can make an ad that is more appropriate sounding for an audience in India? Who is it going to serve?” says Dennison. “That’s the value question I’m trying to ask.”
Clarification: This article was updated on March 5, 2026, to clarify the Oma’s decision not to share data with an external researcher.