By External Source
Aug 5 2025 (IPS-Partners)
Artificial Intelligence is changing how we live, learn, work – and who gets heard. It holds promise for humanity but, without safeguards, it risks becoming a new tool of domination. For Indigenous Peoples, the stakes are not abstract – they are ancestral, material, and urgent.
Indigenous knowledge, images, languages and identities are already being used to train AI systems. Much of this is happening without consent, consultation, or benefit-sharing. In 2023, researchers identified over 1,800 AI training datasets containing Indigenous cultural content, most without evidence of Free, Prior and Informed Consent. This is not inclusion – it is extraction in digital form. When AI systems absorb Indigenous content without consent, they replicate colonial logic through code.
The dangers are not only cultural – they are also territorial and environmental. AI requires data centers, rare earth minerals, and immense electricity – often sourced from Indigenous lands. Over 54% of critical mineral projects worldwide are located on or near Indigenous territories. In Chile, AI-optimized lithium mining threatens Atacameño water sources and sacred lands. The environmental costs of AI include toxic e-waste, land degradation, and resource depletion. When built without Indigenous participation, AI becomes a force multiplier for dispossession.
Meanwhile, Indigenous Peoples are excluded from decisions about AI governance, ethics, and policy. They are rarely consulted – yet deeply affected.
But Indigenous Peoples are not passive victims in this story. In New Zealand, Māori-led teams are using AI to revitalize te reo Māori. In the Arctic, Inuit communities use AI to monitor ice patterns and adapt to climate change. In Polynesia, Indigenous reef monitors combine traditional knowledge with machine learning to protect marine ecosystems. These efforts show what AI can become – when rooted in rights, culture, and consent.
Indigenous Peoples have called for digital sovereignty, ethical frameworks, and funding for culturally led innovation. They must be co-creators of AI, not its collateral damage. The future of AI is not just a technological question – it is a question of justice.
On August 8, join the global conversation. Defend rights. Shape futures.