What happens when digital tools meet holistic learning principles?
We know that generative AI can offer speed and access to vast amounts of information – but learning rooted in place and story moves at a different rhythm. As generative AI becomes increasingly present in classrooms, we must consider how to honour Indigenous ways of knowing alongside these emerging technologies.
Why First Peoples Principles of Learning matters in AI integration
The First Peoples Principles of Learning (FPPL) are not an add-on to our curriculum – they are a worldview that centres on connection, responsibility, and balance. When we use AI in education, that use must align with these values, including reciprocity and reflection.
Generative AI tools reflect the biases present in their training data, which means many of them lack Indigenous perspectives and reinforce colonial narratives. Using generative AI without a critical lens can lead to cultural appropriation, misinformation, or the erasure of Indigenous voices.
Ways to embed FPPL into AI tasks
- Use AI to explore themes – not to tell or create cultural stories
  - AI should never generate sacred stories or community-held stories
  - Instead, use it to explore themes or ideas that align with the FPPL, such as respect, interconnectedness, or place
- Treat knowledge as a relationship – not just information
  - Frame AI as a tool, not a truth-teller
  - Emphasize reflection: how does this output connect to lived experience or community values?
- Practice critiquing AI through FPPL lenses
  - Ask: Whose knowledge is this? What is missing? Is it shared respectfully?
  - Critiquing AI can itself deepen student understanding of the FPPL
Examples of FPPL-aligned prompts to use with students
- What is missing from an Indigenous lens?
- How can this AI-supported response show respect for memory and story?
- Use AI to generate ideas about stewardship, then reflect on how those ideas align with teachings from your land or family.
- What would AI get wrong if it tried to explain who you are?
Encourage student agency in exploring the limitations and impacts of generative AI. Students might examine the bias in training data and the erasure of Indigenous perspectives; how to decolonize tech spaces through critique and creative re-imagination; or the idea of story as relational rather than transactional, and how that shift in perspective changes how we use and interact with AI.
Technology and land-based thinking can co-exist
AI does not need to be rejected outright in the classroom, but it does need to be reshaped through Indigenous-informed frameworks. It is essential to use AI in a way that honours story, place, and people.
When we slow down and reflect, we can bridge digital tools like AI with ancestral ways of knowing.