
Kiwaapamin e-kana-waapamiyan: “I See You Looking at Me”

A Socially Assistive Robot Co-Design for Indigenous Language Revitalization and Reclamation
By: Mykelle Pacquing
May 26, 2025

Two rez dogs in Bearskin Lake First Nation, a fly-in community approximately 1400 km northwest of Toronto, on October 5, 2023. Photo by Mykelle Pacquing. 

As a Filipino-Canadian born and raised in Waatoopikonk (Etobicoke), I had the privilege of being surrounded by a diverse, multilingual city. Hearing French, Spanish, Portuguese, Polish, Ukrainian, Chinese and Tagalog being spoken in school, among friends, and in stores was normal. But none of these languages are Indigenous to Canada; why was I not hearing any languages Indigenous to Canada? During my undergraduate studies, I learned the answer—and it broke my heart. Since then, I have been on a beautiful journey to learn Anishinaabemowin, help others learn the language, and bring the stunning ancient world of the Great Lakes Region—Anishinaabe ahki—to life, through the language.

ᒪᔾᑫᓬ  ᐸᑭᐣᐠ  ᓂᐣᑎᔑᓂᐦᑲᐢ᙮  ᐊᐧᑐᐱᑯᐣᐠ  ᓂᑐᐣᒋ᙮  ᑲᐦᑭᓇ  ᓂᔓᒥᓴᐠ  ᐱᓬᐃᐱᐣᐢ  ᐅᐣᒋᐊᐧᐠ᙮  ᑕᑲᓬᐅᐠ  ᐁᑲᐧ ᐃᓬᐅᑲᓄ  ᓂᑕᔭ᙮

Mykelle Pacquing nintishinihkaas. Waatoopikonk nitooncii. Kahkina nishoomisak Philippines oonciiwak. Tagalog ekwa Ilocano nitayaa.

My name is Mykelle Pacquing. I am from a place of alder trees (Etobicoke). All of my ancestors are from the Philippines. I am of Tagalog (People of the River) and Ilocano (People of the Bay) descent.

During my time as a TA and RA to the late Alex McKay from Kitchenuhmaykoosib Inniwug First Nation at the University of Toronto, one of the projects I worked on, in 2019, was researching the potential of AI in revitalizing the Anishinaabemowin language. After some initial research, I decided that a socially assistive robot design would be the most practical research question to pursue, and I put it into my application to the ComCult doctoral program. Later that year, Alex McKay joined his ancestors, and I gained admission into the ComCult program.

Since then, I have been traveling to remote fly-in communities in Northwestern Ontario for my Indigenous language revitalization consultancy work—learning, experiencing, and understanding the challenges and resiliencies in Anishinaabe communities. Internet? Hope the clouds don’t cover the satellite receivers, or that Doug Ford doesn’t cancel the Starlink contracts. Groceries? Hope you brought them with you, or pay two to three times the price. Tap water? Drink it and get ready to be sick. Need a trip to the ER? That’s at least a one-hour plane ride—if weather allows the flight at all. All the while, you are expected to revitalize an Indigenous language that has been wracked by residential schools and the Indian Act, and that is not officially recognized by either the federal or provincial governments. And yet, Elders and language speakers still show up in their communities, determined to ensure that future generations understand that their languages provide them the knowledge to maintain sustainable relationships with the land, water and Creation. As a Canadian, living on Anishinaabe land, I feel it is my duty to go through the challenges with them—and help them with their languages.

Socially assistive robots can be found in schools, hospitals, restaurants, malls, and more (Breazeal et al., 2008). The research field is interdisciplinary, not claimed solely by engineering or computer science; it encourages drawing on whichever disciplines a given robot's success depends upon. In my case, Indigenous Studies, Linguistics, Education, and Sociology play prominent roles in my dissertation.

Employing a Two-Eyed AI framework (Bourgeois-Doyle, 2019) and biskaabiiyang methodology (Geniusz, 2009, pp. 9–11), I have conducted 11 interviews with individuals from the Anishinaabemowin community and 11 interviews with individuals from the social robotics research community. I asked the Anishinaabemowin community members about their perceptions of, experiences with, and positions towards robots, AI, and Indigenous data sovereignty. I asked the social robotics experts for their thoughts on using a social robot for Indigenous language revitalization, what I would have to consider in designing one, and how to protect and maintain Indigenous data sovereignty.

Anishinaabemowin participants were supportive of the use of a robot and AI for the language. Popular culture played a prominent role in their perceptions of robots, particularly Star Wars (“good robots”) and the Terminator series (“bad robots”). When it came to Indigenous data sovereignty, however, positions split: openness towards sharing language data on one side, and restrictions on language data on the other. Overall, they were excited and willing to hear about a potential new tool to reclaim and revitalize their Indigenous languages.

The social robotics participants worked on a wide variety of social robots. They shared their experiences and challenges in their research, including the potential and limitations of social robots. A few participants had direct experience with social robots in Indigenous language revitalization. Their biggest challenge was the lack of available Indigenous language data needed for fully automated human–robot interactions that are both accurate and effective. However, they noted that Indigenous children generally viewed social robots positively and were excited to learn their languages with the robots—alongside their human teachers.

As I begin the analysis phase of my dissertation, there are several interconnected questions to consider:

  • What features of the robot will I be including in the design?
  • Will I use a pre-existing robot platform like NAO (Ahmad et al., 2017), or create a new one from scratch?
  • If I create a new one from scratch, what will the robot look like? Can it be made with accessible, affordable, and durable materials like hitchBOT (Smith & Zeller, 2017)?
  • Rather than replacing human teachers, how will the robot enhance the intergenerational transmission relationship between Elders and language learners?

At this stage, what I do know is that, to protect Indigenous data sovereignty, this robot must be closed off from big tech generative AI such as ChatGPT (“bad robot”). Not only is Indigenous data sovereignty lost when language data is fed into those systems; those systems can also hallucinate Indigenous-language responses, providing the means for further misinformation and harm to Indigenous communities (Chaka, 2024). The robot design must respect Indigenous data sovereignty, reflect Anishinaabe culture, and ensure that Indigenous control can be exerted upon it to carry out its intended purposes for its communities (“good robot”).

About the Author: Mykelle Pacquing (ᐄᐧᐣ, siya, he/him) is an Anishinaabe-trained Anishinaabemowin language and culture teacher. He is currently a Sessional Lecturer in Indigenous Studies at the University of Victoria and a research assistant for the Canada-UK Socially Assistive Robot research project at The Catalyst. Mykelle is currently researching a socially assistive Indigenous language robot design to assist the People of Turtle Island in their Indigenous language reclamation efforts.

Insights & Ideas is a ComCult blog series showcasing the research and expertise of ComCult students. Designed to engage a broad audience, the series features op-ed-style posts that connect academic insights to real-world issues, making complex ideas accessible and relevant. Each entry highlights the unique perspectives and innovative thinking within the ComCult program. We invite you to explore more stories that amplify research and inspire ideas!

References

Ahmad, M. I., Mubin, O., & Orlando, J. (2017). Adaptive social robot for sustaining social engagement during long-term children–robot interaction. International Journal of Human–Computer Interaction, 33(12), 943–962.

Bourgeois-Doyle, D. (2019, March). Two-eyed AI: A reflection on artificial intelligence. The Canadian Commission for UNESCO’s IdeaLab. https://en.ccunesco.ca/-/media/Files/Unesco/Resources/2019/03/TwoEyedArtificialIntelligence.pdf

Breazeal, C., Takanishi, A., & Kobayashi, T. (2008). Social robots that interact with people. In Springer handbook of robotics (pp. 1349–1369). Springer.

Chaka, C. (2024). Currently available GenAI-powered large language models and low-resource languages: Any offerings? Wait until you see. International Journal of Learning, Teaching and Educational Research, 23(12), 148–173.

Geniusz, W. M. (2009). Our knowledge is not primitive: Decolonizing botanical Anishinaabe teachings. Syracuse University Press.

Smith, D. H., & Zeller, F. (2017). The death and lives of hitchBOT: The design and implementation of a hitchhiking robot. Leonardo, 50(1), 77–78.