Russ Fisher-Ives was showcasing NAVCaR at the Saturday event. Photo by Valley Daily Post
Multi-Lingual Robot Used To Teach Native Students
Among the student-built robots showcased at a robotics event Saturday at Northern New Mexico College (see earlier story HERE) was one small but particularly clever robot named NAVCaR, which attended but did not compete against the others. NAVCaR, which stands for “Native American Voice Controlled Automated Robot,” was built with a microphone so it can listen to spoken commands. What makes this robot so clever is that it can understand and obey commands spoken in three languages, all from Northern New Mexico pueblos and tribes: two dialects of Keres, spoken by southern pueblos, and Diné, the language of the Navajo Nation.
NAVCaR. Photo by Valley Daily Post
Speakers of these languages can give NAVCaR simple commands. More importantly, students from those communities can design and program the robot in their communities’ native languages. For Native American communities facing the possible loss of their indigenous languages, this little robot offers a new tool for teaching modern engineering and programming in the students’ home dialect.
NAVCaR was created by a team of engineers and educators in Albuquerque who work with RoboRAVE and a sister organization, Inquiry Facilitators. Inquiry Facilitators is a non-profit working to “Enhance science, technology, engineering, and mathematics education through academic competitions, support of student projects, and professional development for teachers.”
Fisher-Ives, a former physics teacher who spearheaded the creation of RoboRAVE more than 10 years ago, serves as Executive Director of Inquiry Facilitators. He said the team is currently working on a fourth language for NAVCaR and has recently begun discussions with Ohkay Owingeh about adding Tewa to the mix. To add a new language, Fisher-Ives explained, native speakers are asked to record specific words multiple times. The robot’s designers in Albuquerque then use an algorithm to identify the patterns in each spoken word, so the computer in the robot can recognize the command no matter who is speaking it. Fisher-Ives said it takes over two hours to program each word in a language, but the impact on the students is priceless.
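The process Fisher-Ives describes — recording each word several times, then using an algorithm to find the patterns common to every recording — resembles classic template-matching keyword recognition. The sketch below is purely illustrative (NAVCaR’s actual software is not published here, and all names and data are invented): several feature vectors for the same word are averaged into a template, and a new utterance is matched to the nearest template.

```python
# Hypothetical sketch of template-based spoken-command recognition.
# Real systems extract acoustic features (such as MFCCs) from audio;
# here short lists of numbers stand in for those features.
from math import dist  # Euclidean distance between two points (Python 3.8+)

def build_template(recordings):
    """Average several recordings of one word into a single template."""
    n = len(recordings)
    return [sum(values) / n for values in zip(*recordings)]

def recognize(utterance, templates):
    """Return the command whose template is closest to the utterance."""
    return min(templates, key=lambda word: dist(utterance, templates[word]))

# Invented feature vectors: three recordings each of two commands.
templates = {
    "forward": build_template([[0.9, 0.1, 0.2], [1.0, 0.0, 0.3], [0.8, 0.2, 0.1]]),
    "stop":    build_template([[0.1, 0.9, 0.8], [0.2, 1.0, 0.7], [0.0, 0.8, 0.9]]),
}

print(recognize([0.85, 0.15, 0.2], templates))  # → forward
```

Averaging over multiple speakers’ recordings is what lets the template capture the word itself rather than any one voice, which matches the goal of recognizing a command “no matter who is speaking it.”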
See this LINK for more details about this amazing little multilingual robot.