Research has shown that approximately 95% of children who are deaf or hard of hearing are born to hearing parents. The lack of a common communication tool between parents and child in the early years can have serious consequences for literacy achievement and academic success later on.
We are developing an interactive mobile application for learning American Sign Language (ASL). This app is intended to foster early language acquisition and communication for young deaf children and their families. The app, called SmartSign 2.0, is designed to be an interactive gaming application that provides a friendly interface for toddlers, parents, and service providers to learn essential and functional signs in ASL. SmartSign 2.0 is distinct from existing ASL-teaching mobile and Web applications because it will provide immediate and appropriate feedback to the user based on machine learning and pattern recognition technologies.
This interdisciplinary project allows us to work collaboratively with undergraduate students in computing and deaf education. Creating the prototype of the app involves activities such as identifying the most commonly used ASL signs and phrases, creating graphics and animations for the selected signs and phrases, programming the app using a game engine, and programming functions that track the user's activity to examine their learning behavior and progress. The computing student works with Dr. Chuan to program the app, while the student in deaf education works with Dr. Guardino to survey best practices in using touch-screen apps to teach ASL to children at different ages. Everyone on the team works together to iteratively improve the prototype by conducting experiments to observe how children and parents interact with the app.
Faculty Project Leader
Dr. Caroline Guardino