CopyCat: Sign Language Game

January 2016 - Present


Advisors/Collaborators: Thad Starner

Notes:
  1. [In progress] Vishwanath et al., "CopyCat: Learning through Signing," IEEE Pervasive Computing, 2018. [A paper draft is available upon request; please contact me for more details.]

Ninety percent of deaf children are born to hearing parents who do not know sign language or have low proficiency in it. Unlike hearing children of English-speaking parents or deaf children of signing parents, these children often lack the serendipitous access to language at home that is necessary for developing linguistic skills during the “critical period” of language development. Often, their only exposure to language is through signing at school. CopyCat is a game that uses our sign language recognition system to augment early classroom teaching and help young deaf children develop American Sign Language (ASL) skills.

CopyCat is designed both as a platform for collecting gesture data for our ASL recognition system and as a practical application that helps deaf children acquire language skills while they play the game. The system uses a Microsoft Kinect camera to capture ASL gestures and translates them into text using the HTK speech recognition toolkit.
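
As a rough illustration of how such a pipeline can turn Kinect skeleton frames into the observation sequences an HMM recognizer consumes, the sketch below computes simple hand-relative-to-head features with frame-to-frame deltas. The joint names, normalization, and feature choices are illustrative assumptions, not the project's exact feature set.

    import numpy as np

    def frame_features(skeleton):
        """Turn one Kinect skeleton frame into a feature vector.

        `skeleton` is assumed to be a dict of joint name -> (x, y, z)
        position in meters; the joint names here are illustrative.
        """
        head = np.array(skeleton["head"])
        left = np.array(skeleton["hand_left"])
        right = np.array(skeleton["hand_right"])
        torso = np.array(skeleton["spine_mid"])

        # Normalize by the torso-to-head distance so the features are
        # roughly invariant to the child's size and distance from the camera.
        scale = np.linalg.norm(head - torso) or 1.0
        return np.concatenate([(left - head) / scale,
                               (right - head) / scale])

    def sequence_features(frames):
        """Stack per-frame features and append frame-to-frame deltas,
        producing the observation sequence fed to the HMMs."""
        feats = np.stack([frame_features(f) for f in frames])
        deltas = np.diff(feats, axis=0, prepend=feats[:1])
        return np.hstack([feats, deltas])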

Over the past year, I developed the complete gesture recognition pipeline and adapted HTK to sign language input. We also modified the traditional recognition pipeline to verify the child's signing against the phrase the game expects, rather than performing unconstrained recognition, which significantly improved the game's accuracy.
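
The following is a minimal sketch of that verification idea: score the attempt only against the phrases the game could have prompted and accept when the expected phrase wins by a margin, instead of decoding over an open vocabulary. It uses hmmlearn's GaussianHMM purely as a stand-in for HTK models; the whole-phrase model granularity, per-frame normalization, and margin are assumptions, not the project's actual configuration.

    import numpy as np
    from hmmlearn.hmm import GaussianHMM  # stand-in for HTK HMMs

    def verify_attempt(obs, expected, phrase_models, margin=0.0):
        """Verification instead of open recognition.

        `obs` is an (n_frames, n_features) observation sequence,
        `phrase_models` maps phrase -> a trained GaussianHMM (one
        whole-phrase model per game prompt). Accept only if the
        expected phrase scores best by `margin` log-likelihood
        per frame; all names and defaults are illustrative.
        """
        scores = {p: m.score(obs) / len(obs)
                  for p, m in phrase_models.items()}
        expected_score = scores[expected]
        best_other = max(s for p, s in scores.items() if p != expected)
        return expected_score - best_other >= margin

Restricting the hypothesis space this way trades open-ended recognition for a much easier accept/reject decision, which is what made the game's feedback more reliable.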

Tasks Performed:
  1. Gesture and Pattern Recognition using HMMs
  2. Segmented Boosting of HMMs on ASL Dataset
  3. Supervised Learning with Temporal Data
  4. Data Visualization
  5. User Studies and System Evaluations with Children
  6. Video Game Design