ASL ⭤ English Translation w/ MediaPipe, PointNet, ThreeJS and Embeddings

lulu66

https://reddit.com/link/1d1k7cw/video/qmotjy81mw2d1/player

Hey! I'm Kevin Thomas, a Grade 11 student at Burnaby South Secondary School (also home to British Columbia School for the Deaf)!

Over the last few months, I have been developing a tool that translates between American Sign Language (ASL) and English. Most existing ASL translation tools are built on the misconception that ASL is the same language as English. In effect, they treat Deafness only as a disability, aiming to work around the inability to hear rather than to translate into ASL as a language in its own right.

With guidance from my ASL teacher, I have been working on a project that facilitates this translation while respecting and preserving ASL as the primary language. For ASL reception, I augmented over 100,000 images of the ASL alphabet using Google MediaPipe and trained a PointNet model to classify handshapes fingerspelled by Deaf individuals. For ASL expression, I augmented over 9,000 videos of ASL signs, embedded their corresponding words, and then used ThreeJS to animate the signs for words spoken by hearing individuals. I also used LLMs to improve accuracy and to translate between English and ASL grammar.
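To give a sense of the reception side, here is a minimal sketch (not the actual project code) of a preprocessing step commonly used before a PointNet-style classifier: MediaPipe Hands returns 21 (x, y, z) landmarks per hand, and centering them on the wrist and scaling by the farthest landmark makes the resulting point cloud invariant to where the hand sits in the frame and how large it appears. The function name is hypothetical.

```python
# Hypothetical sketch: normalizing the 21 (x, y, z) hand landmarks that
# MediaPipe Hands returns, before feeding them to a PointNet-style
# classifier as a small point cloud.
import math

def normalize_landmarks(landmarks):
    """landmarks: list of 21 (x, y, z) tuples from MediaPipe Hands."""
    wx, wy, wz = landmarks[0]  # landmark 0 is the wrist in MediaPipe's layout
    centered = [(x - wx, y - wy, z - wz) for x, y, z in landmarks]
    # Scale so the farthest landmark from the wrist sits at distance 1.
    scale = max(math.sqrt(x * x + y * y + z * z) for x, y, z in centered) or 1.0
    return [(x / scale, y / scale, z / scale) for x, y, z in centered]
```

The normalized 21x3 set then serves as the input point cloud; PointNet's shared-MLP-plus-max-pool design is order-invariant, which makes it a natural fit for an unordered landmark set like this.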
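On the expression side, matching a spoken English word to the closest sign in an embedded vocabulary can be sketched as a nearest-neighbor lookup under cosine similarity. This is an illustrative toy, not the author's pipeline: the hand-made two-dimensional vectors below stand in for real embeddings from a pretrained model, and the function names are hypothetical.

```python
# Hypothetical sketch: pick the sign whose word embedding is closest
# (by cosine similarity) to the embedding of the spoken word.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def nearest_sign(query_vec, sign_vocab):
    """sign_vocab: dict mapping a sign's gloss to its embedding vector."""
    return max(sign_vocab, key=lambda gloss: cosine(query_vec, sign_vocab[gloss]))
```

Once the nearest gloss is found, the stored motion data for that sign is what would drive the ThreeJS animation.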

I only started looking into ML/AI over the last few months! I would appreciate any feedback, opportunities, or resources to continue learning and growing! Feel free to reach out to me in Reddit DMs or at kevin.jt2007@gmail.com! Also, liking this LinkedIn post will go a long way 🙏🫶
 