The idea for deaflingo began at HackKU 2024, where we threw together a Python web app that used a small model trained on 200,000 images to recognize American Sign Language letters. We ended up taking the third-place prize in our themed track. As a pleasant surprise, the demo video we posted hit roughly 200k views on YouTube. After reading many flattering comments and realizing the app's potential, we decided we wanted to build something much bigger.

So we are now pursuing two tracks. First, we will be participating in the NeurIPS 2024 high school research paper submission to showcase a new hybrid model we are currently working on that recognizes full sign language, far beyond alphabetical gestures. There will be more to this, so be on the lookout for updates. Second, we want to make this a full-fledged app. This demo site was the easiest and most accessible way to share our idea, but we would like to bring it to the Android/iOS app stores as a fully functioning sign language learning app that uses computer vision and a deep learning model to help you learn sign language in varying languages.

This is where we need a little aid. If we want to grow, hosting and development costs are a real issue, and we are looking to crowdfund the datasets needed to support multiple languages. Sure, some datasets exist already, but to get where we want to go, we have to look for bigger and better ones. So feel free to check out the Kickstarter and maybe leave a tip (no requirement though, enjoy the demo site)!