Hackathon Project | UHackathon – University of Washington Tacoma
24 hours (Hackathon)
4 people
Azure Custom Vision, JavaScript, Python, Webcam API, Smart-Home APIs
In this 24-hour hackathon project at UHackathon – University of Washington Tacoma, our team built a smart-home controller driven by American Sign Language (ASL) gestures. The goal was an accessible interface that lets users operate their smart-home devices through ASL hand gestures.
We integrated a real-time gesture interface: a webcam feed was classified by our Azure Custom Vision model into A–Z ASL letters, and JavaScript routed the recognized letters to Python services that invoked the smart-home APIs.
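The routing step above can be sketched as a small dispatch table that maps a recognized letter to a device command. This is a minimal Python sketch, not the project's actual code: the gesture-to-command mapping, the `dispatch_gesture` helper, and the confidence threshold are all illustrative assumptions, and the input dictionary merely mimics the shape of one prediction entry from a Custom Vision response.

```python
# Hypothetical routing layer: map one predicted ASL letter to a smart-home
# command. GESTURE_COMMANDS and dispatch_gesture are illustrative names,
# not identifiers from the project.

GESTURE_COMMANDS = {
    "L": ("lights", "toggle"),
    "F": ("fan", "toggle"),
    "D": ("door", "lock"),
}

CONFIDENCE_THRESHOLD = 0.8  # ignore low-confidence predictions


def dispatch_gesture(prediction: dict):
    """Map a prediction to a (device, action) pair, or None.

    `prediction` mimics one entry of a Custom Vision response,
    e.g. {"tagName": "L", "probability": 0.93}.
    """
    if prediction["probability"] < CONFIDENCE_THRESHOLD:
        return None  # too uncertain to act on
    return GESTURE_COMMANDS.get(prediction["tagName"])


# Example: a confident "L" gesture toggles the lights.
print(dispatch_gesture({"tagName": "L", "probability": 0.93}))
# → ('lights', 'toggle')
```

Thresholding on the model's reported probability keeps spurious frames from triggering devices, which matters when classifying a live webcam stream many times per second.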
This project demonstrated the potential of combining computer vision, machine learning, and accessibility design to create innovative smart-home solutions. It showcased how ASL can be used as a natural interface for controlling technology, making smart homes more accessible to the deaf and hard-of-hearing community.