A shared sensory system for mixed-language families
SignSight is a smart home communication system that enables visual environmental feedback through gesture and voice recognition. Designed for Deaf and hearing family members, it bridges communication gaps without changing anyone's native way of interaction. Through projection and lighting feedback, the system translates presence, events, and intent into a shared language of light and motion.
Recycled ABS plastic (housing)
Acrylic lens (projector module)
Brushed aluminum armature (camera rotation ring)
Built with MediaPipe, this system tracks hand gestures and body movements in real time, enabling seamless, non-verbal interaction through sign language.
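As a rough illustration of this tracking pipeline, the sketch below runs MediaPipe's hand-landmark solution on a webcam feed and prints wrist positions. It is a minimal example only: the actual sign-language recognition model, camera setup, and confidence thresholds used in SignSight are not shown, and the values here are assumptions.

```python
# Minimal sketch: real-time hand-landmark tracking with MediaPipe.
# Thresholds and the webcam index are illustrative assumptions.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def track_hands():
    cap = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=2,
                        min_detection_confidence=0.6,
                        min_tracking_confidence=0.5) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB frames; OpenCV captures BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                for hand in results.multi_hand_landmarks:
                    # 21 normalized (x, y, z) landmarks per detected hand.
                    wrist = hand.landmark[mp_hands.HandLandmark.WRIST]
                    print(f"wrist at ({wrist.x:.2f}, {wrist.y:.2f})")
            cv2.imshow("SignSight preview", frame)
            if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
                break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    track_hands()
```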
What if homes could speak without sound? In Deaf and hearing families, communication is often limited by sensory mismatch. Instead of designing another translation tool, SignSight reimagines home interaction as a shared sensory space: a world where presence is seen, gestures are heard, and moments are felt.
This section documents the evolution of SignSight, from early interviews and field observations to ideation sketches and working prototypes. You’ll find how user insights shaped the system’s sensory interaction, and how each iteration responded to real communication challenges in mixed-language households.
View Full Research Archive · Thesis Blog · MediaPipe
“In mixed-language families, the core issue is the imbalance in language proficiency, which reduces communication. While technology can aid information exchange, face-to-face emotional conversations, especially between parents and children, remain difficult and limited.”
“The water keeps running. The door stays open. By the time someone notices, the moment's already gone.
In mixed-language families, it's not just communication that's delayed — it's awareness.”
How Might We Create Shared Awareness at Home?
Can communication happen without the need for active operation—like air, flowing naturally through a shared space?
Why SignSight?
In mixed-language families, traditional alerts often disrupt more than they connect. I designed SignSight—a projection and vision-based system—to enable quiet, ambient communication at home. It creates shared awareness through light and movement, without intrusive prompts or voice commands. The goal is to build a sense of connection that feels natural, responsive, and non-invasive.
A modular support stand designed for independent camera operation.
Provides rotation, elevation, and power support in homes without wall mounts.
The central sensing device equipped with gesture tracking and audio recognition.
It captures sign language, spoken language, and contextual cues for projection.
The Smart Light Node functions not as a translator but as a signal: a visual call between rooms, alerting family members that something has happened.
"Since projections can’t cover every corner of the room, the smart bulb provides visual cues through light signals to indicate events."
Product features
Usage scenario video
Wake-up method
Unlike voice-based devices like Siri, this one uses a clap to activate motion capture and send data. It also responds to voice input: you can say “Sight” to wake it up.
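A minimal sketch of the clap-based wake-up idea is shown below: it watches short microphone frames for a sharp amplitude spike before handing off to gesture capture. The threshold and frame length are assumed values, and the “Sight” voice wake word would require a separate keyword-spotting model not shown here.

```python
# Sketch of clap detection via an amplitude spike in short audio frames.
# Sample rate, frame length, and threshold are assumptions to tune per mic.
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 16000
FRAME_SECONDS = 0.05
CLAP_THRESHOLD = 0.4  # peak amplitude in [-1, 1] audio (assumed)

def wait_for_clap() -> bool:
    frames = int(SAMPLE_RATE * FRAME_SECONDS)
    with sd.InputStream(samplerate=SAMPLE_RATE, channels=1) as stream:
        while True:
            audio, _overflowed = stream.read(frames)
            if np.max(np.abs(audio)) > CLAP_THRESHOLD:
                return True  # hand off to gesture capture

if wait_for_clap():
    print("Clap detected: starting motion capture")
```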