Signsight Interactive Hub

A spatial learning tool for sign language and gesture-based communication

Signsight Interactive Hub is a gesture-based learning and communication platform designed for families navigating both spoken and signed languages. It extends the original Signsight system into a more interactive, educational experience—breaking down sign language into visual, trackable components through real-time motion feedback. Built for the home, this tool helps users explore gestures, learn sign language together, and visualize communication in shared space.

Interaction Format:

Use hand gestures and body movement
Practice signs with live feedback
Designed for shared learning between Deaf and hearing family members

Core Technology:

Powered by MediaPipe, it tracks your handshape, direction, movement, and facial expression in real time—breaking signs down into clear, visual parts.

Reframing the Learning Experience:

“Most sign language apps feel like dictionaries. They show you signs, but don’t teach you the grammar or flow—so learning feels shallow and hard to retain.”

Process / Research / Prototypes

Research

Smart devices help with convenience, but the real barrier is language imbalance. To truly address it, we need to focus on sign language itself.

If spoken language is like a line, sign language is a cube—a 3D language composed of five components.

Sign language relies on five components working together, and its syntax is different from spoken language—often using a time-topic-comment structure. This makes it harder for hearing people to learn.

View Full Research Archive | Thesis Blog | MediaPipe

Key Quotation

“Most apps feel like dictionaries—they show signs, but not grammar or flow. Key issues include inconsistent visuals, poor movement-handshape links, and no real-time expression capture.”
- Yuyin Liang, Professor in the Special Education Department and Secretary-General of the Chongqing Deaf Association

Process/Prototypes

How can we build shared understanding through sign language?

Especially—how can learners grasp its spatial grammar through interactive, visual, and immersive experiences?

The project began as a standalone mobile app for learning sign language through interactive gestures. But as the Signsight system evolved into a smart home assistant with spatial awareness and projection capabilities, we saw an opportunity:

What if sign language learning didn’t live in a screen—but in space?

We shifted the concept from an app to a projection-based interactive hub, designed to run alongside the Signsight Product System. This allowed us to create a shared, ambient learning experience—where families could see, move, and learn together in real time.


Control Through Sign and Touch

With projection as the interface and gesture tracking built in, we added something new: use signs to control the system.
Adjust settings with your phone—or your hands.

MediaPipe Breakdown

MediaPipe doesn’t just track gestures—it breaks them down into exactly the five components that form the foundation of sign language: Handshape, Orientation, Movement, Location, Facial Expression (non-manual signals)
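To illustrate the breakdown: MediaPipe Hands reports 21 (x, y, z) landmarks per hand, and components like handshape and location can be derived from that landmark list in post-processing. The sketch below shows two of the five components extracted this way; the thresholds and labels are illustrative assumptions, not Signsight's actual logic.

```python
# Sketch: deriving sign components from MediaPipe-style hand landmarks.
# MediaPipe Hands reports 21 landmarks per hand as normalized (x, y, z).
# Indices follow the official landmark map: 0 = wrist, 4/8/12/16/20 = fingertips.
# Thresholds and labels here are illustrative, not Signsight's real logic.

WRIST = 0
FINGERTIPS = {"thumb": 4, "index": 8, "middle": 12, "ring": 16, "pinky": 20}
KNUCKLES = {"thumb": 2, "index": 5, "middle": 9, "ring": 13, "pinky": 17}

def extended_fingers(landmarks):
    """Return fingers whose tip is notably farther from the wrist than its knuckle."""
    wx, wy, _ = landmarks[WRIST]
    def dist(i):
        x, y, _ = landmarks[i]
        return ((x - wx) ** 2 + (y - wy) ** 2) ** 0.5
    return [f for f in FINGERTIPS
            if dist(FINGERTIPS[f]) > dist(KNUCKLES[f]) * 1.3]

def classify_components(landmarks):
    """Map raw landmarks to a coarse handshape + location label."""
    fingers = extended_fingers(landmarks)
    # Location: which third of the frame the wrist sits in (y grows downward).
    wrist_y = landmarks[WRIST][1]
    location = "upper" if wrist_y < 0.33 else "middle" if wrist_y < 0.66 else "lower"
    return {"handshape": fingers, "location": location}
```

Orientation, movement, and facial expression would be derived similarly: orientation from the palm plane, movement from landmark deltas across frames, and expression from the separate face-landmark model.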

Use Case 1: Deconstruct Learning

Activate the auto-tracking system and type in “I love you,” and it will show you an example. You can play, pause, and see which components form the sign. For a deeper understanding, click the button to see detailed breakdowns of each gesture. This is how frame learning works.
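The play/pause breakdown can be modeled as a timeline of per-component keyframes. The sketch below (illustrative names and data, not the shipped code) stores a sign as an ordered list of steps, each annotating the five components, so a learner can step through the breakdown frame by frame.

```python
# Sketch: a sign stored as an ordered list of component keyframes,
# so a learner can play, pause, and step through the breakdown.
# The keyframe data for "I LOVE YOU" (the ASL ILY sign) is illustrative.

from dataclasses import dataclass

@dataclass
class Keyframe:
    handshape: str
    orientation: str
    movement: str
    location: str
    expression: str

@dataclass
class SignPlayer:
    name: str
    frames: list
    cursor: int = 0

    def step(self):
        """Return the current keyframe and advance; hold on the last frame."""
        frame = self.frames[self.cursor]
        if self.cursor < len(self.frames) - 1:
            self.cursor += 1
        return frame

ily = SignPlayer("I LOVE YOU", [
    Keyframe("ILY (thumb+index+pinky)", "palm out", "hold", "chest height", "neutral"),
    Keyframe("ILY (thumb+index+pinky)", "palm out", "small shake", "chest height", "smile"),
])
```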

Use Case 2: Reordering Language Syntax

Let’s explore how comprehensive learning operates.
This module addresses the difference between sign language and spoken language word order, helping you structure your signs correctly.

For instance, if you type “I went to the library yesterday,” the app will break it down into individual signs and reorder them according to sign language syntax. You can tap each sign to see how it’s performed and then replay them together. The app also provides hints, like “fingerspell A” or “move from mouth to ear.”
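The reordering step above can be sketched as a rule that pulls time words to the front, then the topic, then the comment. The gloss dictionary and the "last content word is the topic" heuristic below are toy assumptions for illustration; real sign language syntax is much richer.

```python
# Sketch: reorder an English sentence into a time-topic-comment gloss order.
# The word lists, glosses, and topic heuristic are toy assumptions.

TIME_WORDS = {"yesterday", "today", "tomorrow"}
STOP_WORDS = {"to", "the", "a", "an"}
GLOSS = {"i": "I", "went": "GO", "library": "LIBRARY", "yesterday": "YESTERDAY"}

def to_sign_order(sentence):
    """Return glosses in time-topic-comment order: time words first, the topic
    (taken as the last remaining content word) next, then the comment."""
    words = [w for w in sentence.lower().rstrip(".").split()
             if w not in STOP_WORDS]
    time = [w for w in words if w in TIME_WORDS]
    rest = [w for w in words if w not in TIME_WORDS]
    topic, comment = ([rest[-1]], rest[:-1]) if rest else ([], [])
    return [GLOSS.get(w, w.upper()) for w in time + topic + comment]
```

With this toy rule, "I went to the library yesterday" reorders to the glosses YESTERDAY LIBRARY I GO, matching the time-topic-comment structure described in the research section.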
