Columbia GSAPP
Xinyu Jiao
Computational Design Practices: Colloquium 2 Fall 2025
A visual-to-touch wearable + app system for blind and low-vision users, inspired by Traditional Chinese Medicine.
How can visual information be translated into meridian-based vibration patterns (with minimal voice cues) on the wrist to help blind and low-vision people sense space and atmosphere through their body?
Moving & Navigation
Helps you move safely. The vibration feels like a smooth stream on your wrist when the path is clear. If there is an obstacle, the vibration "stutters" to warn you.
Feeling the Atmosphere
Helps you feel the "vibe" of a room. A quiet, empty space feels like slow breathing patterns. A busy, crowded place feels like rough, sandy textures.
Point & Identify
Helps you find specific objects. Just point your hand like a laser. The wrist gives a sharp "click" vibration when it locks onto an object (like a sign or door), and then reads out the name.
Meridian Ribbon is a wearable + app system for blind and low-vision users that turns visual information into sound and vibration on the wrist. The phone app uses the camera to read the environment and interpret simple states—like street vs. corridor, quiet vs. crowded, calm vs. tense. These states are translated into short sound cues and meridian-inspired vibration patterns, arranged along the wrist as a continuous tactile "ribbon." Rather than looking at a screen, users can orient themselves, sense atmosphere, and reflect on their day through hearing and touch. The project treats Traditional Chinese Medicine meridians not as medicine, but as a body-centered interface language for organizing haptic feedback.
Acupoints like Neiguan (PC6) sit in areas of high nerve density. Mapping vibrations to these points supports clearer feedback and higher tactile acuity.
The brain tracks continuous lines more easily than scattered dots. Meridians act as a natural "highway" for information, reducing cognitive load.
In TCM, stimulating the Pericardium Meridian is associated with calming the heart. This makes the device an emotional regulator as well as a navigation tool.
"I can't form a map of the room — everything feels scattered."
Blind users shared that their biggest challenge is not the lack of information, but the lack of structure. They need spatial feedback that is continuous, directional, and quiet, not fragmented descriptions.
"Meridians are the body's natural pathways for sensing direction."
A TCM practitioner explained that the wrist and forearm meridians behave like an embedded spatial coordinate system. Sequential tactile stimulation along these lines creates a clear sense of flow and direction, even without vision.
Blind users need structured, directional, silent spatial cues — and meridian pathways offer a natural framework for delivering them.
Input
The camera acts as a sensor. It calculates movement and crowd density from the video feed.
Logic
Software maps this data onto the meridian lines of the arm, giving the signals a logical spatial order.
Material
I will test materials like Silicone (soft) and Plastic (hard). Materials act like filters—they change how the vibration travels on skin.
Output
The final result is a Tactile Language: Smooth waves for safety, sharp clicks for objects.
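The Input → Logic → Output pipeline above can be sketched in code. This is a minimal, hypothetical illustration, not the actual app: the metric names (`movement`, `crowd_density`), thresholds, and the four-motor layout are all assumptions chosen for clarity.

```python
# Hypothetical sketch of the Input -> Logic -> Output pipeline.
# Assumes the app already extracts two per-frame camera metrics,
# both normalized to 0-1: optical-flow magnitude ("movement")
# and person density in the frame ("crowd_density").

def classify_scene(movement: float, crowd_density: float) -> str:
    """Collapse raw camera metrics into a simple scene state."""
    if crowd_density > 0.6:
        return "crowded"
    if movement > 0.5:
        return "dynamic"
    return "calm"

# Each state maps to a vibration pattern laid out along the wrist
# "ribbon": (motor_index, intensity 0-1, duration_ms) triplets,
# ordered so sequential motors fire along the meridian line.
PATTERNS = {
    "calm":    [(i, 0.3, 400) for i in range(4)],     # slow breathing wave
    "dynamic": [(i, 0.6, 150) for i in range(4)],     # faster smooth stream
    "crowded": [(i % 4, 0.9, 60) for i in range(8)],  # rough, sandy bursts
}

def haptic_sequence(movement: float, crowd_density: float):
    """Full pipeline: camera metrics in, motor sequence out."""
    return PATTERNS[classify_scene(movement, crowd_density)]
```

The key design choice sketched here is that the state space stays tiny (three scene labels) so each state can own a distinct, learnable tactile signature.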
My project employs experimental, participatory, and speculative methods to explore new forms of sensory interaction.
First, it is experimental because I use AI to translate visual data—like movement and crowd density—into vibration patterns. I am testing how these digital signals can be felt on the skin to create a new "tactile language" for navigation.
Second, it is participatory because I work directly with blind and low-vision users. Instead of just guessing what they need, I co-design with them. I learn from their real-world feedback to find which tactile patterns feel most natural and intuitive.
Finally, the work is speculative and gently decolonial. It imagines a future where our bodies, not screens, are the main interface. By drawing on Traditional Chinese Medicine (TCM) concepts, I explore non-Western ways of connecting digital information with the human body, challenging how we normally build technology.
A shape-changing table that reacts to human gestures, making data physical through actuator grids and computational mapping.
Demonstrates how physical interfaces translate abstract data into tangible experiences. My project focuses on wearable devices rather than table-scale installations.
View Project →
A chatbot app with adaptive, emotionally supportive voice/text interactions using machine learning for tone adaptation.
Shows how AI provides emotional support through voice. My project focuses specifically on blind users' needs with transparent, ethical design.
View Project →
A wearable system using flexible pressure sensor arrays to detect pulse at three TCM positions (Chi, Cun, Guan), creating 3D pulse maps similar to doctors' fingertip sensations.
My project reverses this: instead of measuring the body, I design a way for the body to feel the world—turning sound into vibration along meridian points.
View Research →
A wearable haptic audio system that translates sound into body sensations through full-spectrum vibration, used by musicians and Deaf/hard-of-hearing users.
Demonstrates vibroacoustic communication. My project builds on this by focusing on spatial perception for blind users, not just music appreciation.
View Project →
A non-invasive TCM practice where small beads are placed on ear points to stimulate meridian pathways, used for stress, sleep, and pain relief.
Shows how TCM energy points can be applied to wearable devices. My project adapts this to wrist meridian points using vibration motors.
Learn More →
A platform using haptic technology and vibration patterns for therapeutic applications, supporting mental health and emotional well-being.
Demonstrates haptic technology for therapeutic support. My project combines this with spatial perception, aligning with the CARE mode for emotional support.
View Project →
Scenario: Walking & Commuting.
Clear Path: The vibration feels like a smooth water stream flowing down your arm. It means "Go ahead."
Obstacle: The vibration stutters or flows backward. It means "Stop, something is blocking the way."
Scenario: Standing still & observing.
Busy/Crowded: The wrist feels rough textures (like sandpaper). It means the place is chaotic.
Quiet/Empty: The wrist feels slow, breathing waves. It means the space is calm and open.
Scenario: Pointing at objects.
Action: You point your hand like a laser at a sign or object.
Feedback: The wrist gives a sharp "Click" (haptic impulse) when it locks onto a target, and the earbud reads the name (e.g., "Starbucks").
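The three scenario cues above can be expressed as distinct motor sequences. This is a hedged sketch under the same assumptions as before (four motors indexed 0→3 down the arm, triplets of motor index, intensity 0 to 1, and duration in ms); the function names and default values are illustrative, not the shipped patterns.

```python
def stream(motors: int = 4, intensity: float = 0.4, step_ms: int = 200):
    """Clear path: a smooth wave flowing down the arm (motor 0 -> 3)."""
    return [(m, intensity, step_ms) for m in range(motors)]

def stutter(motors: int = 4, intensity: float = 0.8, step_ms: int = 80):
    """Obstacle: the wave reversed and broken into short on/off bursts."""
    seq = []
    for m in reversed(range(motors)):
        seq.append((m, intensity, step_ms))
        seq.append((m, 0.0, step_ms))  # silent gap = the "stutter"
    return seq

def click(motor: int = 0, intensity: float = 1.0):
    """Target lock: one sharp impulse on a single motor."""
    return [(motor, intensity, 30)]
```

Reversing the motor order in `stutter` is what makes the obstacle cue feel like the flow "going backward", so the warning is distinguishable from the all-clear stream even at a glance of attention.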
The mobile app interface translates visual information into intuitive controls. Users can switch between FLOW, SENSE, and FOCUS modes, adjust sensitivity settings, and receive real-time spatial feedback through the wristband.
A live preview of the wrist meridian shows where vibrations will occur. Sliders let users adjust vibration intensity and sensitivity to match their comfort and perception.
Users can switch between FLOW, SENSE, and FOCUS modes. Each mode changes the haptic behavior of the ribbon for navigation, ambient awareness, or precise object focus.
The home screen shows connection status, battery level, and whether the Meridian Ribbon is actively translating visual input into touch.
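The app settings described above (mode switching plus intensity and sensitivity sliders) could be modeled as a small settings object. A minimal sketch, assuming the three mode names from the interface; the class and field names are hypothetical.

```python
from dataclasses import dataclass

VALID_MODES = ("FLOW", "SENSE", "FOCUS")

@dataclass
class RibbonSettings:
    """User-adjustable state mirrored between the app and the wristband."""
    mode: str = "FLOW"        # FLOW = navigation, SENSE = ambience, FOCUS = objects
    intensity: float = 0.5    # vibration strength slider, 0-1
    sensitivity: float = 0.5  # camera/scene sensitivity slider, 0-1

    def set_mode(self, mode: str) -> None:
        if mode not in VALID_MODES:
            raise ValueError(f"unknown mode: {mode}")
        self.mode = mode
```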
A storyboard showing how Meridian Ribbon changes a chaotic commute into a calm journey.
The system combines computer vision, meridian mapping, and tactile feedback—helping blind and low-vision users feel their surroundings through a visual-to-tactile translation loop.
Soft, flexible - Smooth, cushioned - Calming mode
Breathable - Warm, skin-friendly - Long wear
Customizable softness - Adjustable damping - Experimental
Textured surface - Luxurious feel - Sensory exploration
Designed for individuals seeking intuitive spatial awareness beyond traditional cane or audio navigation. Specifically for those who experience auditory fatigue from constant screen-reader usage and want a quieter, more somatic way to sense their surroundings.
For anyone seeking "Digital Detox" or Somatic Mindfulness. In a world of visual overload, Meridian Ribbon offers a new way to connect with the environment through the body—treating the skin as a canvas for information, rooted in the philosophy of TCM and embodied cognition.
Experimental plan for material testing and algorithmic validation.
Develop Type A: Soft, flexible silicone version for maximum comfort.
Develop Type B: Hard 3D-printed shell version for electronics protection.
Goal: Compare durability vs. ergonomics.
Conduct tests with blind and low-vision users to compare vibration patterns.
Variable A: TCM-Mapped Feedback (Structured).
Variable B: Random Feedback (Unstructured).
Goal: Test whether TCM-mapped feedback is more intuitive than random feedback.
Implement a 2-week wear test for continuous data collection.
Establish an AI Feedback Loop: The system learns from user comfort ratings.
Goal: Create a personalized, adaptive algorithm.
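One simple form the planned AI feedback loop could take is an incremental update rule: each comfort rating nudges the vibration intensity toward the user's preference. This is a sketch of the idea only; the feedback encoding (-1 too weak, 0 comfortable, +1 too strong) and step size are assumptions.

```python
def update_intensity(current: float, feedback: int, step: float = 0.05) -> float:
    """One step of a (hypothetical) comfort feedback loop.

    feedback: -1 = too weak, 0 = comfortable, +1 = too strong.
    Returns the new intensity, clamped to the motor's 0-1 range.
    """
    new = current - feedback * step
    return min(1.0, max(0.0, new))
```

Over a two-week wear test, repeated small updates like this would let each wristband converge on a personalized intensity without any explicit calibration session.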
This paper presents a soft wristband that measures pulse at Cun, Guan, and Chi points in TCM, turning subtle pulse changes into digital data. It is a key precedent for the project, showing how meridian-based TCM ideas can be translated into a contemporary wearable sensing system.
Category: Traditional Chinese Medicine | View Research →
inFORM is a shape-changing interface that makes digital data physical through a dynamic pin array. It informs the project's interest in embodied interfaces, where abstract information becomes something users can physically feel and interact with.
Category: Embodied Interfaces | View Project →
This project sonifies complex scientific data so that patterns can be heard and sometimes felt, supporting the idea of sonic–tactile data interfaces. It helps frame how data can be translated into sound structures that blind and low-vision users can interpret.
Category: Data Sonification | View Project →
This study evaluates acoustic touch as a way to support blind users in perceiving their environment. It is directly relevant to sound-based orientation and non-visual interaction goals, showing what works and what challenges remain in acoustic guidance.
Category: Accessibility Research | View Research →
Braille Band proposes a wrist-worn haptic device that communicates information through Braille-like vibration. It supports the exploration of the wrist as an information surface and demonstrates how haptic patterns can become a language for non-visual communication.
Category: Haptic Interfaces | View Research →
SAS Graphics Accelerator turns data visualizations into screen-reader and audio-friendly formats for blind users. It relates to sonic-tactile data work by showing existing approaches to non-visual data access, and highlights where the project extends this into more embodied, haptic forms.
Category: Data Accessibility | View Summary →
Replika is an AI chatbot and voice companion that focuses on emotional support and adaptive responses. It is a critical precedent for ethical and emotional AI voice interface, especially for thinking about trust, intimacy, and the risks of manipulative or opaque emotional AI.
Category: Affective Computing | View Project →
Ear Seeds use small beads placed on ear acupoints to provide gentle, continuous pressure for emotional and physical support. This precedent connects TCM meridian logic to simple, wearable point-based stimulation, informing the use of meridian-inspired points on the wrist.
Category: Traditional Chinese Medicine | Learn More →
This article explains Gua Sha as a TCM scraping practice that follows muscle and meridian lines to improve circulation and release tension. It supports the use of meridian pathways as directional flows for vibration and heat along the arm, translating "energy movement" into interaction design.
Category: Traditional Chinese Medicine | Learn More →
Feel is a wristband that uses biosignals (like skin conductance) to monitor stress and emotional states, paired with an app for mental health support. It provides a strong reference for combining sensing, emotion analysis, and wearable feedback, parallel to AI-assisted sound–emotion mapping.
Category: Wearable Technology | View Project →
SubPac X1 converts low-frequency sound into deep vibration that can be felt on the body. It is a key precedent for the sound-to-touch concept, showing how immersive vibroacoustic feedback can extend listening beyond the ears and onto the skin.
Category: Haptic Audio | View Project →
Source / Provenance:
Real-time audio captured through the user's phone microphone, including speech, ambient noise, and urban sound environments.
Reference Dataset:
UrbanSound Dataset – https://urbansounddataset.weebly.com/
Politics / Limitations: Environmental recordings may capture people without consent. Sound varies across cultures and socioeconomic contexts, meaning AI interpretation may not generalize.
(a) RAVDESS – Emotional Speech & Song Dataset
https://zenodo.org/record/1188976
Used widely for emotion recognition in speech.
Politics: Primarily Western, English-speaking actors → cultural bias in emotional expression.
(b) CREMA-D – Crowd-Sourced Emotional Audio Dataset
https://github.com/CheyneyComputerScience/CREMA-D
Contains labeled vocal expressions across emotion categories.
Politics: Skews toward white, male voices → risk of misclassification for marginalized groups.
(c) ESC-50 – Environmental Sound Classification Dataset
https://github.com/karolpiczak/ESC-50
Common dataset for environmental sound classification.
Politics: Sound categories reflect Western soundscapes → may not represent global environments.
(a) WHO Standard Acupuncture Point Locations
https://apps.who.int/iris/handle/10665/43829
Official WHO documentation standardizing acupoint locations.
Politics: Global standardization may flatten regional differences within Traditional Chinese Medicine.
(b) NCBI Bookshelf – Meridian System Overview
https://www.ncbi.nlm.nih.gov/books/NBK92773/
An NIH-published English overview explaining the classical meridian system, including pathways, qi flow, and the relationship between meridians and acupoints.
Politics: As a biomedical framing, it simplifies traditional knowledge systems and translates TCM concepts through Western academic language.
Source / Provenance:
Feedback from classmates, blind/low-vision participants, and my own embodied testing of tactile patterns and materials.
Reference Study:
Acoustic Touch Research – https://www.researchgate.net/publication/374993206_An_investigation_into_the_effectiveness_of_using_acoustic_touch_to_assist_people_who_are_blind
Politics / Limitations: Embodied sensory preferences vary widely; disability communities are not monolithic. Consent and privacy are essential when collecting feedback.
(a) SubPac – Tactile Frequency System
Shows how low-frequency audio can be translated into full-body vibration.
Politics: Originally developed for music and gaming, not accessibility → sensory assumptions may not fit all users.
(b) Skin-Integrated Haptics (Science Advances)
https://doi.org/10.1126/sciadv.abd7887
Research on soft, skin-conforming haptic interfaces.
Politics: Most studies are based on able-bodied participants → embedded sensory bias.