
Over 1.1 billion people worldwide are living with some form of visual impairment, including 43 million who are blind. These individuals face significant challenges in independent mobility and social participation. This project explores how to create a wearable assistive system that not only helps with navigation but also empowers blind users to maintain independence, dignity, and confidence in complex, real-world environments.
Role / Skills
Product/UX Design
User Research
Prototyping
User Testing
Design Systems
Team
1 Designer
1 Software Engineer
2 Hardware Engineers
Timeline
1 month for research, 1 month for the MVP, and 2 months of continued iteration
Tools Used
Figma
Dovetail
Maze
Fusion 360
Blind and visually impaired individuals struggle with safe, independent navigation because existing tools are noisy, privacy-invasive, and non-adaptive, leading to reduced confidence and social exclusion.
Limited independence when navigating unfamiliar or complex environments.
Over-reliance on auditory navigation tools, which can overload users in noisy environments.
Privacy concerns with camera-based systems that unintentionally capture faces or sensitive surroundings.
Existing devices often lack adaptability to users’ personal preferences or grip styles with canes.
As a result, many users report reduced confidence, safety concerns, and social exclusion.
The goal was to design an assistive device grounded in user research.
Auditory Simulation Study
I developed an immersive audio-based simulation to observe how blind participants interacted with environmental cues. The participants could choose when to request navigation prompts, revealing preferences for autonomy and selective assistance.
Users valued existing environmental cues, such as audible traffic signals, as guides for direction.
The visually impaired community relies heavily on existing navigation tools, such as various GPS systems, for daily travel.
The study translated into three design principles:
Provide reliable navigation markers rather than step-by-step directions.
Address privacy by restricting cameras to ground-level targets.
Give users control over when and how feedback is delivered.
Tactile feedback emerged as the strongest direction because it:
Reduces auditory overload in noisy environments.
Provides immediate, intuitive cues (e.g., directional vibrations).
Offers discreet interaction, preserving user privacy and dignity in public.
We propose an assistive device that uses a camera to detect navigation markers and a wearable to deliver real-time tactile feedback. Since the white cane is already widely used, the device integrates with it, combining a camera for image capture and a vibration module for feedback in a unified design. By emphasizing reliable cues over step-by-step instructions, it reduces auditory overload and enables blind users to navigate with greater independence and confidence.
Empowering and Intuitive Navigation
Reliable ground-level detection paired with tactile feedback enables confident, independent travel without overwhelming users.

We overlaid a grid on the camera feed and mapped each zone to a vibration motor. When a navigation target appeared, the corresponding motor vibrated, creating a direct spatial link between visual cues and tactile feedback.

[Figure: camera feed with grid overlay; the motors corresponding to the highlighted (orange) zone vibrate. Labeled motor dimensions: 7 mm and 2 mm.]

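To make the mapping concrete, here is a minimal sketch in Python. The 3×3 array size and row-major motor indexing are assumptions for illustration; the write-up does not fix the grid dimensions.

```python
# Minimal sketch: map a detected target's pixel centroid to a motor index.
# Assumed (not specified above): a 3x3 motor array, indexed row-major.

GRID_ROWS, GRID_COLS = 3, 3  # assumed array size

def target_to_motor(cx: float, cy: float, frame_w: int, frame_h: int) -> int:
    """Return the index of the motor whose grid zone contains (cx, cy)."""
    col = min(int(cx / frame_w * GRID_COLS), GRID_COLS - 1)
    row = min(int(cy / frame_h * GRID_ROWS), GRID_ROWS - 1)
    return row * GRID_COLS + col

# A target centered in a 640x480 frame maps to the middle motor (index 4).
assert target_to_motor(320, 240, 640, 480) == 4
```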
Why Tactile Feedback?
We chose tactile feedback because it introduces an underutilized sensory channel into human–computer interaction. While vision and hearing dominate assistive technologies, touch offers a discreet, intuitive, and immediate way to convey spatial information. For blind users, it reduces auditory strain in noisy environments, protects privacy compared to voice prompts, and provides real-time cues for quick decision-making.
Future Improvement Directions
From the first prototype, we learned that clarity and comfort are critical—vibrations must be distinct, well-positioned, and adaptable to individual users. These insights point to the need for a more flexible and modular system, where the glove and cane can be adjusted for different grip styles, hand sizes, and personal preferences. This evolution led us to design a modular glove-cane system that empowers users with greater control over how and when feedback is delivered.
To provide discreet and intuitive guidance, I focused on developing a vibration module that could deliver clear cues without adding to the auditory load that blind users already manage in daily navigation. Early attempts to embed a vibration grid into the cane handle proved confusing, as signals shifted with grip variations and alignment. Moving the module to the back of the hand through a glove revealed a more natural solution: vibrations pressed directly against the skin offered immediate, private, and easily interpretable feedback. Iterations with fabric fit, array boundaries, and structural supports led to a final system that pairs a cane-mounted camera for environmental detection with a glove-based motor array for tactile feedback. This discreet combination allows users to receive navigation cues in real time without disrupting their reliance on auditory landmarks, balancing clarity with privacy to create a more intuitive and socially acceptable form of guidance.
Discreet and Intuitive Guidance
Tactile feedback replaces constant audio prompts, reducing cognitive load and preserving awareness in noisy environments.
Cane Handle Integration



Problems Identified: Vibration signals shifted with grip variation and hand alignment, making cues ambiguous.
Insights: Tactile cues need a stable, consistent contact point against the skin, independent of how the cane is held.
Outcome: Cane-handle integration was abandoned. Shifted toward a wearable (glove) solution for clearer, more flexible tactile delivery.


Improve the fit between the motors and the back of the hand

Change motor–hand contact from surface to point

Integrate the motors into the fabric through weaving

Conclusion: Ensuring the motors fit snugly against the back of the hand is key to improving the clarity of vibration perception; the fabric between the motors and the hand should be thinner.
Next Steps:
Exploration 1: Clarity of Motor Array Feedback
Findings:
Problems:
Insights: Physical contact quality (fit + material) is as important as motor strength.
Exploration 2: How can the glove fit closely to the back of the hand?


Problems:
A loose fit on smaller hands and unclear motor boundaries made it difficult to distinguish which motor was vibrating.
Insights:
Outcome: Established that glove-based wearables deliver more intuitive feedback than cane-handle integration. Key lesson: ergonomic adaptability and clear motor boundaries are critical to effective tactile feedback.
Glove + White Cane with Camera Module
A modular glove-cane system lets users control how and when feedback is delivered, supporting diverse grip styles and preferences.
Cane Handle Integration
Add a rigid frame and use a custom PCB.

Final Version

Design:
Problems with Final Prototype:
Insights:
Outcome: The final solution combines the glove with a cane-mounted camera, providing discreet tactile guidance while maintaining independence.
Final Version
Lower-Level Controller
Converts the received motor index into drive signals for the vibration motor matrix.
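A minimal firmware sketch of this step, assuming a MicroPython-capable microcontroller, a 3×3 row/column motor matrix, and hypothetical GPIO assignments (the actual prototype uses a custom drive board):

```python
# Sketch of the lower-level controller: select one motor in a row/column
# matrix from a received motor index. Pin numbers are hypothetical and the
# 3x3 layout is an assumption carried over from the mapping sketch above.
from machine import Pin
import time

COLS = 3
ROW_PINS = [Pin(n, Pin.OUT) for n in (2, 3, 4)]  # assumed row-select GPIOs
COL_PINS = [Pin(n, Pin.OUT) for n in (5, 6, 7)]  # assumed column-select GPIOs

def pulse_motor(index: int, duration_ms: int = 200) -> None:
    """Energize the row/column pair that selects one motor, then release."""
    row, col = divmod(index, COLS)
    ROW_PINS[row].on()
    COL_PINS[col].on()
    time.sleep_ms(duration_ms)  # MicroPython's millisecond sleep
    ROW_PINS[row].off()
    COL_PINS[col].off()
```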


Polycarbonate Shell
Motor-Matrix Drive Board
Soft Rubber Interlayer
Nylon Lining (waterproof)
Elastic Fabric Strap

[Labeled dimensions from the drawing: 91.0 mm, 56.0 mm, 11.3 mm, 12.8 mm]



The final prototype integrates a glove-based vibration module with a camera-mounted white cane, forming a modular system that delivers discreet and intuitive navigation guidance. The glove translates visual data captured by the camera into tactile feedback through a structured motor array, allowing users to sense direction and obstacles in real time. This configuration minimizes auditory overload, preserves user privacy, and enhances autonomy by letting individuals control when and how feedback is received. The result is a lightweight, ergonomic, and socially acceptable assistive device that merges familiar tools with emerging technology—enabling navigation that feels both natural and empowering.
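As a hedged end-to-end sketch of this data flow: capture a frame from the cane camera, locate a marker, map it to a motor index with the earlier `target_to_motor` helper, and send that index to the glove. `detect_marker`, the serial port name, and the one-byte protocol are all placeholders; the project's actual detection model and transport are not documented here.

```python
# End-to-end sketch: cane camera -> marker detection -> motor index -> glove.
import cv2           # OpenCV for camera capture
import serial        # pyserial for the link to the glove's controller
from typing import Optional, Tuple

def detect_marker(frame) -> Optional[Tuple[float, float]]:
    """Placeholder: return the pixel centroid of a ground-level marker, or None."""
    return None  # swap in the project's actual detection model

def run(port: str = "/dev/ttyUSB0") -> None:  # hypothetical port name
    cam = cv2.VideoCapture(0)
    glove = serial.Serial(port, 115200)
    while True:
        ok, frame = cam.read()
        if not ok:
            break
        hit = detect_marker(frame)
        if hit is not None:
            h, w = frame.shape[:2]
            index = target_to_motor(hit[0], hit[1], w, h)  # earlier sketch
            glove.write(bytes([index]))  # one-byte motor index to the firmware
```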
Understanding the sensory world of visually impaired users reshaped my design priorities. I learned that accessibility is not only about technology, but about respecting users’ intuitive navigation habits and emotional comfort.
While creating a device that provides clear guidance yet remains subtle and socially acceptable, I realized that true accessibility also involves preserving privacy and dignity in public spaces.
The final glove–cane system demonstrated that giving users control over their sensory feedback creates a more empowering and personalized experience. The design adapts to their pace rather than imposing one.
Future directions include pilot programs in Ithaca, refining accessibility features, and measuring the system's long-term impact on users' independent mobility and confidence.