
FRAMOS Empowers Visually Impaired with a New 3D Wearable Technology

Wearables with Real-Time 3D Technology Create New Way of Sensing by Translating Visual Information


FRAMOS in cooperation with the CDTM institute of the Technical University of Munich (TUM) has developed an innovative wearable using real-time 3D technology to support visually impaired people in daily life.

The glasses are equipped with the latest Intel RealSense stereo cameras; intelligent algorithms translate the visual impression into haptic and audio information.


While audio information relies on object and character recognition, the haptic feedback is provided by a wrist band equipped with vibration motors.

This new way of sensing enables visually impaired people to fully understand their environment and to have advanced guidance for safe navigation.

Sight is probably the most important human sense, yet visual information is hidden from 108 million visually impaired people worldwide.

Shop names, street names, route numbers of public transport, and traffic signs are invisible to them; navigating without this information is a true challenge.

The FRAMOS-developed glasses now give the visually impaired a new way to explore their surroundings while benefiting from the latest technology.

Dr. Christopher Scheubel, FRAMOS Business Development, said, “We are proud to have found a way to bring state-of-the-art technology into an application that has a huge impact on the daily life of the visually impaired. This project captures the spirit of innovation by genuinely supporting humans and improving their lives. The exceptional beauty of this technology is its ability to provide visual information normally given by the human eye. Our technology creates a new way of sensing.”

The 3D enabled wearable creates a new way of sensing by translating visual information into haptic feedback on a wristband in real-time.

The prototype includes an Intel RealSense 3D camera and speakers for audio feedback. The setup is controlled by a processing hub with a GPS sensor and an LTE module for mobile data connection. Connected via Bluetooth, a micro-processing unit translates visual data into haptic feedback through a 2D array of vibration motors. Based on the exact location and movement of the vibrating feedback on the arm, the wearer is informed about the position and distance of objects in the surroundings. A voice-controlled interface makes interaction easy, and rechargeable batteries enable a full day of use.
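FRAMOS has not published the translation pipeline itself, but the idea of mapping a depth frame onto a 2D motor array can be illustrated. The sketch below (a hypothetical `depth_to_vibration` helper, with an assumed 4×8 motor grid and 4 m sensing range) downsamples a depth image so that the nearest obstacle in each region drives that region's vibration intensity.

```python
import numpy as np

def depth_to_vibration(depth_m, grid_shape=(4, 8), max_range_m=4.0):
    """Downsample a depth map (meters) to a grid of vibration
    intensities in [0, 1]. Closer obstacles vibrate more strongly;
    pixels beyond max_range_m (or invalid zero readings) give none."""
    h, w = depth_m.shape
    gh, gw = grid_shape
    # Trim the image so it divides evenly into grid cells.
    depth = depth_m[: h - h % gh, : w - w % gw]
    cells = depth.reshape(gh, depth.shape[0] // gh,
                          gw, depth.shape[1] // gw)
    # Treat zeros (no reading) as out of range, then take the nearest
    # valid point per cell -- the closest obstacle dominates.
    cells = np.where(cells > 0, cells, max_range_m)
    nearest = cells.min(axis=(1, 3))
    return np.clip(1.0 - nearest / max_range_m, 0.0, 1.0)

# Synthetic 480x640 depth frame: everything 4 m away except an
# obstacle 0.5 m ahead in the upper-left region.
frame = np.full((480, 640), 4.0)
frame[:120, :80] = 0.5
grid = depth_to_vibration(frame)
```

In this toy frame, the upper-left motor would buzz strongly while the rest of the grid stays silent; a real implementation would also smooth the signal over time to avoid jitter.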

The glasses are a smart assistant that helps the visually impaired master daily life, providing a new level of safety and knowledge through text and object recognition enabled by intelligent algorithms.

The prototype leverages state-of-the-art vision technology and entirely reflects FRAMOS’ mission of making machines see and think.



Niloy Banerjee

A movie buff and print-journalism professional serving editorial verticals in the technical and B2B segments, and a writer on business happenings. He spends his spare time playing physical and digital games; his love of philosophy is perennial, like trying to gather pebbles from the ocean of literature. Lastly, he is a connoisseur of making and eating palatable cuisines.
