This project is an Electronic Travel Aid (ETA) that enables the user to navigate around objects in their path and view the profile of their surroundings.

The device delivers spatial information about a small area in front of the user via haptic feedback: range information from time-of-flight (TOF) sensors is mapped to the vibration intensity of motors held against the user’s skin. The device is intended to assist in the mobility of a visually impaired pedestrian. The sensing module, held in the user’s hand, contains six time-of-flight distance sensors arranged horizontally at relative angles to measure the distance from the user’s hand to six points along an arc in front of him or her. The total angular range across the distance sensors is about 25 degrees, which covers a spread of roughly one meter at a distance of one meter from the sensing module. Six vibration motors convey the sensor information to the user through haptic feedback using a one-to-one mapping of sensors to motors, with vibration intensity varied through pulse-width modulation (PWM). If an object is detected within range by one of the sensors, the motor corresponding to that sensor vibrates, allowing the user to identify the object’s location and general profile. Vibration strength increases with proximity, so the user can sense the change in distance and steer away from the obstruction.
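The distance-to-intensity mapping described above can be sketched as a small C helper. The specific range limits (`MIN_MM`, `MAX_MM`) and the 8-bit duty-cycle resolution are assumptions for illustration, not values taken from the project:

```c
#include <stdint.h>

/* Assumed range limits in millimeters: readings beyond MAX_MM are
 * treated as "no obstacle"; the actual thresholds used by the
 * project may differ. */
#define MIN_MM 100
#define MAX_MM 1200

/* Map a TOF distance reading (mm) to an 8-bit PWM duty cycle:
 * closer objects produce stronger vibration, out-of-range
 * objects produce none. */
static uint8_t distance_to_duty(uint16_t mm)
{
    if (mm >= MAX_MM) return 0;    /* nothing in range: motor off */
    if (mm <= MIN_MM) return 255;  /* very close: full intensity  */
    /* linear interpolation between the two limits */
    return (uint8_t)((255u * (uint32_t)(MAX_MM - mm)) / (MAX_MM - MIN_MM));
}
```

An inverse-square or logarithmic mapping could also be used here; a linear ramp is simply the most direct way to make perceived intensity track proximity.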

The six distance sensors are connected to an Arduino over I2C; the Arduino collects their readings and forwards them to the PIC over a UART connection. The PIC then processes each reading and scales it to a pulse-width modulation (PWM) value, which it outputs to the vibration motors. The main hardware components of the project are the TOF distance sensors and the vibration motors, which are connected to the PIC through an opto-isolator circuit. The motors create the haptic feedback interface, and their intensity varies with the proximity of objects detected by the corresponding sensor. The software component consists of a thread that parses the sensor readings from the Arduino and uses a scaling technique to convert each measurement to a PWM output for the motor that corresponds to that distance sensor.
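The parsing step on the PIC side might look like the sketch below. The wire format is an assumption: here each UART line is taken to be six comma-separated millimeter readings, which is one plausible framing, not necessarily the one the project uses:

```c
#include <stdint.h>
#include <stdlib.h>

#define NUM_SENSORS 6

/* Parse one UART line of the assumed form "d0,d1,d2,d3,d4,d5\n"
 * (six millimeter readings) into out[]; returns 1 on success,
 * 0 if the line is malformed. */
static int parse_readings(const char *line, uint16_t out[NUM_SENSORS])
{
    for (int i = 0; i < NUM_SENSORS; i++) {
        char *end;
        long v = strtol(line, &end, 10);
        if (end == line || v < 0 || v > 65535)
            return 0;                 /* missing or out-of-range field */
        out[i] = (uint16_t)v;
        line = (*end == ',') ? end + 1 : end;
    }
    return 1;
}
```

In the actual firmware this would run inside the parsing thread, with each `out[i]` then passed through the scaling step and written to the PWM channel driving motor `i`.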

By using touch as a substitute for visual input, the user can learn to use the device to seamlessly gain information about the space in front of them, facilitating navigation. The device could also be used by sighted individuals for guidance in dark spaces, as well as by hearing-impaired individuals who wish to receive haptic feedback in loud environments.


Related Content