Main Content

This device classifies human emotions, snoring, sign language, and visual objects using Edge Impulse (EI), and provides haptic feedback through the Neosensory Buzz.

When I first noticed the Expand Your Senses challenge on Hackster, I told myself: this is a perfect marriage, sensory substitution with artificial intelligence!

Well, what is sensory substitution? In simple words, it's a technique to compensate for the loss of one sense by feeding its information through another channel. For example, a person who goes blind generally does not lose the ability to see; they lose the ability to transmit sensory signals from the retina to the brain. Since the vision-processing pathways are still intact, a person who has lost the ability to retrieve data from the retina can still perceive subjective images using data gathered from other sensory modalities such as touch or hearing. [Source: Wikipedia]

What is Neosensory Buzz?
When we become fluent in a language, learn to ride a bike, or refine our bat swing, we form associations with patterns of information from our physical world. Buzz is a wearable device that translates information into vibrational patterns on the skin. With practice, these patterns become automatic associations and a new sense is born.

What Problem Am I Trying to Solve?
Well, given the promise of Buzz and Edge Impulse, I don't want to build just another feedback or alert system; that would badly underutilize such advanced products. Microcontrollers with sensors and mobile apps are better suited for those applications. Buzz's haptic feedback should serve a bigger cause, such as substituting for lost senses or creating new senses that humans don't have organically.

In the past, it was believed that the human brain does not grow or change after a certain age, acting like a static organ, but recent research has shown that its neural networks change over time, creating new pathways and pruning old ones. This is called brain plasticity. Researchers studied the brains of drivers before and after they took the taxi-driving test in London, and they observed new neural pathways that had developed after the test.

Similarly, haptic feedback can be used to deliver visual or audio information to the brain through the regular visual or auditory neural pathways. Over time and with practice, the brain will accept the haptic feedback and process it as if it were received from the retina or the ears, substituting for a lost sense.

It has also been shown that new senses can be developed. For example, we don't usually perceive much besides smell during deep sleep, but we can train our brain to respond to haptic feedback even while sleeping, creating a new sense.

There are many areas where sensory substitution can be combined with ML and AI. After a few weeks of study, I chose a few use cases that I felt were very common and could provide better living.

1. Emotion interpretation for ASD
People with autism spectrum disorder (ASD) often struggle to interpret the emotional reactions of other people; the eyes and ears are not able to send the correct electrochemical signals to the brain. This is a good use case for sensory substitution. I have developed a machine learning model using Edge Impulse Studio which classifies speech into predefined labels such as greeting, happy, angry, and sad. Based on the classified label, the device sends a predefined frame set to Buzz. With training and practice, people can learn the meaning of each vibration pattern. New wiring will form in the brain, and people will be able to "see" or "hear" the vibration.
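As a minimal sketch of the label-to-vibration step, assuming Buzz's four motors each take an intensity from 0 to 255 (the frame values, label names, and confidence threshold below are illustrative assumptions, not the project's actual settings):

```python
# Hypothetical sketch: Buzz has four vibration motors, each driven by an
# intensity from 0 to 255. These label-to-frame mappings are illustrative.
EMOTION_FRAMES = {
    "greeting": [255, 0, 0, 0],
    "happy":    [0, 255, 255, 0],
    "angry":    [255, 255, 255, 255],
    "sad":      [0, 0, 0, 128],
}

def frame_for_label(scores, threshold=0.6):
    """Pick the top-scoring label; return its frame only if confident enough."""
    label, score = max(scores.items(), key=lambda kv: kv[1])
    if score < threshold or label not in EMOTION_FRAMES:
        return None  # below confidence: stay silent rather than mis-signal
    return EMOTION_FRAMES[label]

# Example classifier output (label probabilities from the speech model):
print(frame_for_label({"happy": 0.82, "sad": 0.10, "angry": 0.08}))  # [0, 255, 255, 0]
```

Gating on a confidence threshold matters here: a wrong vibration teaches the wearer a wrong association, so silence is the safer default.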

2. Sign Language Interpretation
This is another very useful use case for people with autism spectrum disorder (ASD), who often struggle to recognize sign language. This use case is very useful for visually impaired people as well. I have developed another ML model using EI Studio to classify American Sign Language (ASL) signs from images taken with a Raspberry Pi camera. I have trained "Thank you", "Please", "Sleep", and "Drink" as a proof of concept.

3. Reduce Snore And Sleep Better
Snoring once in a while isn't usually a serious problem; it's mostly a nuisance for your bed partner. But if you're a long-term snorer, you not only disrupt the sleep patterns of those close to you, you hurt your own sleep quality. Snoring can itself be a symptom of a health problem like obstructive sleep apnea. When we fall asleep, our neck and throat muscles relax, which narrows our airways and causes the vibrations in the throat. I have developed an ML model to classify snoring against other background noise. When snoring is detected during the night, the device sends haptic feedback to the Buzz. With practice, your brain will rewire its pathways so that you can feel the snoring even while asleep and change your sleeping posture to reduce it.
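One detail worth sketching is debouncing: a single noisy audio window shouldn't wake the wearer, so feedback could be sent only after several confident snore windows in a row. The window sizes and thresholds below are assumptions, not the project's actual values:

```python
from collections import deque

class SnoreDebouncer:
    """Fire only when at least `required` of the last `window` audio
    windows were classified as snoring. All parameters are assumed
    defaults for illustration, not the project's tuned values."""

    def __init__(self, window=10, required=6):
        self.history = deque(maxlen=window)  # rolling record of recent windows
        self.required = required

    def update(self, snore_prob, threshold=0.7):
        """Feed one classifier probability; return True when the device
        should send a vibration frame to the Buzz."""
        self.history.append(snore_prob >= threshold)
        return sum(self.history) >= self.required

detector = SnoreDebouncer(window=5, required=3)
for p in [0.9, 0.2, 0.8, 0.85, 0.1]:
    fired = detector.update(p)
print(fired)  # True: three confident snore windows within the last five
```

This way an isolated cough or rustle is ignored, while sustained snoring still triggers the haptic nudge.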

4. General Purpose, Touch The Sound
Finally, this use case can serve many purposes, especially for visually impaired people. The ML model classifies different objects and animals such as dogs, televisions, and cars. The device takes a picture every second and sends it to the classifier. Once an object is detected, the device sends haptic feedback. This is a perfect example of sensory substitution.
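The once-per-second capture-classify-buzz loop described above might be sketched like this, with stand-in functions for the camera, the image classifier, and the Buzz write (all names and frame values are hypothetical, not the project's actual code):

```python
import time

# Hypothetical object -> motor-frame mapping (four intensities, 0-255).
OBJECT_FRAMES = {
    "dog":        [255, 0, 0, 0],
    "television": [0, 255, 0, 0],
    "car":        [0, 0, 255, 0],
}

def run_loop(capture, classify, send_frame, interval=1.0, iterations=3):
    """Capture a picture every `interval` seconds, classify it, and send
    the matching vibration frame to the Buzz. Returns detected labels."""
    detections = []
    for _ in range(iterations):
        image = capture()          # stand-in for the Raspberry Pi camera
        label = classify(image)    # stand-in for the trained model
        if label in OBJECT_FRAMES:
            send_frame(OBJECT_FRAMES[label])  # stand-in for the Buzz write
            detections.append(label)
        time.sleep(interval)
    return detections

# Demo with stub functions (a real device would loop indefinitely):
detected = run_loop(
    capture=lambda: b"fake-image-bytes",
    classify=lambda img: "dog",
    send_frame=lambda frame: None,
    interval=0, iterations=2,
)
print(detected)  # ['dog', 'dog']
```

Labels outside the mapping are silently skipped, so the wearer only feels patterns they have actually trained on.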

Link to article