TouchTheSound is an application that helps deaf people understand and feel sound through a smartphone. It translates the audio captured by the microphone into visual and vibration cues. Educators can also use the application to teach sound features.
The sound captured by the microphone is represented as an audiogram (blue columns), with tone shown horizontally and intensity vertically. Pictures of typical sounds are overlaid on the audiogram, and the voice region is marked with a green curve.
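The audiogram columns above can be sketched as a short spectral pipeline: take a frame of microphone samples, compute its magnitude spectrum, and group the bins into a handful of tone bands whose mean magnitude becomes each column's height. This is a minimal illustration, not the app's actual pipeline; the function name, band count, and linear band layout are assumptions.

```python
import numpy as np

def audiogram_columns(samples, n_bands=16):
    """Turn one frame of microphone samples into audiogram columns:
    tone (frequency band) on the horizontal axis, intensity on the
    vertical. Hypothetical sketch with evenly split linear bands."""
    # Window the frame to reduce spectral leakage, then take the FFT.
    mags = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    # Group FFT bins into n_bands columns; each column's height is
    # the mean magnitude of its bins.
    bands = np.array_split(mags, n_bands)
    return [float(b.mean()) for b in bands]
```

Feeding the sketch a pure tone makes the column for that tone's band the tallest, which is the behaviour the blue columns visualize.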
Touch the audiogram to select the part you want translated into vibration; your selection is shown as green columns. When the captured sound and the selected region intersect, the application generates a proportional vibration.
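The intersection rule can be sketched as follows: intensities that fall inside the user-selected bands drive the vibration amplitude, and bands outside the selection are ignored. All names and the max-based proportional mapping are illustrative assumptions, not the app's published logic.

```python
import numpy as np

def vibration_level(spectrum, selection, max_level=255):
    """Map the overlap between the captured sound and the user's
    selection to a vibration amplitude (0..max_level).

    spectrum  -- per-band intensities from the microphone, in 0..1
    selection -- per-band booleans: the bands the user touched
    """
    spectrum = np.asarray(spectrum, dtype=float)
    selection = np.asarray(selection, dtype=bool)
    overlap = spectrum[selection]       # intensities inside the selection
    if overlap.size == 0:
        return 0                        # no intersection, no vibration
    # Proportional mapping: the strongest selected band sets the level.
    return int(round(float(overlap.max()) * max_level))
```

With this mapping, sound energy outside the green columns never triggers the motor, matching the selection behaviour described above.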
When automatic mode is activated, the application adjusts the vibration levels according to the average noise. This mode can help with lip-reading in noisy environments.
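One plausible reading of automatic mode is a running estimate of the ambient noise, with vibration proportional to how far the current level rises above that baseline. The exponential moving average, the parameter names, and the smoothing factor below are all assumptions for illustration.

```python
def auto_mode_level(level, noise_avg, alpha=0.05, max_level=255):
    """Sketch of an automatic mode: track average noise with an
    exponential moving average and vibrate in proportion to the
    excess of the current level (0..1) over that baseline.

    Returns (vibration_level, updated_noise_avg)."""
    # Slowly adapt the noise estimate toward the current level.
    noise_avg = (1 - alpha) * noise_avg + alpha * level
    # Only the part of the signal above the ambient noise vibrates.
    excess = max(0.0, level - noise_avg)
    return min(max_level, int(round(excess * max_level))), noise_avg
```

Because the baseline adapts, a steady noisy background fades out of the vibration while sudden sounds, such as a voice, still come through.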
If you are not interested in a specific tone band and want all captured intensity translated into vibration, switch to the simplified sound-meter representation using the mode button.
On some smartphones there can be feedback between the microphone and the vibration motor (the vibration produces a sound that the microphone picks up). To avoid this and improve the user experience, plug in an external microphone (most smartphone headphones include one).
TouchTheSound is developed by the Mobile Vision Research Laboratory of the University of Alicante, in collaboration with Neosistec.
This application is a research prototype. Please send comments and suggestions to the project manager, Juan M. Sáez (firstname.lastname@example.org).
Special thanks to Sergio Bleda, Fernando Navarro, and Víctor Navarro from DFISTS at the University of Alicante.