MetaModal LLC - The vOICe

MetaModal LLC (Pasadena, California) is active in the field of sensory substitution, in particular enabling blind users to "see with sound" by transforming images from a wearable camera into sounds, using a technology known as The vOICe. One focus area is to identify and establish benefits in the daily lives of blind users. Another focus area is the learning and adaptation effort involved in obtaining these benefits. Development and evaluation of effective training schemes is therefore a key element in both focus areas. MetaModal LLC was founded by Luis Goncalves and Enrico Di Bernardo.

Lecture by Shinsuke Shimojo, November 23, 2012:
 Sensory substitution, and the third kind of qualia

Note: most of the links below are external links that refer to the original website for The vOICe technology.

"There is increasing evidence from neuroscience for the extraordinarily rich interconnectedness and interactions of the sensory areas of the brain, and the difficulty, therefore, of saying that anything is purely visual or purely auditory, or purely anything. This is evident in the very titles of some recent papers – Pascual-Leone and his colleagues at Harvard now write of "The Metamodal Organization of the Brain," and Shinsuke Shimojo and his group at Caltech, who are also exploring intersensory perceptual phenomena, recently published a paper called "What You See Is What You Hear," and stress that sensory modalities can never be considered in isolation. The world of the blind, of the blinded, it seems, can be especially rich in such in-between states – the intersensory, the metamodal – states for which we have no common language."

Oliver Sacks, The New Yorker, July 28, 2003, "The Mind's Eye: What the Blind See"

The vOICe vision technology for the totally blind offers the experience of live camera views through sophisticated image-to-sound renderings. In theory this could lead to synthetic vision with truly visual sensations ("qualia") through crossmodal integration, by exploiting the existing multisensory processing and neural plasticity of the human brain through training and education. The vOICe implements a form of sensory substitution where the goal is to bind visual input to visual qualia with a minimum of training time and effort.

The vOICe also acts as a research vehicle for the cognitive sciences to learn more about large-scale adaptive processes and cross-modal neuromodulation in the human brain. Neuroscience research has already shown that the visual cortex of even adult blind people can become responsive to sound, and sound-induced illusory flashes can be evoked in most sighted people. The vOICe technology may now build on this with live video from a head-mounted camera encoded in sound.

The extent to which cortical plasticity and dynamic rerouting allow for functionally relevant rewiring or unmasking of neural pathways in the human brain is under investigation. Apart from functional relevance, a cross-modal binding for inducing visual sensations through sound (mental imagery and artificial synesthesia) would also be of great psychological importance. The possible role of The vOICe vision technology in cross-modal neuromodulation and synesthetic effects is being explored and developed under the Open Innovation paradigm together with many R&D partners around the world.
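As an illustration of what an image-to-sound rendering can look like, the sketch below implements the mapping commonly described for this style of sensory substitution: the image is swept column by column from left to right, each row is assigned a sine tone whose frequency rises with pixel height, and pixel brightness sets that tone's loudness. This is a minimal sketch for intuition only, not MetaModal's actual implementation; the sample rate, sweep time, and frequency range are assumptions.

```python
import math

SAMPLE_RATE = 8000               # assumed audio sample rate (Hz)
SCAN_SECONDS = 1.0               # assumed time to sweep one image left to right
F_LOW, F_HIGH = 500.0, 5000.0    # assumed frequency range (Hz)

def image_to_sound(image):
    """image: list of rows of brightness values in [0, 1], row 0 = top.
    Returns a list of audio samples in [-1, 1]."""
    rows, cols = len(image), len(image[0])
    samples_per_col = int(SAMPLE_RATE * SCAN_SECONDS / cols)
    # Exponentially spaced frequencies; higher rows (smaller index) get
    # higher pitches, mimicking "up on the image sounds higher".
    freqs = [F_LOW * (F_HIGH / F_LOW) ** ((rows - 1 - r) / (rows - 1))
             for r in range(rows)]
    audio = []
    for c in range(cols):                       # sweep left to right
        for n in range(samples_per_col):
            t = (c * samples_per_col + n) / SAMPLE_RATE
            # Superimpose one sine per row, weighted by pixel brightness.
            s = sum(image[r][c] * math.sin(2 * math.pi * freqs[r] * t)
                    for r in range(rows))
            audio.append(s / rows)              # keep samples within [-1, 1]
    return audio

# Usage: a bright diagonal from bottom-left to top-right yields a
# rising-pitch sweep over the one-second scan.
img = [[0.0] * 4 for _ in range(4)]
for c in range(4):
    img[3 - c][c] = 1.0
audio = image_to_sound(img)
```

The exponential (rather than linear) frequency spacing is chosen here because pitch perception is roughly logarithmic; a real system would also need to write the samples out to an audio device or file.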


YouTube video clips of training during the 2010 SBIR Phase I evaluation study of The vOICe by MetaModal LLC ("Viability of a sensory substitution device giving blind users sight through sound"):

Blind man visually picks up objects through visual-to-auditory sensory substitution (PIP version)
Blind man finds his cane through visual-to-auditory sensory substitution
Blind man performs street crossing using visual-to-auditory sensory substitution
Blind man scores a goal using visual-to-auditory sensory substitution
Blind man analyzes abstract house shape using The vOICe (PIP version)
Blind man plays visual tic-tac-toe using visual-to-auditory sensory substitution

This NSF-funded study was featured in the Pasadena Star News of November 2010 in an article titled "Two Pasadena researchers seek integrated system to help the blind 'see'", with photo gallery. Results from this study were also reported by Luis Goncalves at CSUN 2011 in a panel presentation, "Virtual environment to improve orientation skills in unknown spaces by blind people" (Microsoft Word file). Ken O'Sullivan covered the study in the ACB Braille Forum article titled "The Pathway to Sight" (January 2012).