Wearable Computers for the Blind
Now that we have come to the end of the series of posts inspired by my friend and colleague Daniel Kish, it is fitting that we discuss how wearable technologies might be used by blind individuals.
What would a wearable prosthetic for the blind look like, and what options would it provide for the consumer?
Daniel Kish and I, with feedback over the years from Steve Mann, wrote a white paper about such a system in the 1990s. In this case, “white paper” refers to a speculative overview–an outline for potential development. We called the white paper “Project Hawkeye” and referred to the wearable systems as CyberEyes. Part of what follows is taken from that earlier envisioning. Terms like “audification” and “seeing eye people” are discussed in the Hawkeye report.
1. New technologies will provide more options for blind navigation. The goal of these new tools will be to enable the blind traveler to move more efficiently and with greater safety–ease of travel will be improved through technologies. A brief list of systems within this category includes:
* GPS, embedded in wearables
* GPS tags: for locating “missing” items, embedded in personal objects
* Landmark assist technologies; audification and embedded messages
* Memory prosthetics for routes; mapping access
* Seeing Eye People–operator assist for difficult situations
* OCR for reading signage
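As a thought experiment, the first two items above can be sketched in a few lines: a wearable that knows the traveler's GPS position could look up the nearest known landmark and compose a spoken message. The landmark names, coordinates, and message wording below are purely illustrative assumptions, not part of any real system.

```python
import math

# Hypothetical landmark database: name -> (latitude, longitude).
# All names and coordinates here are made up for illustration.
LANDMARKS = {
    "library entrance": (34.0522, -118.2437),
    "bus stop 14": (34.0530, -118.2440),
    "crosswalk signal": (34.0515, -118.2430),
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_landmark(lat, lon):
    """Return (name, distance in meters) of the closest known landmark."""
    name, pos = min(LANDMARKS.items(),
                    key=lambda kv: haversine_m(lat, lon, *kv[1]))
    return name, haversine_m(lat, lon, *pos)

def announce(lat, lon):
    """Compose the message a wearable might speak to the traveler."""
    name, dist = nearest_landmark(lat, lon)
    return f"{name}, about {round(dist)} meters away"
```

A real system would feed `announce` into a text-to-speech engine and refresh it as the traveler moves; the sketch only shows the lookup-and-compose step.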
2. We know from the work of Daniel Kish and his associates at World Access for the Blind that blind individuals can develop highly sophisticated auditory perceptual skills that enable self-sufficient navigation. Technologies can be developed to support and enhance this natural auditory capability:
* Audified environments
* Sonar systems for enhancing seeing with sound
* Bionic hearing–improved for echolocation
* Artificial vision systems (brain implants)
* Smart canes–with tactile and olfactory assistive technologies
3. Communication systems will continue to shrink and become available in wearable units for the blind. This is nothing more than the porting of handheld technologies to the sensory zone (the ring of sensation on the head that includes the eyes, ears, vestibular organs, and the nose). Additional solutions will be provided for the blind, including:
* Face recognition and memory for faces
* Translation of facial expression and body language
* Voice capability for email, text, social networking, conferencing
* OCR units embedded in wearables will enable reading of print
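Face recognition and "memory for faces" usually reduce to nearest-neighbor search over face embeddings produced by a camera and an encoding model. The sketch below assumes such embeddings already exist and shows only the memory-and-recall step; the names, vectors, and the 0.9 similarity threshold are invented for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class FaceMemory:
    """Stores (name, embedding) pairs; recalls the best match above a threshold.

    In a real wearable the embeddings would come from a face-encoding
    model; here they are plain lists of numbers for illustration.
    """
    def __init__(self, threshold=0.9):
        self.threshold = threshold
        self.people = []  # list of (name, embedding)

    def remember(self, name, embedding):
        self.people.append((name, embedding))

    def recall(self, embedding):
        """Return the best-matching name, or None if no one is close enough."""
        if not self.people:
            return None
        name, score = max(((n, cosine(embedding, e)) for n, e in self.people),
                          key=lambda t: t[1])
        return name if score >= self.threshold else None
```

The recalled name would then be whispered to the wearer, turning the camera into a prosthetic social memory.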
4. The environment will get more accessible:
* technologies will enable the reading of touch screens
* technologies will evolve to allow voice control of touch screens
* Objects will talk on demand (as in a kindergarten classroom)
* Environments will be prescribed (smart rooms), especially for children.
* OCR will read Braille, signage, textbooks, wall posters, and calendars.
* Landmarks will contain embedded messages
* Magic mirror technologies will teach blind kids (“robotic” mirrors)
* Knowledge access will be embedded in objects (the painting that tells its history, etc.)
* Robot assistance will be available as needed–in the form of toys and magic friends for children.
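The "objects that talk on demand" idea can be as simple as a registry mapping a short-range tag ID (RFID/NFC-style) to the message a wearable speaks when the child touches or points at the object. Every tag ID and message below is made up to illustrate the pattern.

```python
# Hypothetical classroom registry: tag ID -> spoken description.
OBJECT_MESSAGES = {
    "tag:0x0041": "Cubby number four. Your coat hook is here.",
    "tag:0x0042": "Classroom door. Push to open; the hallway is to your left.",
    "tag:0x0043": "Art table. Today's materials are clay and yarn.",
}

def on_tag_read(tag_id):
    """Return the message to speak for a scanned tag, or a gentle fallback."""
    return OBJECT_MESSAGES.get(tag_id, "Unlabeled object.")
```

The same lookup pattern covers embedded landmark messages and "the painting that tells its history": only the registry contents change.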
All of the above suggestions can only come to fruition if:
* Professionals understand the technologies and can prescribe them.
* Training is ongoing as developmental abilities evolve, or as aging or disease cause detriment–adaptations to technologies and training have to occur.
* Upgrades are ongoing as technology evolves.
* Repair is available and affordable.
* Tech support is available and affordable.
* Training manuals and curricula are available and affordable.
* Technologies are tailored and prescribed for individuals.
* Technologies are under the control of the consumer.
Prosthetics for children in special education should be developed, in my opinion, around a concept called Humanistic Intelligence. The term was coined by Professor Steve Mann at the University of Toronto. See his presentation on Vimeo.
Within Humanistic Intelligence (HI) theory, the computer is understood to be a second brain. Its sensory modalities are considered additional senses. A synthetic synesthesia results when “computer senses” merge with the wearer’s native senses. The computer uses the human brain as one of its peripherals, just as the human brain uses the computer as a peripheral. This symbiotic relationship is the heart of Humanistic Intelligence.
Because of the make-up of the human brain, the human being has to be in the decision loop. This is because perception and all higher-level processing work through a “motor-first” neural mechanism.
For example, Braille can only be perceived after the motor act of running fingertips over raised dots–motor precedes sensory/perceptual pattern recognition. Echolocation requires a tongue click, like a sonar blip, to be generated by the human being before returning sound reflections can be perceived–again, motor activity precedes sensation and perception.
Natural human processing requires cognition or movement to occur before perception can happen. The human being has to be in the loop, otherwise the system will be weak, faulty, or non-functional. Technologies cannot be “passively layered,” they must be integrated with the brain. And the way to do this is to keep the user in charge as much as possible.
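The motor-first loop in the echolocation example can be made concrete with a toy simulation: no percept exists until the click is emitted, and the percept is derived from the echo that the click produces. The physics (round-trip time at the speed of sound) is standard; the function names and the simulated environment are my own illustration.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def click_and_listen(true_distance_m):
    """Simulate the sensory consequence of the motor act: the round-trip
    echo delay, in seconds, for an obstacle at true_distance_m."""
    return 2 * true_distance_m / SPEED_OF_SOUND

def perceive(echo_delay_s):
    """Turn the echo delay back into a perceived distance in meters."""
    return echo_delay_s * SPEED_OF_SOUND / 2

def active_sense(true_distance_m):
    """Motor act first, perception second: the HI decision loop in miniature."""
    delay = click_and_listen(true_distance_m)  # 1. motor: emit the click
    return perceive(delay)                     # 2. sensory: read the echo
```

Deleting the `click_and_listen` step leaves `perceive` with nothing to work on, which is the point: the system only senses what the wearer's action makes sensible.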
How to bring about blind prosthetics
For over thirty years, Daniel Kish, Steve Mann, and many others have tried to bring some form of perceptual prosthetic to the blind consumer. This never came about and still awaits a breakthrough.
Perhaps the technology had not evolved sufficiently, or perhaps the rehabilitation field was hesitant to embrace the technologies until they were more promising, or perhaps blind consumer organizations were reluctant to follow the lead of engineers unfamiliar with the larger picture. The truth is probably a combination of factors.
Here are some suggestions for bringing these wearable technologies to children in special education:
1. Steve Mann and Neil Harbisson are pioneering cyborgs. They wear the technologies that address their personal needs. Perhaps the blindness field needs one or more individuals who are passionate about being a cyborg. Daniel Kish once said to me that trickle-down does not work: revolutions like Braille and echolocation are grassroots movements.
2. Perhaps an X-Prize?
3. Perhaps regular conferences, modeled after The Eye and the Chip, could be held to showcase the development of prosthetics for the blind. Or the conference could be broader and showcase all prosthetics for children in special education.
4. Perhaps the “new” digital eyeglasses (Project Glass and EyeTap are examples) could serve as the substrate/frame for prosthetic applications (similar to phone apps). This route enables open-source solutions.
5. Perhaps a university program, a non-profit agency, or a commercial business could set up a lab or a project. Then all four of the above suggestions could be addressed under one umbrella.