American Rehabilitation

Technology as a support system for orientation and mobility – Orientation and Mobility for Blind People

John Brabyn

The development and application of technology for orientation and mobility has a long history covering the postwar period. Although some early endeavors envisaged systems that might replace the cane or dog guide, more recent efforts have focused on devices and systems designed to supplement and provide a support system for these basic mobility tools. The present paper is an overview of past, present, and future technologies as a support system for orientation and mobility.

The problem of blind orientation and mobility is one that has fascinated engineers and technicians for many years. There has been a checkered history of many unique and innovative solutions being proposed and developed, with varying degrees of success. Although no technology has to date succeeded in being universally adopted by blind consumers, recent surveys (Blasch, 1994) have found that more use is being made of existing mobility aids than previously thought. The recent burgeoning of technology addressing the orientation or navigation aspect of mobility, and the accelerating pace of technological development in society at large, will provide steadily improving opportunities in this field, almost certainly leading to more pervasive adoption by blind consumers of technological adjuncts to their traditional mobility aids and techniques.

Mobility Focus

The era following World War II saw the widespread employment of the long cane as a primary mobility aid for blind people. The war stimulated radar, sonar, and other electronic technologies which, with the later advent of the transistor, could be reduced to a size small enough for a pedestrian to carry. By the sixties and seventies, a plethora of devices intended to assist blind pedestrians had been developed at least to the prototype stage (Brabyn, 1985). These developments used two main sensor systems: sonar and video cameras. During this era, most of the technology development focus was on the mobility aspect of the “orientation and mobility” problem. The various sensor and display systems were intended to provide the blind pedestrian with information on the presence, range, direction, and, sometimes, the nature of the objects immediately in front of him. The list of such electronic travel aids is long and will not be treated in detail here. This research has been augmented by other studies of mobility technique and performance, such as the optimization of long cane technique, the avoidance of obstacles, the maintenance of a straight path, and so on.

Obstacle Detectors

The most popular method of obstacle detection has been the transmission of ultrasonic waves and the decoding of received reflections to sense the presence and sometimes the range of an object in the travel path. Many obstacle detection devices of varying types have been developed. Some, such as the Mowat Sensor (Pressey, 1977), are hand-held devices which can be aimed in the fashion of a flashlight. In the case of the Mowat Sensor, tactile feedback on the range of objects within the beam is provided using vibrations whose frequency increases as the object comes closer. Another class of devices, such as the Russell Pathsounder and more recently developed variations, is worn around the neck and provides vibratory or auditory feedback on obstacle range in discrete distance increments. These devices are sometimes referred to as clear path indicators; they are intended to indicate whether the user has a clear travel path in front of him.
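As a rough illustration of the principle (not a description of any actual device firmware), the sketch below converts an ultrasonic echo delay into a range estimate and then into a vibration rate that rises as the obstacle gets closer; the speed of sound is the only physical constant involved, and the range limits and vibration frequencies are assumed values.

```python
# Illustrative sketch only: how an ultrasonic obstacle detector can turn an
# echo delay into a range estimate and a vibration rate, in the spirit of the
# Mowat Sensor described above. The mapping and constants are assumptions,
# not the specifications of any actual device.

SPEED_OF_SOUND = 343.0  # metres per second in air at room temperature

def range_from_echo(delay_s: float) -> float:
    """Range to the reflecting object from the round-trip echo delay."""
    return SPEED_OF_SOUND * delay_s / 2.0

def vibration_rate(range_m: float, max_range_m: float = 4.0,
                   min_hz: float = 10.0, max_hz: float = 40.0) -> float:
    """Map range to a vibration frequency that rises as the object gets
    closer (hypothetical values chosen only for illustration)."""
    if range_m >= max_range_m:
        return 0.0                      # nothing within the beam: no vibration
    closeness = 1.0 - range_m / max_range_m
    return min_hz + closeness * (max_hz - min_hz)

# Example: an echo returning after 12 ms corresponds to an object about 2 m away.
echo_delay = 0.012
r = range_from_echo(echo_delay)
print(f"range = {r:.2f} m, vibration = {vibration_rate(r):.1f} Hz")
```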

None of these devices is intended to replace the user’s primary travel aid; for one thing, detection of low-lying objects, path edges or dropoffs is not reliably within the devices’ capabilities. An exception is the Laser Cane (Nye, 1973), which incorporates laser sensors within the long cane itself to detect objects ahead of and above the user, and signal him via vibratory tactile feedback built into the cane handle. Through a novel configuration of transmitting and receiving elements, this device can detect dropoffs in front of the user.

Environmental Sensors

As implied above, developments in electronic mobility aids have ranged from simple obstacle detectors to sophisticated information displays that could also identify objects, textures, and flow patterns. In the early years, the most sophisticated device that became commercially available was the Kay Sonic Torch (Kay, 1964), a handheld device that used an FM sonar with an information-rich auditory display. The auditory information was presented to the user via an earphone. Range was indicated by the pitch of the auditory signal, and information about the nature of the objects detected was provided through the timbre of the sounds. In this type of system, environmental objects such as picket fences, hedges, and poles each caused characteristic sound patterns. Even differences in surface textures of the ground in front of the traveler could be detected, such as the difference between the hard sidewalk surface and the adjacent grassy verge.

Because of these characteristics, the Sonic Torch could be used as a primary mobility aid, replacing the cane or dog guide. Subsequently, Kay adopted a different approach that deliberately retained the cane or dog as the primary aid and supplemented it with a version of the sonar sensor worn on the head. This new device (Kay, 1974) was later commercialized as the well-known Sonicguide, perhaps the most influential electronic mobility aid produced to date. The Sonicguide provided a display of range and object identification information similar to that of the Sonic Torch, but employed a wide “field of view” in place of the narrow Sonic Torch transmitter beam, along with two receiving channels that gave a binaural display with interaural intensity difference as a directional cue. Thus, with suitable training, the user could obtain information not only on the range and nature of objects, but also on direction and motion flow patterns.
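The frequency-modulated sonar principle underlying these displays can be illustrated with a small worked example: the transmitted frequency sweeps steadily, the echo returns as a delayed copy, and the difference between the two is an audible tone whose pitch grows with range. The sweep rate and the resulting tone values below are illustrative assumptions, not Kay’s actual design parameters.

```python
# Minimal sketch of the FM sonar principle used in the Sonic Torch and
# Sonicguide: the transmitted frequency sweeps linearly, the echo is a delayed
# copy, and the difference ("beat") between the two is proportional to range,
# so more distant objects produce higher-pitched tones. The sweep rate and
# audio scaling are assumed values, not Kay's design figures.

SPEED_OF_SOUND = 343.0      # m/s
SWEEP_RATE = 40_000.0       # change in transmitted frequency per second (assumed)

def beat_frequency(range_m: float) -> float:
    """Audible beat frequency produced by an object at the given range."""
    round_trip_delay = 2.0 * range_m / SPEED_OF_SOUND
    return SWEEP_RATE * round_trip_delay

for r in (0.5, 1.0, 2.0, 4.0):
    print(f"object at {r} m -> beat tone of about {beat_frequency(r):.0f} Hz")
```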

Other notable developments in the field of ultrasonic mobility aids, not easily categorized as obstacle detectors or environmental sensors, include the incorporation of multiple ultrasonic channels under microprocessor control, as in the Sonic Pathfinder (Heyes, 1984), which assists such functions as finding a safe path through a gap such as a doorway.

An entirely different approach to sensing nearby objects was taken by Collins, Bach-y-Rita, Scadden, and others (Collins et al., 1977) in the pursuit of tactile display of image information obtained from video cameras. This approach utilized optical-to-tactile image conversion, which provided a point-for-point representation of a video camera image on the skin using a large two-dimensional array of electrical or vibratory (mechanical) tactile stimulators. This approach, like the Sonicguide, presented the user with an information-rich display, to the point where it needed to be simplified to make it more readily understandable. This led to several other experimental developments, such as presentation of a scanning sonar image (much like a ship’s radar display) on the skin (Brabyn et al., 1981). Ultimately, commercial realization was impeded by system cost and complexity, but the tactile presentation of information has received continuing attention, especially as an educational aid to present visual concepts through another medium.

The Orientation/Navigation Problem

Orientation and navigation refer to one’s ability to know where one is in the pedestrian environment and to locate waypoints, landmarks, and destinations. The primary mobility aids, such as the cane or dog guide, supplemented by hearing and echolocation cues, have proven to be excellent and most difficult to improve upon technologically for the “mobility” part of the equation: finding and following a safe path through the immediate environment. The “orientation” component, however, offers considerable potential for technology to perform functions that the traveler might not otherwise be able to perform and to support his primary mobility aids and skills. In many situations in everyday travel, such as transportation facilities, crossroads, shopping malls, and parks, a major unsolved problem for a blind pedestrian is to know exactly where he is, where his destination is located, and how to get there from where he is.

Of the many devices which have been developed (using ultrasonics or other transmission media) to sense the environment immediately in front of the blind pedestrian, some can also be used to assist orientation and navigation. This is especially true of those, like the Sonicguide, that can help in identifying landmarks and thus assist the user in establishing his position. By detecting a consistent sequence of such landmarks, the user can reinforce his internal map and gain an improved concept of his exact location. The type of information presented, however, is by its nature generic, in that all poles cause similar sounds, whether they are bus stops, lamp posts, or traffic lights.

The recent emergence of new navigation-related technologies in military and civilian applications, coinciding with legislative activity and governmental program initiation in North America and Europe, has now definitely shifted the focus of technological development to the orientation aspect of travel. At the same time as new technologies are becoming possible, governmental and regulatory activity has improved the prospects that there will be a viable market–or even a mandate–for navigation systems to make the environment accessible to the blind and visually impaired population. In the United States, for example, there is a mandate within the Americans with Disabilities Act (ADA) to provide access to public facilities. For a blind individual, the term “access” in this context means largely the ability to find the facility; braille and raised print signage do not achieve this goal since the blind individual must search around to find them. Thus, other forms of navigational assistance are needed.

Some of the different approaches to this problem that either have been or are currently being investigated include various forms of remotely readable signs or labels for the environment, dead reckoning systems, map reading technologies, and absolute positioning systems.

Remotely Readable Signs

The ability to read signs and recognize environmental objects, facilities, and landmarks from a distance is key to efficient orientation and mobility. This can be dramatically demonstrated to a sighted individual traveling in a situation where all the signs are either missing or in a foreign script. Ubiquitous signs comprise a menu of choices for sighted travelers, confronting them with the options available at any given point in their travel. In a sense, they also act as a form of memory; signs remind travelers about important characteristics of the environment.

The first system of remotely readable signs for blind users to be developed was the Talking Signs System (Loughborough, 1979). The concept was to provide coded infrared transmitters at any location where a sign or label is desirable, so that a blind pedestrian with a suitable receiver could read it at a distance in the same manner as sighted travelers read printed signs. Talking Signs transmitters continuously transmit signage information from infrared light-emitting diodes (LEDs) located at the position of each sign or labeled location. The infrared signal is decoded by a handheld receiver carried by the blind individual to produce a directionally selective voice message. The directional selectivity is a characteristic of the infrared message beam, so that the intensity and clarity of the message increase as the receiver is pointed at the sign or the sign is approached. Thus, the individual using the device can get feedback about his location relative to the goal as he moves toward it. The signs are light and small, easy to install, consume very little power, operate indoors and outdoors, and are easy to program with human or synthesized voice messages. Because different signs have different functions, the range and angle of coverage of each sign is adjustable.
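One way to picture the receiver behavior described above is as a loop that decodes whichever message is currently being received and speaks it at a loudness that follows the received infrared intensity, so the message grows louder and clearer as the user homes in on the sign. The sketch below is purely hypothetical; the message identifiers, table, and values are invented for illustration and do not describe the actual Talking Signs implementation.

```python
# Purely hypothetical sketch of how a directionally selective remote-signage
# receiver could give "homing" feedback: the stronger the decoded infrared
# signal (the more directly the receiver points at the sign, or the closer
# the user gets), the louder the spoken message. None of the identifiers or
# values below describe the real Talking Signs system.

SIGN_MESSAGES = {           # example message table; invented for illustration
    0x01: "Elevator",
    0x02: "Restroom",
    0x03: "Platform 2, outbound trains",
}

def feedback(decoded_id, signal_strength: float):
    """Return the message to speak and a playback volume in [0, 1]."""
    if decoded_id is None or decoded_id not in SIGN_MESSAGES:
        return "", 0.0                      # no sign within the beam
    volume = max(0.0, min(1.0, signal_strength))
    return SIGN_MESSAGES[decoded_id], volume

# Sweeping the receiver toward the sign: the same message gets louder.
for strength in (0.2, 0.5, 0.9):
    msg, vol = feedback(0x03, strength)
    print(f"'{msg}' at volume {vol:.1f}")
```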

In recent years, the Talking Signs System has become commercially available, and demonstration installations include the Carroll Center for the Blind, the Lighthouse in New York, the Washington, DC, Metro, the Texas School for the Blind, and a number of facilities in San Francisco, such as the Powell Street Muni/BART subway station, the five-way street intersection above it, the public library, and public toilets. Many other installations are under construction or planned and similar systems are under development by companies and researchers in the United States and Europe. The system has been evaluated in a number of psychophysical studies (Brabyn & Brabyn, 1982, 1983; Schenkman, 1986; Bentzen, 1993) and reception by blind users has been enthusiastic. Clearly, remote signage technology can provide unique advantages to the blind or visually impaired traveler in allowing him to locate and recognize signs and landmarks from a distance both indoors and outdoors.

Remote signage and interactive environment systems using various forms of radio transmission as a medium have also been proposed and developed. The first was a system developed by Kelly (1981) utilizing garage door opener technology. The concept in this case was to place receivers at the locations of the signs or desired landmarks (restrooms, lifts) and to have blind persons interrogate the receivers with handheld transmitters. The use of radio transmission has since been explored by others, including Main (1991), Urband and Stuart (1992), the Fanmark Technology Corporation, and the RNIB REACT system. Variations on these themes include the responsive environment concept and similar approaches using passive, credit-card-sized tags which can absorb and retransmit appropriately coded radio signals from a suitable computerized transmitter (Jaffe, 1992). Proximity-triggered talking signs have also been tested (Jones, 1991). Another system, known as Verbal Landmarks, emerged in 1992. This used radio frequency transmissions from loops embedded in the surrounds of doorways to identify entrances and other waypoints. A user’s receiver within range of the Verbal Landmark transmitter announces the location as a verbal message.

In 1993, the American Council of the Blind proposed an evaluation of the infrared- and radio frequency-based technologies. The resulting comparison study (Bentzen, 1993), in which examples of both were tested in a hotel setting, showed a preference for the infrared-based technology both in objective travel measures (travel time and travel distance) and in subjective user opinion. This difference is largely due to the innate ease of localization and beam pattern control with infrared transmitters and receivers.

Dead Reckoning Systems

Another approach to orientation and navigation is the use of dead reckoning navigation to establish the pedestrian’s position. Dead reckoning utilizes knowledge of a starting point, coupled with input from sensors such as a compass and a pedometer or other distance-measuring device, to establish present position. Once position is established, reference can be made to computer-stored maps or route descriptions to give the user a large amount of information about his surroundings and his desired route. A test of the potential of this approach was carried out by the author and others (Milner & Gilden, 1988; Jampolsky, Brabyn, & Gilden, 1989). The sensor technology evaluated consisted of a K-band Doppler radar to measure distance traveled, in combination with a fluxgate compass for directional information. The sensors were ultimately to be coupled to a microcomputer which would incorporate a stored map of the area of interest, along with considerable descriptive and route instruction information to guide a user from anywhere within the mapped area to anywhere else. In the initial, simplest embodiment of the system, however, route information would be prerecorded on a cassette tape rather than stored in a computer. To record a cassette for a given route, a sighted person would accompany the blind traveler and push a button to put a code on the tape as each landmark is reached. The sighted guide would also record a verbal description of the landmark (bus stop, corner, etc.) as well as a running commentary on the points of interest between landmarks as the route is traversed.
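The arithmetic behind dead reckoning is straightforward: each new heading and distance reading moves the position estimate forward from the known starting point. The sketch below illustrates the idea with invented readings; it does not reproduce the processing used in the Doppler radar and compass system described above.

```python
import math

# Schematic dead-reckoning update: starting from a known point, each new
# (heading, distance) reading from a compass and a distance sensor advances
# the position estimate. The readings below are invented for illustration.

def dead_reckon(start_xy, readings):
    """Accumulate position from (heading_degrees, distance_metres) readings.
    Headings are compass bearings: 0 = north, 90 = east."""
    x, y = start_xy
    for heading_deg, distance_m in readings:
        heading = math.radians(heading_deg)
        x += distance_m * math.sin(heading)   # east component
        y += distance_m * math.cos(heading)   # north component
    return x, y

# Walk 20 m north, then 10 m east, then 5 m south from the origin.
position = dead_reckon((0.0, 0.0), [(0, 20), (90, 10), (180, 5)])
print(f"estimated position: {position[0]:.1f} m east, {position[1]:.1f} m north")
```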

The basic technology for this type of system was found to be feasible in 1988, when simulation results from tests with a dozen blind subjects were extremely positive (Jampolsky, Brabyn, & Gilden, 1989); at the time, however, the collaborating manufacturer did not feel the commercial prospects of the project warranted further expenditure. When these explorations were made, the available technology for position sensing, electronic map storage, and high-powered portable computing was considerably less advanced than is now the case; it would now be possible to apply many of the same concepts while achieving greater positioning accuracy and lower cost.

Global Positioning Systems (GPS)

The advent of the highly accurate Global Positioning System (GPS) has led to new experimentation and development work on blind navigation. GPS, originally developed for military purposes, utilizes a network of satellites in very precise orbits around the earth transmitting precisely timed microwave signals. Suitable receivers on the earth’s surface can use the signals from four or more satellites to estimate longitude, latitude, and altitude. Codes made available for civilian use allow positioning accuracy to within approximately 100 meters with “selective availability” imposed and approximately 15 meters without it. Greater accuracy (1-15 meters) can be obtained with differential GPS, which uses an additional reference station at a known location on the ground to broadcast corrections.

Experimental and developmental work on applying this system to assist blind travelers has been undertaken by several researchers and companies, including Loomis and Golledge (1993), Bornschein, Balachandran, Frank, and Arkenstone, Inc., and others. The Arkenstone system has been pursued to the stage of a commercial product, Atlas Strider, which combines a talking map system specially designed for blind users integrated with information from a GPS receiver.

In conjunction with suitable stored map information, computing power, and a carefully designed information display interface, GPS technology offers many possibilities for providing orientation and navigation support to a blind traveler. It can be used alone as a position sensor or combined with other dead reckoning sensors to provide greater accuracy. It could be used to provide the kind of verbal waypoint, landmark, route instruction, and ongoing commentary information described in the section above on dead reckoning systems, without the complication of regular recalibration.
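A minimal sketch of one such combination, assuming simple blending rather than the formal filtering a real navigation system would use: dead reckoning carries the position estimate between GPS fixes, and each fix pulls the estimate back toward the measured position to correct accumulated drift. All sensor values and weights below are invented for illustration.

```python
import math

# Illustrative fusion of GPS fixes with dead reckoning: dead reckoning updates
# the estimate continuously, and each GPS fix nudges the estimate toward the
# measured position. This simple blend stands in for the more formal filtering
# a real system would use; all numbers are invented for illustration.

class PositionEstimator:
    def __init__(self, x=0.0, y=0.0, gps_weight=0.3):
        self.x, self.y = x, y
        self.gps_weight = gps_weight   # how strongly a GPS fix corrects the estimate

    def step(self, heading_deg, distance_m):
        """Dead-reckoning update from compass heading (0 = north) and distance."""
        h = math.radians(heading_deg)
        self.x += distance_m * math.sin(h)
        self.y += distance_m * math.cos(h)

    def gps_fix(self, gps_x, gps_y):
        """Blend in an (x, y) GPS fix to correct accumulated drift."""
        w = self.gps_weight
        self.x = (1 - w) * self.x + w * gps_x
        self.y = (1 - w) * self.y + w * gps_y

est = PositionEstimator()
for _ in range(10):
    est.step(heading_deg=2.0, distance_m=1.0)    # slight compass error accumulates
est.gps_fix(0.0, 10.0)                           # GPS says we walked straight north
print(f"estimate after fix: x = {est.x:.2f} m, y = {est.y:.2f} m")
```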

GPS is ideal for providing approximate position estimates in open areas. Because of the line-of-sight nature of the microwave signals used by the system, however, accuracy and operation may be compromised in many areas of interest to the blind traveler, such as narrow downtown streets lined with tall buildings, indoor areas (hotels, office buildings, and shopping malls), and underground facilities (e.g., transit stations). Because of limitations in accuracy, precise location by the blind traveler of vital points in public areas, such as restroom doorways, ticket machines, public telephones, and subway entrances, may require supplementary technologies. This suggests that any comprehensive O&M support system will ultimately combine several technologies.

Future Possibilities

A number of other approaches to the design of orientation and navigation systems are possible, and some have been tested. Methods utilizing forms of image processing or artificial recognition of environmental features have been explored by several investigators following the pioneering work of Collins (1982). Systems of this nature could conceivably be sophisticated enough to locate and read print signs and recognize objects such as public telephones and many other features of the environment. Variations on satellite navigation systems have been proposed that eliminate some of the restrictions on GPS operation. Various forms of electronic map reading technologies have been or are being investigated, and these can be interfaced with locating and positioning systems. Various inertial navigation systems have been investigated and may provide an adjunct to other sensors in increasing accuracy and directional information. It can be expected that developments in technology well beyond the field of sensory aids will provide yet more possibilities for enhancing the existing approaches or providing new alternatives.

Discussion

With few exceptions, all of the technological developments in this field, past, present, and future, envisage the technology in question as operating in support of the blind traveler’s primary mobility aid and skills. Each of the major technological approaches to providing useful support information to the blind traveler has advantages and disadvantages. Ideally, technology should provide the traveler with the supporting information he/she wants and needs, in a convenient, easily understood manner. The option of receiving information only when it is wanted is one major factor in these considerations. Ultimately, the user should have enough information available to allow effortless, stress-free travel, with direct location (without the need to search tactually) of the key points of interest, whether they are doorways, drinking fountains, telephones, ticket machines, or others. And, of course, the ideal system would have universal coverage of all areas to which the user may wish to go.

Because no system presently meets all these requirements, the advantages of combining different approaches should, in the author’s view, be explored. The pinpoint accuracy, user convenience, and directionality of remote signage systems, for example, could be combined with the approximate vicinity locations and verbal commentaries obtainable from the computer-interfaced GPS and dead reckoning-based technologies described above, which would also give coverage of those areas where remote signage is not installed. Many permutations and combinations are possible, and the continuing trends toward miniaturization and cost reduction in electronic systems make consideration of combinations of technologies more feasible. These trends also make it likely that more and more users will gradually take advantage of the supplementary travel information that technology will provide.

Bibliography

[1.] Bentzen, B., & Mitchell, P. (1993). Audible signage as a wayfinding aid: comparison of Verbal Landmark and Talking Signs. Draft report to the American Council of the Blind, Sept. 1993.

[2.] Brabyn, J. A., Collins, C. C., & Kay, L. (1981). A wide-band CTFM scanning sonar with tactile and acoustic display for persons with impaired vision (blind divers, etc.). Proceedings of the Ultrasonics International Conference, Brighton, England.

[3.] Brabyn, J. A., & Brabyn, L. A. (1982). Speech intelligibility of the Talking Signs. Journal of Visual Impairment and Blindness, 76, 77-78.

[4.] Brabyn, L. A., & Brabyn, J. A. (1983). An evaluation of “Talking Signs” for the blind. Human Factors, 25(1), 49-53.

[5.] Brabyn, J. A. (1985). A review of mobility aids and means of assessment. In D. H. Warren & E. R. Strelow (Eds.), Electronic spatial sensing for the blind. Boston: Martinus Nijhoff.

[6.] Carver, D. (1992). A more accurate GPS for all. Motor Boat & Sailing, 169(5), 76.

[7.] Collins, C. C., Scadden, L. A., & Alden, A. B. (1977). Mobility studies with a tactile sensing device. Conference on Systems and Devices for the Disabled, Seattle.

[8.] Crandall, W., Gerrey, W., & Alden, A. (1993). Remote signage and its implications to print-handicapped travelers. Proceedings, RESNA Conference, June 12-17, Las Vegas.

[9.] Heyes, A. D. (1984). The Sonic Pathfinder: a new electronic travel aid. Journal of Visual Impairment and Blindness, 78, 200-202.

[10.] Jaffe, D. (1992). Responsive Environment. Proceedings, CSUN Conference on Technology for Persons with Disabilities.

[11.] Jampolsky, A., Brabyn, J., & Gilden, D. (1989). Simulation studies of a new navigation device for the blind. Journal of Rehabilitation Research & Development, 26 (Annual Supplement), 384-385.

[12.] Jones, D.A. (1991). Talking Signs: the sound of things to come. New Beacon, 75, 891.

[13.] Kay, L. (1964). An ultrasonic sensing probe as an aid to the blind. Ultrasonics, 2, 53.

[14.] Kay, L. (1974). A sonar aid to enhance spatial perception of the blind: Engineering design and evaluation. The Radio and Electronic Engineer.

[15.] Kelly, G. W. (1981). Sonic Orientation and Navigational Aid (SONA). Bulletin of Prosthetics Research, 1, 189.

[16.] Loomis, J., & Golledge, R. (1993). Personal Guidance System using GPS, GIS, and VR technologies. Proceedings, CSUN Conference on Virtual Reality and Persons with Disabilities, San Francisco, June 17-18.

[17.] Loughborough, W. (1979). Talking Lights. Journal of Visual Impairment and Blindness.

[18.] Loughborough, W. (1990). Orientation: The missing factor in O & M. Proceedings, 5th Annual Conference on Technology for Persons with Disabilities, CSUN.

[19.] Main, R. (1991). Intelligent sensing device for improving mobility for the blind. Proceedings, CSUN conference on Technology for Persons with Disabilities, 589-604.

[20.] Milner, R., & Gilden, G. (1988). Navigation device for the blind. Proceedings, ICAART 88–Montreal, 214-5.

[21.] Nye, P. W. (Ed.) (1973). A preliminary evaluation of the Bionic Instruments–Veterans Administration Laser Cane. National Academy of Sciences Final Report.

[22.] Pressey, N. (1977). Mowat Sensor. Focus, 11, 35-39.

[23.] Russell, L. (1965). Travel Path Sounder. Proceedings, Rotterdam Mobility Research Conference. New York: American Foundation for the Blind.

[24.] Schenkman, B. N. (1986). The effect of receiver beam width on the detection time of a message from Talking Signs, an auditory orientation aid for the blind. International Journal of Rehabilitation Research, 9(3), 239-246.

[25.] Urband, E., & Stuart, R. (1992). Orientation enhancement through integrated virtual reality and geographic information systems. Proceedings, CSUN Virtual Reality and Persons with Disabilities, 55-62.

Dr. Brabyn is Director of Rehabilitation Engineering Research at Smith-Kettlewell Eye Research Institute, San Francisco.
