Authored by Sarah Bell
As we delve into the Sensing Nature data, technology is emerging as a key factor in facilitating inclusive experiences with the natural world.
This might be as simple as providing access to detailed site information online, recognising that visiting an unfamiliar environment, whether a nature setting or otherwise, can be overwhelming for people with sight impairment. Pre-visit anxieties may be alleviated by providing information on the site layout, topography, surface materials, facilities and the availability of visitor centre staff or volunteers to support a visit.
In more extensive nature settings, for example along countryside or coastal paths, Sensing Nature participants have discussed the potential to use GPS technology to support efforts to ‘read’ and navigate the landscape more independently. Whilst people have flagged Apple Maps and dedicated devices, such as HumanWare’s Trekker Breeze, the full benefits of this technology currently seem limited to more urban nature settings, where path and road networks are more clearly defined.
Building on this, people have discussed opportunities to produce more refined mobile apps, combining talking compasses with audio maps that offer key orientation information, highlighting way-finding clues and key points of interest in the surrounding area. There have also been suggestions of installing Bluetooth beacons at key path intersections to support in-situ information provision, and the TacMap group in Sheffield are currently adapting their audio-tactile map production to better support orientation amongst sight-impaired users of public parks and gardens.
An interesting step towards these types of familiarisation tools is underway in the United States, where researchers at the University of Hawai’i have been user testing the “UniD” mobile app with blind and partially sighted visitors to Yosemite National Park. Developed by the “UniDescription” Project, the app is designed to make brochures at national parks accessible to people with sight impairment, and currently contains audio descriptions of over 50 National Park Service brochures in the US. Although not yet developed for use in wayfinding, the app does contain audio descriptions of key points of interest within the parks.
A number of people have also discussed the potential for technology to support non-visual approaches to identifying nature, for example through mobile apps such as ChirpOMatic, Warblr, and Chirp! Whilst the technology is not yet sensitive enough to reliably attribute a specific call or song to a single species, these types of applications can provide some indication of the kinds of birds that might be in the surrounding area.
For many people, however, it’s the characters, stories and quirks of different wildlife that capture the imagination. We’ll discuss these in a future news piece exploring how to make existing nature podcasts and live audio description more interesting and engaging for listeners with sight impairment.
Notably, while technology offers the potential to open new opportunities to connect and relate to nature amongst people living with sight impairment, it’s also important to address the needs and priorities of the large numbers of people without access to such devices, or without the confidence to use them. Without appropriate assistance – such as the brilliant services provided by organisations like UCanDoIT – technology can pose as many problems as it solves for those on the margins of such developments.
Sensing Nature will be considering these challenges and opportunities as we continue to make sense of the interview data throughout this year.