|American Ash tree, at start of the Tactile Tour|
|Fauna Sculpture, Benson Shaw|
My student Sarah Bair and I have been working on a tactile tour of the outdoor area surrounding the Museum, so that low-vision and no-vision visitors can explore nature and art through the sense of touch. We hope the tour will work with the ExploreCentral mobile app that the Computer Science students are developing in collaboration with my Museum Studies students.
We have started with several trees in the campus arboretum, just behind the Museum. (Sarah notes that it is more difficult for the blind to navigate when they are off the sidewalk, so for the moment we are concentrating on trees immediately along the main paved sidewalk that meanders through the arboretum.) We are writing tactile descriptions of each tree’s bark, as well as season-specific notes on other tactile-accessible elements, such as needles, pine cones, buds, and leaves. We will supplement this with botanical information from Biology Department faculty. Our hope is to have audio segments, geared to the blind, on each of these trees.
It is fortuitous that the eight vase sculptures in Benson Shaw’s “Resources” public art project are all accessible to the sense of touch. Each vase is devoted to a different natural or social resource, such as “Sun” or “Fauna” or “Community,” and displays punched-out metal shapes that are well suited to haptic exploration. Thus “Fauna” centers on the metal shape of an owl with outstretched wings, and “Flora” displays touchable plants with a root structure.
In time, we hope to expand this into a tactile tour of the whole campus; we’ll need to identify touch experiences around campus that offer significant natural history or aesthetic encounters, supplemented by audio segments through the mobile app. We are not quite sure how low-vision/no-vision visitors will navigate. One possibility is a relief map of the campus affixed to a wall in Dean Hall outside the museum; another is a portable relief map with braille that could be handed to blind visitors.
We aren’t sure yet whether the speech output built into the Android system will work well with the mobile app: the idea eventually is that as a user moves her/his finger over a button, the button’s text will be read aloud. We may also need to create a separate category, such as "TACTILE", within the mobile app to make it as easy as possible for users to find these Points of Interest (in addition to tagging them with "Art" or "Nature", etc.). We’ll clearly need to keep working on this experimentally as we develop the accessibility of the product.
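The tagging idea above could be sketched as a small data model. This is only a hypothetical illustration of how a "TACTILE" tag might sit alongside "Art" and "Nature"; the names `PointOfInterest` and `filterByTag` are our assumptions, not the actual ExploreCentral API.

```java
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public class TactileTags {
    // A Point of Interest can carry several tags at once, so "TACTILE"
    // supplements rather than replaces categories like "Art" or "Nature".
    record PointOfInterest(String name, Set<String> tags) {}

    // Return only the POIs carrying the requested tag.
    static List<PointOfInterest> filterByTag(List<PointOfInterest> pois, String tag) {
        return pois.stream()
                   .filter(p -> p.tags().contains(tag))
                   .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<PointOfInterest> pois = List.of(
            new PointOfInterest("American Ash", Set.of("Nature", "TACTILE")),
            new PointOfInterest("Fauna vase (Benson Shaw)", Set.of("Art", "TACTILE")),
            new PointOfInterest("Dean Hall", Set.of("Architecture"))
        );
        // A "TACTILE" filter surfaces the two touchable stops
        // without disturbing their existing Art/Nature tags.
        for (PointOfInterest p : filterByTag(pois, "TACTILE")) {
            System.out.println(p.name());
        }
    }
}
```

The point of the multi-tag design is that a tactile filter can be layered onto whatever categories the app already has, rather than requiring a parallel list of tactile POIs.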