Research into wearable interfaces, cross-modal perception and communication for the visually impaired on the mediated stage.
"E-skin" is a set of wearable interfaces, which constitute our past and present attempts to electronically simulate the perceptive modalities of the human skin: pressure, temperature, vibration and propioception. These four modalities constitute our biggest human organ, constantly detecting and reacting to environmental realities. The interfaces explore the cross-modal potentials of tactile and acoustic feedback, the enhanced orientation of cognitive mapping and the need to embody the interaction in the digital environment.
Today our cultural events are dominated by sight and sound, and hardly at all by the combined senses of touch and hearing. Very few theatre, dance or art events exist in which visually impaired people can participate. The project includes research into orientation, cognitive mapping and the control of external audio-visual devices. The e-skin projects are artificial systems that can interact intelligently with people on the real, digitally mediated stage.
Contributors to the project include:
Daniel Bisig and Vallerie Bugmann