Developing Visual Editors for High-Resolution Haptic Patterns

Short paper at Haptic and Audio Interaction Design, 7th International Conference (HAID 2012), Lund, Sweden.
By Stahl Stenslie, Tony Olsson, Andreas Göransson, David Cuartielles.
Excerpt from the text. To read the full text follow this link

 

In this article we give an overview of our iterative work in developing visual editors for creating high-resolution haptic patterns for use in wearable haptic feedback devices. Over the past four years we have found it necessary to address how to represent, construct, and edit high-resolution haptic patterns so that they translate naturally into the user's haptic experience. To answer this question we have developed and tested several visual editors.

A haptic vocabulary is a 'toolbox' containing different ways and methods for touching users. The earlier Erotogod installation (2003) used a two-way bodysuit that both sensed users through input and touched them through vibrotactile stimulators. Here three kinds of touch patterns formed its haptic vocabulary: i) the ground, basic patterns used in every part of the installation's dramaturgy, ii) the designed, specific patterns scripted as particular parts or sequences of the dramaturgy, and iii) the random patterns generated in response to the user's touch. The patterns were made without a visual editor. To aid the making of specific touches, the patterns were first laid out as drawings marking positions and movement before being coded directly into the software.
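The three pattern classes described above can be sketched as simple data structures. The following is a hypothetical representation (the function names, fields, and values are our own illustration, not the installation's actual software), in which a pattern is a timed sequence of (actuator, intensity, duration) events:

```python
import random

# Illustrative sketch only: a haptic pattern is modeled as a list of
# (actuator_id, intensity 0.0-1.0, duration_ms) tuples.

def ground_pattern():
    """Ground pattern used throughout the dramaturgy: a slow, even sweep."""
    return [(i, 0.3, 200) for i in range(8)]

def designed_pattern():
    """Designed pattern scripted for one specific dramaturgical sequence."""
    return [(0, 1.0, 50), (3, 0.8, 50), (7, 1.0, 100)]

def random_pattern(touch_position, n_events=5, seed=None):
    """Random pattern generated in response to a user's touch near touch_position."""
    rng = random.Random(seed)
    return [(max(0, touch_position + rng.randint(-1, 1)),  # jitter around touch
             rng.uniform(0.2, 1.0),                        # random intensity
             rng.randint(30, 150))                         # random duration
            for _ in range(n_events)]
```

A driver would step through such a list, switching each vibrotactile stimulator on at the given intensity for the given duration.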
Much like a textual language, a haptic language consists of a hierarchical structure in which the lowest layer defines the most basic components of the language, analogous to the characters of a text. These basic components are then combined into more complex structures that produce another level of understanding; in our textual example this would be words. Multiple second-tier constructs are then combined into the communicated message. In our visual editors we have tried to facilitate both exact control of single vibrator outputs (analogous to characters) and the formation of combinations of multiple outputs into higher-level, meaningful haptic expressions (analogous to 'words').

Haptic languages can be subdivided into the alphabetic, which use tactile cues to form actual words (Braille, telegraph), and the conceptual, symbolic, and non-verbal, which attempt to form meaning through emotions and embodied sensations (hand gestures and body language) [1]. In this project we focus on the symbolic expression of haptic language. Although less exact, it is much faster and is experienced as more direct by most users.
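The character/word analogy can be made concrete with a small sketch. All names here are hypothetical illustrations of the hierarchy, not the editors' actual data model: a primitive fixes one vibrator's output (a 'character'), a 'word' combines primitives into a meaningful expression, and a message sequences words:

```python
# Hypothetical sketch of the hierarchy: primitive -> word -> message.

def primitive(actuator, intensity, duration_ms):
    """A 'character': one vibrator's output for one interval."""
    return {"actuator": actuator, "intensity": intensity, "ms": duration_ms}

def word(name, *primitives):
    """A 'word': a named combination of primitives forming one haptic expression."""
    return {"name": name, "events": list(primitives)}

def message(*words):
    """The communicated message: a sequence of haptic 'words'."""
    return list(words)

# Two example expressions built from single-vibrator primitives.
heartbeat = word("heartbeat", primitive(0, 1.0, 60), primitive(0, 0.6, 120))
sweep = word("sweep", *(primitive(i, 0.5, 80) for i in range(4)))

msg = message(heartbeat, sweep, heartbeat)
```

An editor working at the lowest tier manipulates individual primitives; one working at the higher tier arranges whole 'words' such as `heartbeat` and `sweep` on a timeline.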

[1] Chang, A., O'Modhrain, S., Jacob, R., Gunther, E., and Ishii, H. 2002. ComTouch: Design of a Vibrotactile Communication Device. In Proc. ACM DIS 2002 Designing Interactive Systems Conference, ACM Press.


 
