This paper explores the use of haptic stimuli as non-visual affordances to aid the learnability of bend gestures. We tested 48 haptic Tactons with participants simulating blindness to understand which haptic sensations could intuitively map to bend location and direction. We found that a short, single-motor Tacton reliably indicates bend location, while participants agreed that a combination of two motors with varying intensities could indicate bend direction. This work is the first to explore the use of Tactons to communicate bend gesture location and direction, with the eventual goal of creating a tactile interaction method for blind smartphone users.
Exploring Haptics for Learning Bend Gestures for the Blind
CHI EA: ACM Conference on Human Factors in Computing Systems Extended Abstracts, 2016