Introduction
As technology becomes increasingly sophisticated, consumers expect more powerful and natural user interfaces than have previously been available (Shan, 2010). Movement and gesture are innate human characteristics, which makes their use attractive in the control of products (Costello & Edmonds, 2007) and likely to be important in the era of ubiquitous or pervasive computing (Abawajy, 2009). A new generation of motion controllers, currently used mainly for gaming, such as the Microsoft Kinect for Xbox 360 (Microsoft, 2010), is expected to appear in an increasing range of products, making whole-body interaction with technology a reality. This process has already begun, with the use of touch interfaces combined with accelerometers and gyroscopes on tablets and smartphones, initiated by the iPhone (http://www.toshiba.com).
The use of gesture, however, introduces a range of complex factors, including culture (Rico & Brewster, 2009; Yammiyavar, 2008), ergonomics (Fikkert, 2010; Saffer, 2008) and emotional response (Larssen, Robertson, & Edwards, 2006). Of these, emotion is the least understood, with the field of Emotional Design (Norman, 2004) emerging comparatively recently to address unrewarding and in some cases problematic user experiences. A product or machine may well 'do the job', but a positive emotional reaction is fundamental in ensuring that the interaction is pleasurable (Benyon, Hook, & Nigay, 2010). While it has been demonstrated that the use of gesture in gaming can engender positive emotions in players (Isbister & DiMauro, 2011; Lindley, Couteur, & Berthouze, 2008), and gaming has driven much of the technology of gestural control, it is necessary to move beyond simply manipulating avatars and consider how movement can be used as a fundamental part of interaction with machines in our everyday lives.
These emergent technologies herald a shift in emphasis from designing interfaces for use to designing the interactions of use: the fundamental way in which we execute product operations. Gesture-based interaction is becoming increasingly important in achieving this, as it brings the functionality of machine operation and the user's means of interaction closer together. By better understanding how we react to the use of gestures in a practical setting, future designers would then be able to select and utilize appropriate gestures for different product operations and functionality. The aim of this research is therefore to explore what emotions and feelings gestures engender in users when interacting with sophisticated devices and systems.