7.8.5.1 Haptic interaction

A range of advanced sensors embedded in HSIs allows operators to extend their ability to sense the state of the environment and the behaviour of artefacts within it by means of haptic devices or ‘tangible interfaces’. Such devices exploit the sense of touch to convey information, both by applying forces to the user (for example vibration and force feedback) and by sensing the user’s location and motion. This tactile stimulation can assist in detecting changing conditions, or the orientation of objects that the operator cannot handle directly because of hazards such as heat or radiation. More advanced devices can create the illusion of virtual objects in a computer simulation, allow the user to control those objects, and enhance the remote control of machines and devices (telerobotics). Again, the reader is referred to Buxton’s work (Buxton, 2011, chapters 7, 8, 9, 13, 14).
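To make the force-feedback idea concrete, the following minimal Python sketch illustrates one common technique, penalty-based haptic rendering, in which the device pushes back with a spring-like force proportional to how deeply the user’s probe penetrates a virtual surface. The device functions, the stiffness value and the simulated positions are hypothetical placeholders for a real haptic driver, not any particular API:

    import random

    STIFFNESS = 500.0   # N/m: spring constant of the virtual wall (assumed value)
    WALL_Z = 0.0        # the virtual wall lies in the plane z = 0

    def read_probe_position():
        # Hypothetical driver call; simulated here with small random positions.
        return (0.0, 0.0, random.uniform(-0.01, 0.01))

    def apply_force(fx, fy, fz):
        # Hypothetical driver call; here we just print the commanded force.
        print(f"force = ({fx:.1f}, {fy:.1f}, {fz:.1f}) N")

    def haptic_step():
        x, y, z = read_probe_position()
        if z < WALL_Z:                                      # probe is inside the wall
            penetration = WALL_Z - z
            apply_force(0.0, 0.0, STIFFNESS * penetration)  # spring force pushes it back out
        else:
            apply_force(0.0, 0.0, 0.0)                      # free space: no force

    for _ in range(5):   # a real device would run this loop at roughly 1 kHz
        haptic_step()

The high loop rate matters in practice: if the force is updated too slowly, the virtual surface feels soft or unstable rather than solid.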

A common example of haptic interaction in the form of vibratory feedback is found in the Sony, Xbox and Nintendo game controllers mentioned earlier. Haptic devices may also incorporate tactile sensors that measure the forces exerted by the user on the interface (Jones and Sarter, 2008). It is easy to imagine an operator using such a device to ‘feel’ the bearing vibration of a turbine while monitoring the spin-up process!
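As a rough illustration of that idea, the Python sketch below relays a measured bearing-vibration level to a rumble-capable controller by mapping vibration amplitude linearly onto rumble intensity. The plant and controller functions and the alarm threshold are assumptions made for illustration, not a real plant-sensor or controller API:

    ALARM_LEVEL = 7.1   # mm/s RMS: example alarm threshold (assumed value)

    def read_bearing_vibration():
        # Hypothetical plant interface: latest vibration velocity, mm/s RMS.
        return 3.2

    def set_rumble(intensity):
        # Hypothetical controller interface: 0.0 = off, 1.0 = full vibration.
        print(f"rumble intensity: {intensity:.2f}")

    def relay_vibration():
        v = read_bearing_vibration()
        # Scale so the controller saturates exactly at the alarm level.
        set_rumble(min(v / ALARM_LEVEL, 1.0))

    relay_vibration()

Saturating at the alarm level is one plausible design choice: the operator feels vibration grow smoothly during spin-up, and full-strength rumble unambiguously signals an alarm condition.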

7.8.5.2 Brain interaction

Recent state-of-the-art developments promise interaction possibilities considered impossible just a few years ago. For example, direct brain-machine interaction has long been regarded as science fiction (think of the 1984 novel Neuromancer by William Gibson or the 1999 movie The Matrix!), but it is fast becoming a reality. Consumer-oriented devices like Emotiv Systems’ Insight neuroheadset already demonstrate impressive capabilities for controlling devices and software. With devices like this, designers can dramatically enhance interactivity and the level of immersion in an application: the system can respond to a user’s facial expressions, adjust its behaviour dynamically in response to emotions such as frustration or excitement, and allow users to manipulate objects in an application, turn them on or off, or change their state simply by the power of thought. This is reality and no longer science fiction; it is not too hard to imagine that such devices will find their way into certain industrial applications within 20 years…
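As a sketch of how such an application might be structured, the Python fragment below reacts to a stream of detection events from a generic BCI headset. The event format, the labels and the confidence threshold are hypothetical and do not reflect the actual Emotiv API; real headsets expose similar detections through their own SDKs:

    CONFIDENCE_THRESHOLD = 0.8   # ignore weak, ambiguous detections (assumed value)

    events = [                                   # stand-in for a live event stream
        {"type": "mental_command", "label": "push", "confidence": 0.9},
        {"type": "emotion", "label": "frustration", "confidence": 0.85},
    ]

    def handle(event):
        if event["confidence"] < CONFIDENCE_THRESHOLD:
            return
        if event["type"] == "mental_command" and event["label"] == "push":
            print("toggling the selected object")   # 'thought' control of an object
        elif event["type"] == "emotion" and event["label"] == "frustration":
            print("simplifying the display")        # adapt the application to the user

    for event in events:
        handle(event)

The confidence threshold reflects a practical concern with current consumer BCI devices: detections are noisy, so an application should act only on strong, unambiguous signals.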