
Haptic technology

Haptic technology, also known as kinaesthetic communication or 3D touch, refers to any technology that can create an experience of touch by applying forces, vibrations, or motions to the user. These technologies can be used to create virtual objects in a computer simulation, to control virtual objects, and to enhance the remote control of machines and devices (telerobotics). Haptic devices may incorporate tactile sensors that measure forces exerted by the user on the interface. The word haptic, from the Greek ἁπτικός (haptikos), means 'pertaining to the sense of touch'. Simple haptic devices are common in the form of game controllers, joysticks, and steering wheels.

Haptic technology facilitates investigation of how the human sense of touch works by allowing the creation of controlled haptic virtual objects. Most researchers distinguish three sensory systems related to the sense of touch in humans: cutaneous, kinaesthetic, and haptic. All perceptions mediated by cutaneous and kinaesthetic sensibility are referred to as tactual perception. The sense of touch may be classified as passive and active, and the term 'haptic' is often associated with active touch to communicate or recognize objects.
One of the earliest applications of haptic technology was in large aircraft that use servomechanism systems to operate control surfaces. In lighter aircraft without servo systems, as the aircraft approached a stall, aerodynamic buffeting (vibration) was felt in the pilot's controls, providing a useful warning of a dangerous flight condition. Servo systems tend to be 'one-way': external forces applied aerodynamically to the control surfaces are not perceived at the controls, so this important sensory cue is lost. To address this, the missing normal forces are simulated with springs and weights. The angle of attack is measured, and as the critical stall point approaches, a stick shaker is engaged which simulates the response of a simpler control system. Alternatively, the servo force may be measured and the signal directed to a servo system on the control, also known as force feedback.

Force feedback has been implemented experimentally in some excavators and is useful when excavating mixed material such as large rocks embedded in silt or clay, as it allows the operator to 'feel' and work around unseen obstacles.

In the 1960s, Paul Bach-y-Rita developed a vision substitution system using a 20×20 array of metal rods that could be raised and lowered, producing tactile 'dots' analogous to the pixels of a screen. People sitting in a chair equipped with this device could identify pictures from the pattern of dots poked into their backs. The first US patent for a tactile telephone was granted to Thomas D. Shannon in 1973. An early tactile man-machine communication system was constructed by A. Michael Noll at Bell Telephone Laboratories in the early 1970s, and a patent for his invention was issued in 1975. In 1994, the Aura Interactor vest was developed.
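The stick-shaker behaviour described above, where the angle of attack is measured and the shaker engages as the critical stall point approaches, amounts to a simple threshold check. The following sketch illustrates the idea; the critical angle and safety margin are illustrative placeholders, not values from any real aircraft.

```python
def stick_shaker_active(aoa_deg, critical_aoa_deg=15.0, margin_deg=2.0):
    """Engage the stick shaker when the measured angle of attack (AoA)
    comes within a safety margin of the critical stall angle.
    The 15-degree critical AoA and 2-degree margin are illustrative
    assumptions, not figures for any particular aircraft."""
    return aoa_deg >= critical_aoa_deg - margin_deg

# Cruise flight, well below the stall: shaker stays off.
print(stick_shaker_active(5.0))   # False
# Approaching the critical angle: shaker engages as a tactile warning.
print(stick_shaker_active(13.5))  # True
```

In a real system the shaker output would drive an eccentric-mass motor on the control column; the point of the sketch is that the haptic cue substitutes for the aerodynamic buffeting a non-servo control system would transmit naturally.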
The vest is a wearable force-feedback device that monitors an audio signal and uses electromagnetic actuator technology to convert bass sound waves into vibrations that can represent actions such as a punch or kick. The vest plugs into the audio output of a stereo, TV, or VCR, and the audio signal is reproduced through a speaker embedded in the vest.

In 1995, Thomas Massie developed the PHANToM (Personal HAptic iNTerface Mechanism) system. It used thimble-like receptacles at the ends of computerized arms into which a person's fingers could be inserted, allowing them to 'feel' an object on a computer screen. In 1995, the Norwegian inventor Geir Jensen described a wristwatch haptic device with a skin-tap mechanism, termed Tap-in. The wristwatch would connect to a mobile phone via Bluetooth, and tapping-frequency patterns would enable the wearer to respond to callers with selected short messages. In 2015, the Apple Watch was launched; it uses skin-tap sensing to deliver notifications and alerts from the wearer's mobile phone.
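The bass-to-vibration conversion performed by the Aura Interactor vest can be illustrated with a minimal sketch: a moving-average low-pass filter isolates low-frequency content from the audio samples, and its magnitude drives the actuator. The function name and window size are assumptions for illustration, not details of the actual product.

```python
from collections import deque

def vibration_drive(samples, window=4):
    """Map an audio sample stream to actuator drive levels.
    A moving average acts as a crude low-pass filter: bass-heavy
    content passes through and produces strong drive, while
    high-frequency content averages toward zero. Illustrative
    sketch only; the window size is an arbitrary assumption."""
    buf = deque(maxlen=window)
    drive = []
    for s in samples:
        buf.append(s)
        low_passed = sum(buf) / len(buf)  # crude low-pass filter
        drive.append(abs(low_passed))     # actuator level from magnitude
    return drive

# A sustained bass tone (slowly varying samples) passes through...
print(vibration_drive([1.0] * 6)[-1])        # 1.0
# ...while a high-frequency alternation cancels out in the average.
print(vibration_drive([1.0, -1.0] * 3)[-1])  # 0.0
```

A production device would use a proper analog or digital filter with a defined cutoff frequency, but the principle is the same: only the low-frequency energy of the audio signal reaches the electromagnetic actuator.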

[ "Computer vision", "Simulation", "Human–computer interaction", "Control engineering", "Artificial intelligence", "haptic user interfaces", "Haptic perception", "virtual wall", "haptic rendering", "phantom haptic device" ]
Parent Topic
Child Topic
    No Parent Topic