Haptics is the “science of applying touch (tactile) sensation and control to interaction with computer applications”. The sensation of touch is the brain’s most reliable learning system, more efficient than seeing or hearing, which is why this new technology holds so much promise as a teaching tool. With this technology we can now sit down at a computer terminal and touch objects that exist only in the “mind” of the computer. By using special input/output devices (joysticks, data gloves or other devices), users can receive feedback from computer applications in the form of felt sensations in the hand or other parts of the body.
In combination with a visual display, haptic technology can be used to train people for tasks requiring hand–eye coordination, such as surgery and spaceship maneuvers. In this paper we discuss the basic principles behind haptics, the haptic devices themselves, and how these devices are engineered to produce the sense of touch and force feedback.
Then we proceed to a few applications of haptic technology. Finally, we conclude by mentioning a few future developments.
Haptic technology, or haptics, is a tactile feedback technology which takes advantage of the sense of touch by applying forces, vibrations or motions to the user. This mechanical stimulation can be used to assist in the creation of virtual objects in a computer simulation, to control those virtual objects, and to enhance the remote control of machines and devices (telerobotics). It has been described as “doing for the sense of touch what computer graphics does for vision”.
Haptic devices may incorporate tactile sensors that measure forces exerted by the user on the interface. Haptic technology has made it possible to investigate how the human sense of touch works by enabling the creation of carefully controlled haptic virtual objects. These objects are used to systematically probe human haptic capabilities, which would otherwise be difficult to achieve. These research tools contribute to the understanding of how touch and its underlying brain functions work. The word haptic, from the Greek ἁπτικός (haptikos), means pertaining to the sense of touch, and comes from the Greek verb ἅπτεσθαι (haptesthai), meaning to contact or to touch.
WHAT IS HAPTICS?
Haptics is, quite literally, the science of touch.
The origin of the word haptics is the Greek haptikos, meaning able to grasp or perceive. Haptic sensations are created in consumer devices by actuators, or motors, which create a vibration. Those vibrations are managed and controlled by embedded software, and integrated into device user interfaces and applications via the embedded control software APIs. You’ve probably experienced haptics in many of the consumer devices that you use every day. The rumble effect in your console game controller and the reassuring touch vibration you receive on your smartphone dial pad are both examples of haptic effects. In the world of mobile devices, computers, consumer electronics, and digital devices and controls, meaningful haptic information is frequently limited or missing.
For example, when dialing a number or entering text on a conventional touchscreen without haptics, users have no sense of whether they’ve successfully completed a task. With Immersion’s haptic technology, users feel the vibrating force or resistance as they push a virtual button, scroll through a list or encounter the end of a menu. In a video or mobile game with haptics, users can feel the gun recoil, the engine rev, or the crack of the bat meeting the ball. When simulating the placement of cardiac pacing leads, a user can feel the forces that would be encountered when navigating the leads through a beating heart, providing a more realistic experience of performing this procedure. Haptics can enhance the user experience through:
* Improved Usability: By restoring the sense of touch to otherwise flat, cold surfaces, haptics creates fulfilling multi-modal experiences that improve usability by engaging touch, sight and sound. From the confidence a user receives through touch confirmation when selecting a virtual button to the contextual awareness they receive through haptics in a first-person shooter game, haptics improves usability by more fully engaging the user’s senses.
* Enhanced Realism: Haptics injects a sense of realism into user experiences by exciting the senses and allowing the user to feel the action and nuance of the application. This is particularly relevant in applications like games or simulation that otherwise rely only on visual and audio inputs. The inclusion of tactile feedback provides additional context that translates into a sense of realism for the user.
* Restoration of Mechanical Feel: Today’s touchscreen-driven devices lack the physical feedback that humans frequently need to fully understand the context of their interactions. By providing users with intuitive and unmistakable tactile confirmation, haptics can create a more confident user experience and can also improve safety by overcoming distractions. This is especially important when audio or visual confirmation is insufficient, as in industrial applications, or when distractions are present, as in automotive navigation.
HISTORY OF HAPTICS
In the early 20th century, psychophysicists introduced the word haptic to label the subfield of their studies that addressed human touch-based perception and manipulation. In the 1970s and 1980s, significant research efforts in a completely different field, robotics, also began to focus on manipulation and perception by touch. Initially concerned with building autonomous robots, researchers soon found that building a dexterous robotic hand was much more complex and subtle than their initial naive hopes had suggested. In time these two communities, one that sought to understand the human hand and one that aspired to create devices with dexterity inspired by human abilities, found fertile mutual interest in topics such as sensory design and processing, grasp control and manipulation, object representation and haptic information encoding, and grammars for describing physical tasks. In the early 1990s a new usage of the word haptics began to emerge. The confluence of several emerging technologies made virtualized haptics, or computer haptics, possible. Much like computer graphics, computer haptics enables the display of simulated objects to humans in an interactive manner. However, computer haptics uses a display technology through which objects can be physically palpated.
Basic system configuration.
Basically a haptic system consists of two parts, namely the human part and the machine part. In the figure shown above, the human part (left) senses and controls the position of the hand, while the machine part (right) exerts forces on the hand to simulate contact with a virtual object. Both systems are also provided with the necessary sensors, processors and actuators. In the case of the human system, nerve receptors perform sensing, the brain performs processing and the muscles perform actuation of the motion of the hand; in the case of the machine system, these functions are performed by encoders, the computer and motors respectively.
Basically the haptic information provided by the system will be the combination of (i) tactile information and (ii) kinesthetic information. Tactile information refers to the information acquired by sensors that are actually connected to the skin of the human body, with particular reference to the spatial distribution of pressure, or more generally tractions, across the contact area. For example, when we handle flexible materials like fabric and paper, we sense the pressure variation across the fingertip; this is a sort of tactile information. Tactile sensing is also the basis of complex perceptual tasks like medical palpation, where physicians locate hidden anatomical structures and evaluate tissue properties using their hands. Kinesthetic information refers to the information acquired through the sensors in the joints. Interaction forces are normally perceived through a combination of these two types of information.
Creation of a Virtual Environment (Virtual Reality)
Virtual reality is the technology which allows a user to interact with a computer-simulated environment, whether that environment is a simulation of the real world or an imaginary world. Most current virtual reality environments are primarily visual experiences, displayed either on a computer screen or through special stereoscopic displays, but some simulations include additional sensory information, such as sound through speakers or headphones. Some advanced haptic systems now include tactile information, generally known as force feedback, in medical and gaming applications. Users can interact with a virtual environment or a virtual artifact (VA) either through the use of standard input devices such as a keyboard and mouse, or through multimodal devices such as a wired glove, the Polhemus boom arm, and the omnidirectional treadmill.
The simulated environment can be similar to the real world, for example, simulations for pilot or combat training, or it can differ significantly from reality, as in VR games. In practice, it is currently very difficult to create a high-fidelity virtual reality experience, due largely to technical limitations on processing power, image resolution and communication bandwidth. However, those limitations are expected to eventually be overcome as processor, imaging and data communication technologies become more powerful and cost-effective over time. Virtual Reality is often used to describe a wide variety of applications, commonly associated with its immersive, highly visual, 3D environments.
The development of CAD software, graphics hardware acceleration, head-mounted displays, data gloves and miniaturization have helped popularize the notion. The most successful use of virtual reality is computer-generated 3-D simulators. Pilots use flight simulators, which are designed just like the cockpit of an airplane or helicopter. The screen in front of the pilot creates the virtual environment, and trainers outside the simulator command it to adopt different modes. Pilots are trained to control planes in different difficult situations and emergency landings. The simulator provides the environment. These simulators cost millions of dollars.
Virtual reality games are used in much the same fashion. The player wears special gloves, headphones, goggles, a full-body suit and special sensory input devices, and feels that he is in the real environment. The special goggles contain monitors for viewing, and the environment changes according to the movements of the player. These games are very expensive.
Virtual reality (VR) applications strive to simulate real or imaginary scenes with which users can interact and perceive the effects of their actions in real time. Ideally the user interacts with the simulation via all five senses. However, today’s typical VR applications rely on a smaller subset, typically vision, hearing, and more recently, touch.
The figure below shows the structure of a VR application incorporating visual, auditory, and haptic feedback (Haptic Feedback Block Diagram). The application’s main elements are:
1) The simulation engine, responsible for computing the virtual environment’s behaviour over time;
2) Visual, auditory, and haptic rendering algorithms, which compute the virtual environment’s graphic, sound, and force responses toward the user; and
3) Transducers, which convert visual, audio, and force signals from the computer into a form the operator can perceive.
The human operator typically holds or wears the haptic interface device and perceives audiovisual feedback from audio (computer speakers, headphones, and so on) and visual displays (for example, a computer screen or head-mounted display). Whereas audio and visual channels feature unidirectional information and energy flow (from the simulation engine toward the user), the haptic modality exchanges information and energy in two directions, from and toward the user. This bi-directionality is often referred to as the single most important feature of the haptic interaction modality.
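The servo loop implied by this bidirectional exchange can be sketched in a few lines. Everything below (the function names, the stub callbacks, the 1 ms tick) is an illustrative assumption, not a real device API:

```python
# Sketch of the bidirectional haptic channel described above.
# All function names are illustrative placeholders, not a real device API.

def haptic_loop(read_position, send_force, simulate, render_force,
                steps, dt=0.001):
    """Run `steps` ticks of a ~1 kHz haptic servo loop."""
    state = None
    for _ in range(steps):
        x = read_position()              # information/energy FROM the user
        state = simulate(state, x, dt)   # simulation engine: evolve the VE
        f = render_force(state, x)       # haptic rendering algorithm
        send_force(f)                    # information/energy TOWARD the user

# Stub example: constant hand position, force proportional to position.
forces = []
haptic_loop(lambda: 0.01, forces.append,
            lambda s, x, dt: x, lambda s, x: -100.0 * s,
            steps=3)
print(forces)  # [-1.0, -1.0, -1.0]
```

The audio and visual channels would sit outside this loop, since they only flow one way; only the haptic path both reads and writes each tick.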
A haptic device is one that provides a physical interface between the user and the virtual environment by means of a computer. This can be done through an input/output device that senses the body’s movement, such as a joystick or data glove. By using haptic devices, the user can not only feed information to the computer but can also receive information from the computer in the form of a felt sensation on some part of the body. This is referred to as a haptic interface. These devices can be broadly classified into:-
a) Virtual reality/Tele-robotics based devices: Exoskeletons and stationary devices, Gloves and wearable devices, Point-source and specific-task devices, Locomotion interfaces
b) Feedback devices: Force feedback devices, Tactile displays
Virtual reality/Tele-robotics based devices:-
Exoskeletons and Stationary devices:
The term exoskeleton refers to the hard outer shell that exists on many creatures. In a technical sense, the word refers to a system that covers the user, or that the user has to wear. Current haptic devices classified as exoskeletons are large and immobile systems to which the user must attach himself or herself.
Gloves and wearable devices:
These are smaller exoskeleton-like devices that are often, but not always, supported by a larger exoskeleton or other immobile device. Since the goal of building a haptic system is to immerse the user in a virtual or remote environment, it is important to provide as small a reminder of the user’s actual environment as possible. The drawback of wearable systems is that, since the weight and size of the devices are a concern, they will have more limited sets of capabilities.
Point sources and specific task devices:
This is a class of devices that are very specialized for performing a particular task. Designing a device to perform a single type of task restricts its application to a much smaller number of functions; however, it allows the designer to make the device perform its task extremely well. These task devices have two general forms: single-point-of-interface devices and specific-task devices.
An interesting application of haptic feedback is full-body force feedback, in the form of locomotion interfaces. Locomotion interfaces are force-feedback devices operating in a confined space, simulating unrestrained mobility such as walking and running for virtual reality. These interfaces overcome the limitations of joysticks for maneuvering, of whole-body motion platforms, in which the user is seated and does not expend energy, and of room environments, where only short distances can be traversed.
b) Feedback Devices:-
Force feedback devices:
Force feedback input devices are usually, but not exclusively, connected to computer systems and are designed to apply forces that simulate the sensation of weight and resistance, in order to provide information to the user. As such, the feedback hardware represents a more sophisticated form of input/output device, complementing others such as keyboards, mice or trackers. Input from the user is in the form of hand (or other body segment) position, whereas feedback from the computer or other device is in the form of force or position. These devices translate digital information into physical sensations.
Tactile display devices:
Simulation tasks involving active exploration or delicate manipulation of a virtual environment require the addition of feedback data that presents an object’s surface geometry or texture. Such feedback is provided by tactile feedback systems or tactile display devices. Tactile systems differ from haptic systems in the scale of the forces being generated: while haptic interfaces present the shape, weight or compliance of an object, tactile interfaces present the surface properties of an object, such as its surface texture. Tactile feedback applies sensation to the skin.
c) COMMONLY USED HAPTIC INTERFACING DEVICES:-
Phantom:
It is a haptic interfacing device developed by a company named SensAble Technologies. It is primarily used for providing a 3D touch to virtual objects. It is a very high-resolution, 6-DOF device in which the user holds the end of a motor-controlled, jointed arm. It provides a programmable sense of touch that allows the user to feel the texture and shape of a virtual object with a very high degree of realism. One of its key features is that it can model free-floating three-dimensional objects.
The principle of a Cyber glove is simple. It consists of opposing the movement of the hand in the same way that an object squeezed between the fingers resists the movement of the latter. The glove must therefore be capable, in the absence of a real object, of recreating the forces applied by the object on the human hand with (1) the same intensity and (2) the same direction. These two conditions can be simplified by requiring the glove to apply a torque at each interphalangian joint. The solution that we have chosen uses a mechanical structure with three passive joints which, with the interphalangian joint, make up a flat four-bar closed-link mechanism. This solution uses cables placed in the interior of the four-bar mechanism, following a trajectory identical to that used by the extensor tendons which, by nature, oppose the movement of the flexor tendons in order to harmonize the movement of the fingers. Among the advantages of this structure one can cite:-
• Allows 4 dof for each finger
• Adapts to different finger sizes
• Located on the back of the hand
• Applies different forces on each phalanx (including the possibility of applying a lateral force on the fingertip by motorizing the abduction/adduction joint)
• Measures finger angular flexion (the joint-angle measurements are independent and can have good resolution, given the long paths travelled by the cables when the finger closes)
Fig: Cyber glove mechanism
Mechanical structure of a Cyber glove:
The glove is made up of five fingers and has 19 degrees of freedom, 5 of which are passive. Each finger is made up of a passive abduction joint, which links it to the base (palm), and of 9 rotoid joints which, with the three interphalangian joints, make up 3 closed-link four-bar mechanisms with 1 degree of freedom each. The structure of the thumb is composed of only two closed links, for 3 dof, of which one is passive. The segments of the glove are made of aluminum and can withstand high loads; their total weight does not surpass 350 grams. The length of the segments is proportional to the length of the phalanxes. All of the joints are mounted on miniature ball bearings in order to reduce friction.
Fig 3.4 Mechanical structure of the Cyber glove
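The degree-of-freedom count quoted above is internally consistent, as a quick tally shows (the grouping below is an interpretation of the text, not from a datasheet):

```python
# Consistency check of the Cyber glove dof count described above.
fingers = 4                    # four fingers besides the thumb
dof_per_finger = 1 + 3         # 1 passive abduction joint + 3 one-dof four-bar linkages
thumb_dof = 3                  # two closed links, 3 dof (one of them passive)

total_dof = fingers * dof_per_finger + thumb_dof
passive_dof = fingers * 1 + 1  # four abduction joints + the thumb's passive dof
print(total_dof, passive_dof)  # 19 5, matching "19 degrees of freedom, 5 of which are passive"
```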
The mechanical structure offers two essential advantages. The first is the facility of adapting to different sizes of the human hand; we have also provided for lateral adjustment in order to adapt the interval between the fingers at the palm. The second advantage is the presence of physical stops in the structure, which offer complete security to the operator. The force sensor is placed on the inside of a fixed support on the upper part of the phalanx; it is made up of a steel strip on which a strain gauge is glued. The position sensors used to measure the cable displacement are incremental optical encoders offering an average theoretical resolution equal to 0.1 deg for the finger joints.
Control of Cyber glove:
The glove is controlled by 14 DC torque motors, which can develop a maximal torque of 1.4 Nm and a continuous torque of 0.12 Nm. On each motor we fix a pulley with an 8.5 mm radius onto which the cable is wound. The maximal force that the motor can exert on the cable is thus equal to 14.0 N, a value sufficient to oppose the movement of the finger. The electronic interface of the force-feedback data glove is made up of a PC with several acquisition cards.
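The cable-force figure quoted above follows directly from the continuous torque and the pulley radius (F = τ/r), as a quick calculation confirms:

```python
# Force on the cable from motor torque and pulley radius: F = tau / r
continuous_torque_nm = 0.12   # Nm, continuous motor torque (from the text)
pulley_radius_m = 8.5e-3      # m, pulley radius (8.5 mm)

cable_force_n = continuous_torque_nm / pulley_radius_m
print(f"{cable_force_n:.1f} N")  # ≈ 14.1 N, matching the ~14 N quoted above
```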
The global scheme of the control is given in the figure shown below. One can distinguish two control loops: an internal loop, which corresponds to a classic force control with constant gains, and an external loop, which integrates the model of deformation of the virtual object in contact with the fingers. In this scheme, the operator’s action on the positions of the finger joints is taken into account by both control loops. The operator is considered a displacement generator, while the glove is considered a force generator.
Haptic Rendering:
Haptic rendering is the process of displaying synthetically generated 2D/3D haptic stimuli to the user by applying forces through a force-feedback device. Using haptic rendering, we can enable a user to touch, feel and manipulate virtual objects, and enhance the user’s experience of a virtual environment. The haptic interface acts as a two-port system terminated on one side by the human operator and on the other side by the virtual environment.
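As a concrete, if highly simplified, sketch of haptic rendering, the classic one-dimensional “virtual wall” computes a spring-damper penalty force whenever the device tip penetrates the wall. The gains here are illustrative assumptions, not values from any particular device:

```python
# Minimal haptic-rendering sketch: a 1-D "virtual wall" at x = 0.
# Gains are illustrative assumptions; a real device would run this at ~1 kHz.
K = 800.0   # N/m, wall stiffness
B = 2.0     # N*s/m, damping

def wall_force(x, v):
    """Spring-damper penalty force when the probe penetrates the wall (x < 0)."""
    if x >= 0.0:
        return 0.0             # free space: render no force
    return -K * x - B * v      # push the probe back out of the wall

# e.g. 1 mm penetration, still moving inward at 10 mm/s:
print(round(wall_force(-0.001, -0.01), 3))  # 0.82 (N, directed out of the wall)
```

Stiffer walls feel more solid but, at a fixed servo rate, too high a K makes the rendered contact unstable, which is one reason haptic loops run so much faster than graphics loops.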
The addition of haptics to various applications of virtual reality and teleoperation opens exciting possibilities. Three example applications that have been pursued at our Touch Lab are summarized below.
• Medical Simulators: Just as flight simulators are used to train pilots, the multimodal virtual environment system we have developed is being used in developing virtual-reality-based needle procedures and surgical simulators that enable a medical trainee to see, touch, and manipulate realistic models of biological tissues and organs. The work involves the development of both instrumented hardware and software algorithms for real-time displays. An epidural injection simulator has already been tested by residents and experts in two hospitals. A minimally invasive surgery simulator is also being developed and includes (a) in vivo measurement of the mechanical properties of tissues and organs, (b) development of a variety of real-time algorithms for the computation of tool-tissue force interactions and organ deformations, and (c) verification of the training effectiveness of the simulator.
• Collaborative Haptics: In another project, the use of haptics to improve human-computer interaction, as well as human-human interactions mediated by computers, is being explored. A multimodal shared virtual environment system has been developed, and experiments have been performed with human subjects to study the role of haptic feedback in collaborative tasks and whether haptic communication through force feedback can facilitate a sense of being with, and collaborating with, a remote partner. Two scenarios, one in which the partners are in close proximity and the other in which they are separated by several thousand miles (transatlantic touch, with collaborators at University College London), have been demonstrated.
• Brain Machine Interfaces: In a collaborative project with Prof. Nicolelis of Duke University Medical School, we recently succeeded in controlling a robot in real time using signals from about 100 neurons in the motor cortex of a monkey. We demonstrated that this could be done not only with a robot at Duke, but also across the internet with a robot in our lab. This work opens a whole new paradigm for studying sensorimotor functions in the central nervous system. In addition, a future application is the possibility of implanted brain-machine interfaces for paralyzed patients to control external devices such as smart prostheses, similar to pacemakers or cochlear implants.
Given below are several more potential applications:
• Medicine: manipulating micro and macro robots for minimally invasive surgery; remote diagnosis for telemedicine; aids for the disabled, such as haptic interfaces for the blind.
• Entertainment: video games and simulators that enable the user to feel and manipulate virtual solids, fluids, tools, and avatars.
• Education: giving students the feel of phenomena at nano, macro, or astronomical scales; “what if” scenarios for non-terrestrial physics; experiencing complex data sets.
• Industry: integration of haptics into CAD systems such that a designer can freely manipulate the mechanical components of an assembly in an immersive environment.
• Graphic Arts: virtual art exhibits, concert rooms, and museums in which the user can log in remotely to play the musical instruments, and to touch and feel the haptic attributes of the displays; individual or co-operative virtual sculpting across the internet.
APPLICATIONS, LIMITATIONS & FUTURE VISION
Haptic interfaces for medical simulation may prove especially useful for training in minimally invasive procedures such as laparoscopy and interventional radiology, as well as for performing remote surgery. A particular advantage of this type of work is that surgeons can perform more operations of a similar type with less fatigue. It is well documented that a surgeon who performs more procedures of a given kind will have statistically better outcomes for his patients. Haptic interfaces are also used in rehabilitation: using this technology, exercises can be simulated to help rehabilitate a person with an injury.
A Virtual Haptic Back (VHB) was successfully integrated in the curriculum at the Ohio University College of Osteopathic Medicine. Research indicates that VHB is a significant teaching aid in palpatory diagnosis (detection of medical problems via touch). The VHB simulates the contour and stiffness of human backs, which are palpated with two haptic interfaces (SensAble Technologies, PHANToM 3.0). Haptics have also been applied in the field of prosthetics and orthotics. Research has been underway to provide essential feedback from a prosthetic limb to its wearer. Several research projects through the US Department of Education and National Institutes of Health focused on this area. Recent work by Edward Colgate, Pravin Chaubey, and Allison Okamura et al. focused on investigating fundamental issues and determining effectiveness for rehabilitation.
Haptic feedback is commonly used in arcade games, especially racing video games. In 1976, Sega’s motorbike game Moto-Cross, also known as Fonz, was the first game to use haptic feedback, causing the handlebars to vibrate during a collision with another vehicle. Tatsumi’s TX-1 introduced force feedback to car driving games in 1983. Simple haptic devices are common in the form of game controllers, joysticks, and steering wheels. Early implementations were provided through optional components, such as the Nintendo 64 controller’s Rumble Pak.
Many newer generation console controllers and joysticks feature built in feedback devices, including Sony’s DualShock technology. Some automobile steering wheel controllers, for example, are programmed to provide a “feel” of the road. As the user makes a turn or accelerates, the steering wheel responds by resisting turns or slipping out of control. In 2007, Novint released the Falcon, the first consumer 3D touch device with high resolution three-dimensional force feedback; this allowed the haptic simulation of objects, textures, recoil, momentum, and the physical presence of objects in games.
In 2008, Apple’s MacBook and MacBook Pro started incorporating a “Tactile Touchpad” design with button functionality and haptic feedback incorporated into the tracking surface. Products such as the Synaptics ClickPad followed thereafter. Windows and Mac operating environments will also benefit greatly from haptic interactions: imagine being able to feel graphic buttons and receive force feedback as you depress a button.
Tactile haptic feedback is becoming common in cellular devices. Handset manufacturers like LG and Motorola are including different types of haptic technologies in their devices; in most cases, this takes the form of vibration response to touch. The Nexus One features haptic feedback, according to its specifications. Nokia phone designers have perfected a tactile touchscreen that makes on-screen buttons behave as if they were real buttons: when a user presses a button, he or she feels movement in and movement out, and also hears an audible click. Nokia engineers accomplished this by placing two small piezoelectric sensor pads under the screen and designing the screen so it could move slightly when pressed. Everything, movement and sound, is synchronized perfectly to simulate real button manipulation.
The Shadow Hand uses the sense of touch, pressure, and position to reproduce the strength, delicacy, and complexity of the human grip. The SDRH was developed by Richard Greenhill and his team of engineers in London as part of The Shadow Project, now known as the Shadow Robot Company, an ongoing research and development program whose goal is to complete the first convincing artificial humanoid. An early prototype can be seen in NASA’s collection of humanoid robots, or robonauts. The Shadow Hand has haptic sensors embedded in every joint and finger pad, which relay information to a central computer for processing and analysis. Carnegie Mellon University in Pennsylvania and Bielefeld University in Germany found The Shadow Hand to be an invaluable tool in advancing the understanding of haptic awareness, and in 2006 they were involved in related research. The first PHANTOM, which allows one to interact with objects in virtual reality through touch, was developed by Thomas Massie while a student of Ken Salisbury at MIT.
Future applications of haptic technology cover a wide spectrum of human interaction with technology. Current research focuses on the mastery of tactile interaction with holograms and distant objects, which if successful may result in applications and advancements in gaming, movies, manufacturing, medical, and other industries. The medical industry stands to gain from virtual and telepresence surgeries, which provide new options for medical care. The clothing retail industry could gain from haptic technology by allowing users to “feel” the texture of clothes for sale on the internet. Future advancements in haptic technology may create new industries that were previously not feasible or realistic.
Future medical applications
One currently developing medical innovation is a central workstation used by surgeons to perform operations remotely. Local nursing staff set up the machine and prepare the patient, and rather than travel to an operating room, the surgeon becomes a telepresence. This allows expert surgeons to operate from across the country, increasing availability of expert medical care. Haptic technology provides tactile and resistance feedback to surgeons as they operate the robotic device. As the surgeon makes an incision, they feel ligaments as if working directly on the patient. As of 2003, researchers at Stanford University were developing technology to simulate surgery for training purposes. Simulated operations allow surgeons and surgical students to practice and train more. Haptic technology aids in the simulation by creating a realistic environment of touch.
Much like telepresence surgery, surgeons feel simulated ligaments, or the pressure of a virtual incision as if it were real. The researchers, led by J. Kenneth Salisbury Jr., professor of computer science and surgery, hope to be able to create realistic internal organs for the simulated surgeries, but Salisbury stated that the task will be difficult. The idea behind the research is that “just as commercial pilots train in flight simulators before they’re unleashed on real passengers, surgeons will be able to practice their first incisions without actually cutting anyone”. According to a Boston University paper published in The Lancet, “Noise-based devices, such as randomly vibrating insoles, could also ameliorate age-related impairments in balance control.” If effective, affordable haptic insoles were available, perhaps many injuries from falls in old age or due to illness-related balance-impairment could be avoided.