Location
Room 222
When you touch objects in your surroundings, you can discern each item’s physical properties from the rich array of haptic cues you experience, including both the tactile sensations arising in your skin and the kinesthetic cues originating in your muscles and joints. Although physical interaction with the world is at the core of human experience, very few robotic and computer interfaces provide the operator with high-fidelity touch feedback, limiting their usability. This talk will describe recent Penn Haptics research projects that leverage tactile cues to allow a user to interact with distant or virtual environments as though they were real and within reach. First, we created and evaluated a wearable hand controller that enables a human to manipulate objects via a remote PR2 robot while feeling the robot's grasp force, fingertip contact and pressure, and high-frequency vibrations. Second, we developed novel algorithms for mapping sensed fingertip deformations to the actuation of a fingertip tactile display and demonstrated the striking utility of such cues in a simulated tissue palpation task through integration with a da Vinci surgical robot. Third, we built on our prior work in haptic feedback of instrument vibrations to show that a surgeon's skill at a given robotic surgery practice task can be calculated automatically from measurements of completion time, task contact force, and robot arm accelerations. Finally, we created the world's most realistic haptic virtual surfaces by combining data-driven friction, tapping, and texture feedback.
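For readers curious how measurements like completion time, contact force, and arm acceleration might be folded into a single automatic skill rating, here is a minimal illustrative sketch. It is not the method from the Penn Haptics study described above; the feature choices, weights, and reference statistics are hypothetical, and the code simply z-scores each measurement and combines them so that faster, gentler, lower-vibration trials score higher.

```python
# Hypothetical illustration: combining task measurements into one skill score.
# The features, weights, and sample numbers are invented for this sketch and
# are not taken from the Penn Haptics studies described in the abstract.
import numpy as np

def skill_score(completion_time, mean_contact_force, rms_acceleration,
                reference_stats, weights=(1.0, 1.0, 1.0)):
    """Return a scalar skill score: higher means faster, gentler, smoother.

    Each measurement is z-scored against reference statistics (mean, std)
    from previously recorded practice trials, then combined with a negated
    weighted sum so that longer times, larger forces, and stronger
    vibrations all lower the score.
    """
    features = np.array([completion_time, mean_contact_force, rms_acceleration])
    means = np.array([reference_stats[k][0] for k in ("time", "force", "accel")])
    stds = np.array([reference_stats[k][1] for k in ("time", "force", "accel")])
    z = (features - means) / stds
    return float(-np.dot(np.array(weights), z))

# Reference statistics (mean, std) from prior trials -- made up for this example.
reference = {"time": (95.0, 20.0), "force": (2.5, 0.8), "accel": (1.2, 0.4)}

# A quick, gentle, low-vibration trial scores higher than a slow, forceful one.
print(skill_score(70.0, 1.8, 0.9, reference))   # experienced-looking trial
print(skill_score(140.0, 3.6, 1.9, reference))  # novice-looking trial
```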