A slab of steel and rainbow-colored wires frame a small glass pad. When mechanical engineering graduate student Joe Mullenbach clicks away on a nearby computer monitor, the pad lights up and a gray 2-D ball begins bouncing around like it’s in a mini-game of air hockey. The animation is simple, but this device is actually a catalyst for a developing field of technology: surface haptics.
A finger placed in contact with the glass screen, called a TPaD, can push the ball around, but that’s not all; the ball pushes back.
In a world of increasing digitalization, says Professor of Mechanical Engineering Ed Colgate, people are losing meaningful haptic, or tactile, interaction with everyday objects like knobs, buttons and levers.
“The information flows from your fingers to the touchscreen just fine,” says Michael Peshkin, also a professor of mechanical engineering. “But with present devices, there’s no information flowing back from the touchscreen to your fingers.”
Colgate and Peshkin, principal investigators in Northwestern haptics research, hope to bring texture back to technology by simulating real-world transactions between humans and objects. Along with a team of graduate students, including Mullenbach, they’ve been researching the possibilities and applications of touchscreen tools that “touch you back.”
The project, begun in 2006, seeks to bridge the gap between people and the inaccessible 2-D world of screens, according to Colgate. In the foreseeable future it could influence car navigation systems, tablets and touchscreen phones, and it could bring touch technology to users who are blind or have low vision. A gamer would be able to feel the plucking of a guitar string, the edges of a button or the stretchiness of fabric.
“It’s very hard to predict where the application developers, the game developers, the user-interface developers are going to go with [this technology],” Peshkin says. “If you give them a new creative tool, an opportunity to be creative, they’re going to surprise you with all the different new ideas.”
The TPaD, the first-generation haptic device Mullenbach uses in his research, works by controlling the natural friction between the fingertip and the glass. When the device turns on, the glass vibrates up and down at a frequency too high to hear. The vibrations give the finger a stream of tiny pushes, so it floats just above the screen for instants at a time.
“It’s like you’re sliding your finger on ice, but [you’re] not cold,” Mullenbach says. “It’s a slippery, air hockey-type feel.”
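The control idea behind that slipperiness can be sketched in a few lines of code. The sketch below assumes a simple linear relationship between the drive amplitude of the vibration and the friction the finger feels; the real device’s calibration is more involved, and the constants here are placeholders rather than the Northwestern team’s numbers.

```python
# Hypothetical sketch: choosing an ultrasonic drive amplitude for a desired
# friction level. The constants and the linear mapping are assumptions for
# illustration, not the TPaD's actual calibration.

MU_STICKY = 1.0      # assumed friction with the vibration off (bare glass)
MU_SLIPPERY = 0.1    # assumed friction at full vibration amplitude
MAX_AMPLITUDE = 1.0  # normalized drive amplitude of the ultrasonic actuator

def amplitude_for_friction(target_mu: float) -> float:
    """Return a drive amplitude that should produce the target friction level.

    Higher amplitude means the finger spends more time floating above the
    vibrating glass, so the effective friction drops.
    """
    target_mu = min(max(target_mu, MU_SLIPPERY), MU_STICKY)
    return MAX_AMPLITUDE * (MU_STICKY - target_mu) / (MU_STICKY - MU_SLIPPERY)
```

Turn the amplitude all the way down and the full stickiness of bare glass returns; turn it up and the surface feels like ice.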
Switch the vibrations off and the contact with the glass changes completely: the surface feels sticky, as if the finger has hit an object and can’t proceed any further. Texture is nothing more than a series of small pushes, Peshkin says, so switching the vibrations on and off in the right pattern can mimic real textures.
Short, game-like simulations show what the TPaD can do. A swipe against a ball produces a change in tactile sensation that feels like hitting a real object. Another program simulates rolling a pencil back and forth across a table-like surface; the angles of the virtual pencil’s sides feel real, and the sound of its flat edges against the table can almost be heard through touch.
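Those demonstrations come down to the same trick: friction that changes with the finger’s position. As a rough illustration, and only that, since the ridge spacing, edge location and sinusoidal pattern below are invented for the example, a virtual edge and a striped texture could be described like this:

```python
import math

# Hypothetical sketch of a texture as "a series of small pushes": the target
# friction depends on where the finger is, so sliding across the screen feels
# like crossing ridges or running into an edge. Values are illustrative only.

RIDGE_SPACING_MM = 2.0  # assumed spacing of the virtual ridges
EDGE_X_MM = 40.0        # assumed position of a virtual object's edge

def friction_at(x_mm: float) -> float:
    """Target friction (0 = slippery, 1 = sticky) for a finger at position x."""
    if x_mm >= EDGE_X_MM:
        # Past the edge the glass turns sticky, as if the finger hit something.
        return 1.0
    # Before the edge, a sinusoidal grating: alternating slippery and sticky bands.
    phase = 2 * math.pi * x_mm / RIDGE_SPACING_MM
    return 0.5 + 0.5 * math.sin(phase)

# Each frame, a controller would read the finger position from the touch
# sensor and command the vibration amplitude that yields friction_at(x).
```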
By using real physical forces generated under software control, like vibrations and electric fields, the researchers can fool the brain’s touch receptors, Colgate and Peshkin say.
Those forces give hints to the finger, which has touched millions of things over a lifetime and knows how to fill in the blanks by now; the mind completes a representation of the object, and an illusion of interaction is born. This side of the research draws on psychophysics, the study of how physical stimuli become perception.
It’s basically crossing the “bridge between things feeling and things interpreting,” says Steven Manuel, another graduate student working on the project. Psychophysics is especially useful in Manuel’s specialty: touchscreen haptics that use more than one finger.
“If you touch two spots of a cup, you know it’s curved because you fill it in with memory and extrapolate,” he says.
Looking ahead, one of the biggest challenges in Colgate and Peshkin’s research is making the tactile experience engaging and convincing. The project is at that stage now, working with multiple fingers in a 2-D virtual reality.
“How do you create effects that are meaningful and that have a lot of information content, that have very rich interactions?” Colgate says. “We don’t really normally interact with the world on a flat surface. It’s kind of like being in flat land.”