Texas researchers look at how enhanced touchscreens could enable users to ‘feel’ objects

Researchers at Texas A&M University’s Department of Mechanical Engineering have released a study looking at how touchscreen technology could be enhanced to mimic the feeling of physical objects.

The research team is working to better define how the finger interacts with a device, hoping to aid the development of technology that goes beyond merely sensing and reacting to touch.

The goal of furthering this human-machine interface is to provide users with a richer touch-based experience by equipping devices to mimic the feeling of physical objects.

One example of how this could work is enabling users to feel the texture of materials on their smartphones before purchasing them.

“This could allow you to actually feel textures, buttons, slides and knobs on the screen,” said Cynthia Hipwell, professor of mechanical engineering at Texas A&M University. “It can be used for interactive touchscreen-based displays, but one holy grail would certainly be being able to bring touch into shopping so that you could feel the texture of fabrics and other products while you’re shopping online.”

According to Hipwell, the “touch” in current touchscreen technology is more for the screen’s benefit than the user’s. With the emergence and refinement of increasingly sophisticated haptic technology, the relationship between user and device can grow more reciprocal.

She added that the addition of touch as a sensory input would ultimately enrich virtual environments and lighten the burden of communication currently carried by audio and visuals. “When we look at virtual experiences, they’re primarily audio and visual right now and we can get audio and visual overload,” Hipwell said. “Being able to bring touch into the human-machine interface can bring a lot more capability, much more realism, and it can reduce that overload. Haptic effects can be used to draw your attention to make something easier to find or easier to do using a lower cognitive load.”

Hipwell and her team are approaching the research by looking at the multiphysics (the coupled processes or systems involving multiple physical fields occurring at the same time) of the interface between the user’s finger and the device. This interface is incredibly complex and changes with different users and environmental conditions.

“We’re looking at electro-wetting effects (the forces that result from an applied electric field), electrostatic effects, changes in properties of the finger, the material properties and surface geometry of the device, the contact mechanics, the fluid motion, charge transport – really, everything that’s going on in the interface to understand how the device can be designed to be more reliable and higher performing,” Hipwell said. “Ultimately, our goal is to create predictive models that enable a designer to create devices with maximum haptic effect and minimum sensitivity to user and environmental variation.”
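To give a rough sense of the electrostatic effect Hipwell mentions, the sketch below uses a simplified parallel-plate approximation sometimes seen in surface-haptics literature: an applied voltage pulls the fingertip against the screen, and modulating that voltage modulates perceived friction. This is an illustrative toy model, not the team's actual predictive model; the function names and all parameter values are hypothetical placeholders.

```python
# Illustrative toy model of electrostatic (electroadhesion) haptics.
# Parallel-plate approximation: F_e = eps0 * eps_r * A * V^2 / (2 * d^2).
# NOT the Texas A&M team's model -- a simplified sketch for intuition only.

EPS0 = 8.854e-12  # vacuum permittivity, F/m


def electrostatic_force(voltage, area, gap, eps_r):
    """Estimated attractive force (N) pulling the fingertip onto the
    screen when a voltage is applied across the insulating layer.

    voltage -- applied voltage (V)
    area    -- effective finger-screen contact area (m^2)
    gap     -- effective dielectric gap between skin and electrode (m)
    eps_r   -- relative permittivity of the insulating layer
    """
    return EPS0 * eps_r * area * voltage**2 / (2 * gap**2)


def perceived_friction(normal_force, voltage, area, gap, eps_r, mu=0.8):
    """Perceived friction ~ mu * (finger pressure + electrostatic pull).

    Varying the voltage as the finger slides varies friction, which the
    finger interprets as texture, edges, or button clicks.
    """
    return mu * (normal_force + electrostatic_force(voltage, area, gap, eps_r))


# Hypothetical example: 1 cm^2 contact patch, ~10 micron effective gap,
# eps_r ~ 3, finger pressing with 0.5 N.
base = perceived_friction(0.5, 0.0, 1e-4, 1e-5, 3.0)      # voltage off
boosted = perceived_friction(0.5, 100.0, 1e-4, 1e-5, 3.0)  # 100 V applied
```

Even in this crude model, the sensitivity to the gap term (force scales as 1/d²) hints at why the real interface is hard to control: skin moisture, pressure, and surface wear all change the effective gap from user to user.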

As research into and development of the technology continues to progress, Hipwell said she predicts consumers will begin to see early elements implemented into common devices over the next few years, with some early products already in development.

“I think early elements of it will definitely be within the next five years,” Hipwell said. “Then, it will just be a matter of maturing the technology and how advanced, how realistic and how widespread it becomes.”

Read the full study here: https://onlinelibrary.wiley.com/doi/abs/10.1002/adma.202170240