Study Captures How Humans Touch Unfamiliar Objects, Offering Lessons for Human–Robot Interaction

    By Anne J. Manning, Harvard John A. Paulson School of Engineering and Applied Sciences – Edited by Lisa Lock, Reviewed by Robert Egan – September 2025

    How We Touch

    Humans constantly use touch to explore and understand the world — picking up objects to gauge their weight, running fingers along surfaces to detect roughness, or manipulating shapes to learn about their properties. But what happens when we encounter entirely unfamiliar objects with no explicit task or goal?

    Researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) sought to answer this question through a novel experiment blending art and science. Led by robotics researcher and artist Buse Aktaş, now a research group leader at the Max Planck Institute for Intelligent Systems, the team systematically studied how people use open-ended touch to interact with strange, dynamic objects. Their findings, published in PLOS One, could inform more intuitive human–robot interaction, medical devices, industrial design, and even immersive art or gaming experiences.

    Experimental Design: Art Meets Engineering

    The study used a setup resembling an art installation, with three stations:

    • One featured familiar objects like a potato chip bag and rolling pin.

    • Another had abstract geometric forms.

    • The third was biomorphic — a soft, intestine-like tubular structure with spikes.

    These objects had dynamic properties, periodically stiffening and softening with pumped air. Forty participants rotated through the stations, freely touching, lifting, pressing, or simply observing. Their interactions were recorded and analyzed, with guidance from co-author Roberta Klatzky, a leading expert in haptic perception from Carnegie Mellon University.

    After the study, Aktaş turned the experiment into a public art installation at the Harvard Art Lab.

    Four Categories of Human Touch

    Detailed analysis revealed that even without explicit instructions, participants engaged in distinct and repeatable patterns of movement, linked to self-created goals such as exploration, manipulation, or play. The researchers identified four key categories:

    1. Passive Observational – minimal touch, hovering hands, or stepping back to observe.

    2. Active Perceptual – pressing, lifting, rubbing to gather tactile information.

    3. Constructive – reshaping or rearranging, including stacking, coiling, folding, flattening, knotting.

    4. Hedonic – touch for pleasure, such as stroking, flicking, or massaging.

    The type of object influenced the response. Participants performed more constructive actions with abstract objects but were more observational with familiar items like the potato chip bag. Notably, objects that changed state kept participants engaged longer, encouraging more physical interaction.
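    As a purely illustrative sketch (not from the study itself), the four-category taxonomy could be encoded for an interactive system that tallies how a participant's labeled actions distribute across categories. The action labels and mapping below are hypothetical, drawn only from the examples the article lists for each category.

```python
from enum import Enum

class TouchCategory(Enum):
    PASSIVE_OBSERVATIONAL = "passive observational"
    ACTIVE_PERCEPTUAL = "active perceptual"
    CONSTRUCTIVE = "constructive"
    HEDONIC = "hedonic"

# Hypothetical mapping from labeled actions to categories,
# following the examples named in the article.
ACTION_TO_CATEGORY = {
    "hover": TouchCategory.PASSIVE_OBSERVATIONAL,
    "step_back": TouchCategory.PASSIVE_OBSERVATIONAL,
    "press": TouchCategory.ACTIVE_PERCEPTUAL,
    "lift": TouchCategory.ACTIVE_PERCEPTUAL,
    "rub": TouchCategory.ACTIVE_PERCEPTUAL,
    "stack": TouchCategory.CONSTRUCTIVE,
    "fold": TouchCategory.CONSTRUCTIVE,
    "knot": TouchCategory.CONSTRUCTIVE,
    "stroke": TouchCategory.HEDONIC,
    "flick": TouchCategory.HEDONIC,
}

def categorize(actions):
    """Tally how a sequence of labeled touch actions
    distributes across the four categories."""
    counts = {c: 0 for c in TouchCategory}
    for action in actions:
        counts[ACTION_TO_CATEGORY[action]] += 1
    return counts
```

    A real system would of course classify raw sensor data rather than pre-labeled strings; the sketch only shows how the taxonomy might structure such a pipeline's output.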

    Implications for Human–Machine Collaboration

    The research could help designers of interactive systems distinguish between observation and active exploration, leading to richer user experiences in virtual reality, computer games, and virtual tours.

    In robotics, these insights could inform protocols for safe, intuitive, and responsive human–robot collaboration, where smart materials can be engineered to invite or guide human interaction.

    “People, even without a given goal, make up their own,” said Aktaş. “Understanding these self-directed behaviors can help us build machines and systems that respond in more human-centered ways.”
