AnyTouch 2: General Optical Tactile Representation Learning For Dynamic Tactile Perception
Overview
Taxonomy
Research Landscape Overview
Claimed Contributions
The authors propose a five-tier tactile dynamic pyramid framework that stratifies tactile data by the complexity of the dynamic perception capabilities it supports, and introduce ToucHD, a large-scale hierarchical dataset spanning simulated atomic actions, real-world manipulations, and touch-force pairs, designed to enrich higher-tier dynamic tactile data.
The authors develop AnyTouch 2, a unified representation learning framework that integrates pixel-level deformation modeling, semantic-level tactile feature understanding, and physical-level force dynamics prediction to support hierarchical dynamic tactile perception across multiple sensor types.
The authors introduce specialized modules including frame-difference reconstruction for capturing fine-grained temporal variations, action matching for semantic-level dynamic understanding, and force prediction for explicitly modeling the physical properties underlying tactile interactions.
Core Task Comparisons
Comparisons with papers in the same taxonomy category
[6] AnyTouch: Learning Unified Static-Dynamic Representation across Multiple Visuo-tactile Sensors
[48] Transferable Tactile Transformers for Representation Learning Across Diverse Sensors and Tasks
Contribution Analysis
Detailed comparisons for each claimed contribution
Tactile Dynamic Pyramid and ToucHD Dataset
The authors propose a five-tier tactile dynamic pyramid framework that stratifies tactile data by the complexity of the dynamic perception capabilities it supports, and introduce ToucHD, a large-scale hierarchical dataset spanning simulated atomic actions, real-world manipulations, and touch-force pairs, designed to enrich higher-tier dynamic tactile data.
[6] AnyTouch: Learning Unified Static-Dynamic Representation across Multiple Visuo-tactile Sensors
[51] Reassemble: A multimodal dataset for contact-rich robotic assembly and disassembly
[52] Emotion recognition using affective touch: A survey
[53] Haptic codecs for the tactile internet
[54] Deep multi-model fusion network based real object tactile understanding from haptic data
[55] CoMPAS3D: complex multi-level person-interaction annotated salsa dataset
[56] Fabric surface characterization: Assessment of deep learning-based texture representations using a challenging dataset
[57] Snake Robot with Tactile Perception Navigates on Large-scale Challenging Terrain
[58] Multi-Scale Voting System for Robotic Tactile Texture Recognition on Uneven Surfaces
[59] Spatiotemporal Organization of Touch Information in Tactile Neuron Population Responses
AnyTouch 2 General Tactile Representation Learning Framework
The authors develop AnyTouch 2, a unified representation learning framework that integrates pixel-level deformation modeling, semantic-level tactile feature understanding, and physical-level force dynamics prediction to support hierarchical dynamic tactile perception across multiple sensor types.
[70] Tactile Sensor Integrated Fingertip Capable of Detecting Precise Contact Force for Robotic Grippers
[71] Can Vision Feel Touch? Tactile-aware Visual Grasping for Transparent Objects
[72] Towards forceful robotic foundation models: a literature survey
[73] Development and evaluation of refreshable Braille display and active touch-reading system for digital reading of the visually impaired
[74] Capturing forceful interaction with deformable objects using a deep learning-powered stretchable tactile array
[75] Multi-sensor data fusion and time series to image encoding for hardness recognition
[76] VibroTouch: Active Tactile Sensor for Contact Detection and Force Sensing via Vibrations
[77] Tactile Robot Programming: Transferring Task Constraints into Constraint-Based Unified Force-Impedance Control
[78] Investigation of Experimental Devices for Finger Active and Passive Tactile Friction Analysis
[79] Intrinsic contact sensing and object perception of an adaptive fin-ray gripper integrating compact deflection sensors
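The three perception levels attributed to AnyTouch 2 above (pixel-level deformation modeling, semantic-level feature understanding, physical-level force prediction) can be read as a multi-objective training setup. The following is a minimal illustrative sketch of how such objectives might be combined; the tensor shapes, loss forms, and weights are invented for the example and are not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the three perception levels; all shapes are assumptions.
frames = rng.standard_normal((2, 4, 8, 8))    # (batch, time, H, W) tactile frames
clip_embed = rng.standard_normal((2, 16))     # semantic embedding of a tactile clip
action_embed = rng.standard_normal((2, 16))   # embedding of the paired action label
force_pred = rng.standard_normal((2, 3))      # predicted 3-axis contact force
force_true = rng.standard_normal((2, 3))      # ground-truth force readings

def pixel_loss(pred_frames, true_frames):
    """Pixel level: mean squared reconstruction error on tactile frames."""
    return float(np.mean((pred_frames - true_frames) ** 2))

def semantic_loss(a, b):
    """Semantic level: 1 - cosine similarity between clip and action embeddings."""
    a_n = a / np.linalg.norm(a, axis=1, keepdims=True)
    b_n = b / np.linalg.norm(b, axis=1, keepdims=True)
    return float(np.mean(1.0 - np.sum(a_n * b_n, axis=1)))

def physical_loss(pred, true):
    """Physical level: mean squared error on predicted contact forces."""
    return float(np.mean((pred - true) ** 2))

# A weighted sum over the three levels; the weights are placeholders.
total = (1.0 * pixel_loss(frames, frames)
         + 0.5 * semantic_loss(clip_embed, action_embed)
         + 0.5 * physical_loss(force_pred, force_true))
```

Here a perfect pixel-level reconstruction contributes zero loss, while the semantic and physical terms penalize misaligned embeddings and force errors; how the actual framework weights or schedules these objectives is not specified in this summary.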
Multi-Level Dynamic Enhanced Modules
The authors introduce specialized modules including frame-difference reconstruction for capturing fine-grained temporal variations, action matching for semantic-level dynamic understanding, and force prediction for explicitly modeling the physical properties underlying tactile interactions.
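The frame-difference reconstruction idea named above can be illustrated with a small sketch: rather than reconstructing raw tactile frames, the target is the difference between consecutive frames, which isolates temporal deformation. The shapes and loss below are assumptions for illustration, not the paper's module:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed tactile clip layout: (batch, time, H, W).
frames = rng.standard_normal((2, 5, 8, 8))

def frame_difference_targets(x):
    """Consecutive-frame differences along the time axis,
    emphasizing temporal change over static appearance."""
    return x[:, 1:] - x[:, :-1]   # shape (batch, time-1, H, W)

targets = frame_difference_targets(frames)

def recon_loss(pred, target):
    """Mean squared error between predicted and true frame differences."""
    return float(np.mean((pred - target) ** 2))
```

A model that predicts the true differences exactly would reach zero loss; a static frame sequence yields all-zero targets, so the objective carries signal only where the tactile imprint actually changes over time.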