DexMove: Learning Tactile-Guided Non-Prehensile Manipulation with Dexterous Hands
Overview
Overall Novelty Assessment
DexMove introduces a tactile-guided framework for non-prehensile manipulation using dexterous hands, combining simulation-based trajectory generation with human tactile demonstrations captured via wearable sensors. The taxonomy places this work in the 'Purely Tactile In-Hand Rotation and Regrasping' leaf, which contains four papers including DexMove itself. This leaf sits within the broader 'Tactile-Based In-Hand Manipulation' branch, indicating a moderately populated research direction focused on touch-driven object repositioning. The leaf's scope emphasizes tactile-only sensing without visual feedback, distinguishing it from visuotactile approaches in sibling categories.
The taxonomy reveals neighboring research directions that contextualize DexMove's positioning. Adjacent leaves address deformable object manipulation, visuotactile fusion, and tactile skin-based translation, while a parallel branch explores non-prehensile pushing and pulling with geometry-aware methods. DexMove bridges these areas by applying tactile-only sensing to non-prehensile tasks, a combination less explored than either purely tactile in-hand grasping or vision-guided pushing. The 'Learning and Control Frameworks' branch shows related work on deep reinforcement learning and representation learning for tactile policies, suggesting DexMove's flow-based approach operates within an active methodological landscape.
Among the twenty-two candidates examined through semantic search, none clearly refutes DexMove's three core contributions. The framework itself (zero refutable candidates among ten examined) appears distinct in combining wrist-finger synergy with non-prehensile tasks. The hybrid data synthesis pipeline (zero of ten) and the flow-based TaFo-Net policy (zero of two) likewise show no substantial prior overlap within this limited search scope. These counts suggest novelty relative to the examined candidates, though the modest search scale (twenty-two papers versus thirty-nine in the full taxonomy) means relevant prior work may exist beyond the top semantic matches.
Based on the limited literature search, DexMove appears to occupy a relatively sparse intersection between tactile-only sensing and non-prehensile manipulation with dexterous hands. The taxonomy structure indicates this combination is less crowded than purely tactile grasping or geometry-aware pushing alone. However, the analysis covers top-K semantic matches and does not exhaustively survey all thirty-nine taxonomy papers or broader literature, leaving open the possibility of relevant work outside the examined candidate set.
Taxonomy
Research Landscape Overview
Claimed Contributions
The authors introduce DexMove, a complete framework that enables dexterous robotic hands to perform non-prehensile manipulation tasks using tactile guidance. The framework integrates simulation-based trajectory generation with human-demonstrated tactile data to train policies for robust object repositioning.
The authors propose a novel data acquisition paradigm that combines scalable simulation for generating diverse wrist-finger trajectories with a wearable device equipped with vision-based tactile sensors to capture real-world multi-finger contact data from human demonstrations, addressing the challenge of obtaining large-scale contact-aware datasets.
The authors develop a flow-matching based policy architecture that jointly controls the wrist and fingers for non-prehensile manipulation, incorporating TaFo-Net to learn spatiotemporal inter-finger force representations from human demonstrations for coordinated multi-contact control.
Core Task Comparisons
Comparisons with papers in the same taxonomy category
[1] Learning Purely Tactile In-Hand Manipulation with a Torque-Controlled Hand
[2] Touch-Based Manipulation with Multi-Fingered Robot using Off-policy RL and Temporal Contrastive Learning
[3] DexTouch: Learning to Seek and Manipulate Objects with Tactile Dexterity
Contribution Analysis
Detailed comparisons for each claimed contribution
DexMove framework for tactile-guided non-prehensile manipulation with dexterous hands
The authors introduce DexMove, a complete framework that enables dexterous robotic hands to perform non-prehensile manipulation tasks using tactile guidance. The framework integrates simulation-based trajectory generation with human-demonstrated tactile data to train policies for robust object repositioning.
[1] Learning Purely Tactile In-Hand Manipulation with a Torque-Controlled Hand
[5] Grasping and control of multi-fingered hands
[6] Grasping and manipulation with a multi-fingered hand
[8] Sliding Touch-Based Exploration for Modeling Unknown Object Shape with Multi-Fingered Hands
[9] Grasp with push policy for multi-finger dexterity hand based on deep reinforcement learning
[21] Task-grasping from a demonstrated human strategy
[42] TacGNN: Learning Tactile-based In-hand Manipulation with a Blind Robot
[43] Learning robot in-hand manipulation with tactile features
[44] Human-like Dexterous Grasping Through Reinforcement Learning and Multimodal Perception
[45] Laser-actuated multi-fingered hand for dexterous manipulation of micro-objects
Hybrid data synthesis pipeline combining simulation and human tactile demonstrations
The authors propose a novel data acquisition paradigm that combines scalable simulation for generating diverse wrist-finger trajectories with a wearable device equipped with vision-based tactile sensors to capture real-world multi-finger contact data from human demonstrations, addressing the challenge of obtaining large-scale contact-aware datasets.
[46] Cyberdemo: Augmenting simulated human demonstration for real-world dexterous manipulation
[47] Few-Shot Transfer of Tool-Use Skills Using Human Demonstrations With Proximity and Tactile Sensing
[48] Soft contact simulation and manipulation learning of deformable objects with vision-based tactile sensor
[49] NeuralTouch: Neural Descriptors for Precise Sim-to-Real Tactile Robot Control
[50] A modular approach to learning manipulation strategies from human demonstration
[51] Gentle Manipulation Policy Learning via Demonstrations from VLM Planned Atomic Skills
[52] Efficient and stable online learning for developmental robots
[53] VTAO-BiManip: Masked Visual-Tactile-Action Pre-training with Object Understanding for Bimanual Dexterous Manipulation
[54] Bayesian perception of touch for control of robot emotion
[55] Learning Generalizable Dexterous Manipulation
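One step that any sim-plus-human-demonstration pipeline of this kind needs is temporal alignment between simulator trajectory samples and tactile frames recorded from the wearable sensors. The sketch below is a hypothetical illustration of that step only, not DexMove's actual pipeline; the function name, record layout, and nearest-timestamp strategy are all assumptions.

```python
import numpy as np

def align_by_timestamp(traj_ts, tactile_ts):
    """For each trajectory timestamp, return the index of the nearest
    tactile frame (hypothetical pairing rule for building training tuples).

    tactile_ts must be sorted ascending.
    """
    traj_ts = np.asarray(traj_ts, dtype=float)
    tactile_ts = np.asarray(tactile_ts, dtype=float)
    # searchsorted gives each sample's insertion point; compare the
    # tactile frames on either side and keep the closer one.
    idx = np.searchsorted(tactile_ts, traj_ts)
    idx = np.clip(idx, 1, len(tactile_ts) - 1)
    left = tactile_ts[idx - 1]
    right = tactile_ts[idx]
    take_left = (traj_ts - left) <= (right - traj_ts)
    return np.where(take_left, idx - 1, idx)
```

Pairing by nearest timestamp is only one plausible choice; interpolation of tactile signals between frames would be an alternative when the two streams run at very different rates.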
Flow-based policy with TaFo-Net for synergistic wrist-finger control
The authors develop a flow-matching based policy architecture that jointly controls the wrist and fingers for non-prehensile manipulation, incorporating TaFo-Net to learn spatiotemporal inter-finger force representations from human demonstrations for coordinated multi-contact control.
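The flow-matching objective underlying such a policy can be illustrated in miniature. This is a generic conditional flow-matching sketch, not a reproduction of TaFo-Net or DexMove's action space: a velocity field is regressed onto the straight-line velocity between a noise sample `x0` and an expert action chunk `x1`, evaluated at the interpolant `x_t = (1 - t) * x0 + t * x1`. All names here are illustrative.

```python
import numpy as np

def fm_targets(x0, x1, t):
    """Interpolant x_t and regression target u_t for linear-path flow matching.

    x0: noise samples, shape (batch, dim); x1: expert actions, same shape;
    t: per-sample times in [0, 1], shape (batch,).
    """
    t = t[:, None]                    # broadcast time over action dims
    x_t = (1.0 - t) * x0 + t * x1     # point on the straight path
    u_t = x1 - x0                     # constant target velocity along the path
    return x_t, u_t

def fm_loss(predict, x0, x1, t):
    """Mean-squared flow-matching loss for a velocity predictor v(x_t, t)."""
    x_t, u_t = fm_targets(x0, x1, t)
    v = predict(x_t, t)
    return float(np.mean((v - u_t) ** 2))
```

At inference a trained velocity field would be integrated from noise toward an action chunk; the tactile and proprioceptive observations the paper conditions on would enter as extra inputs to the predictor.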