DexMove: Learning Tactile-Guided Non-Prehensile Manipulation with Dexterous Hands

ICLR 2026 Conference Submission · Anonymous Authors
Keywords: tactile, robotics, dexterous hand, manipulation
Abstract:

Non-prehensile manipulation offers a robust alternative to traditional pick-and-place methods for object repositioning. However, learning such skills with dexterous, multi-fingered hands remains largely unexplored, leaving their potential for stable and efficient manipulation underutilized. Progress has been limited by the lack of large-scale, contact-aware non-prehensile datasets for dexterous hands and the absence of wrist–finger control policies. To bridge these gaps, we present DexMove, a tactile-guided non-prehensile manipulation framework for dexterous hands. DexMove combines a scalable simulation pipeline that generates physically plausible wrist–finger trajectories with a wearable device, which captures multi-finger contact data from human demonstrations using vision-based tactile sensors. Using these data, we train a flow-based policy that enables real-time, synergistic wrist–finger control for robust non-prehensile manipulation of diverse tabletop objects. In real-world experiments, DexMove successfully manipulated six objects of varying shapes and materials, achieving a 77.8% success rate. Our method outperforms ablated baselines by 36.6% and improves efficiency by nearly 300%. Furthermore, the learned policy generalizes to language-conditioned, long-horizon tasks such as object sorting and desktop tidying.

Disclaimer
This report is AI-GENERATED using Large Language Models and WisPaper (A scholar search engine). It analyzes academic papers' tasks and contributions against retrieved prior work. While this system identifies POTENTIAL overlaps and novel directions, ITS COVERAGE IS NOT EXHAUSTIVE AND JUDGMENTS ARE APPROXIMATE. These results are intended to assist human reviewers and SHOULD NOT be relied upon as a definitive verdict on novelty.
NOTE that some papers exist in multiple, slightly different versions (e.g., with different titles or URLs). The system may retrieve several versions of the same underlying work. The current automated pipeline does not reliably align or distinguish these cases, so human reviewers will need to disambiguate them manually.
If you have any questions, please contact: mingzhang23@m.fudan.edu.cn

Overview

Overall Novelty Assessment

DexMove introduces a tactile-guided framework for non-prehensile manipulation using dexterous hands, combining simulation-based trajectory generation with human tactile demonstrations captured via wearable sensors. The taxonomy places this work in the 'Purely Tactile In-Hand Rotation and Regrasping' leaf, which contains four papers including DexMove itself. This leaf sits within the broader 'Tactile-Based In-Hand Manipulation' branch, indicating a moderately populated research direction focused on touch-driven object repositioning. The leaf's scope emphasizes tactile-only sensing without visual feedback, distinguishing it from visuotactile approaches in sibling categories.

The taxonomy reveals neighboring research directions that contextualize DexMove's positioning. Adjacent leaves address deformable object manipulation, visuotactile fusion, and tactile skin-based translation, while a parallel branch explores non-prehensile pushing and pulling with geometry-aware methods. DexMove bridges these areas by applying tactile-only sensing to non-prehensile tasks, a combination less explored than either purely tactile in-hand grasping or vision-guided pushing. The 'Learning and Control Frameworks' branch shows related work on deep reinforcement learning and representation learning for tactile policies, suggesting DexMove's flow-based approach operates within an active methodological landscape.

Among twenty-two candidates examined through semantic search, none clearly refute DexMove's three core contributions. The framework itself (zero refutable candidates from ten examined) appears distinct in combining wrist-finger synergy for non-prehensile tasks. The hybrid data synthesis pipeline (zero from ten) and flow-based TaFo-Net policy (zero from two) similarly show no substantial prior overlap within this limited search scope. These statistics suggest novelty relative to the examined candidates, though the modest search scale (twenty-two papers versus thirty-nine in the full taxonomy) means unexplored prior work may exist beyond top semantic matches.

Based on the limited literature search, DexMove appears to occupy a relatively sparse intersection between tactile-only sensing and non-prehensile manipulation with dexterous hands. The taxonomy structure indicates this combination is less crowded than purely tactile grasping or geometry-aware pushing alone. However, the analysis covers top-K semantic matches and does not exhaustively survey all thirty-nine taxonomy papers or broader literature, leaving open the possibility of relevant work outside the examined candidate set.

Taxonomy

Core-task Taxonomy Papers: 39
Claimed Contributions: 3
Contribution Candidate Papers Compared: 22
Refutable Papers: 0

Research Landscape Overview

Core task: tactile-guided non-prehensile manipulation with dexterous hands. This field explores how multi-fingered robotic systems can manipulate objects without forming stable grasps, relying instead on tactile feedback to guide dynamic contact interactions. The taxonomy reveals a rich landscape organized around several complementary themes. Tactile-Based In-Hand Manipulation focuses on purely touch-driven strategies for rotating and repositioning objects within the hand, while Non-Prehensile Manipulation with Dexterous Hands examines pushing, rolling, and sliding primitives that exploit finger surfaces rather than enclosing grasps. Tactile Sensing and Feedback branches address sensor design and signal processing, and Learning and Control Frameworks investigate how reinforcement learning and model-based methods can acquire dexterous skills from tactile data. Additional branches cover grasp planning, hand dynamics, teleoperation interfaces, variable friction surfaces, vision-tactile integration, and soft robotic designs, collectively spanning the spectrum from foundational contact mechanics to modern learning-based approaches.

Within this landscape, a particularly active line of work centers on purely tactile in-hand rotation and regrasping, where systems must rely exclusively on touch signals to reorient objects. DexMove[0] sits squarely in this cluster, emphasizing tactile-only policies for non-prehensile repositioning tasks. Nearby efforts such as Tactile In-Hand Manipulation[1] and Touch Multi-Fingered RL[2] similarly explore touch-driven learning but may incorporate different sensor modalities or reward structures. DexTouch[3] also operates in this space, highlighting the trade-offs between pure tactile feedback and hybrid visuotactile approaches found in branches like Vision-Integrated Dexterous Hand Systems.
A recurring theme across these works is balancing the richness of tactile information against the complexity of contact dynamics, with open questions around sample efficiency, sim-to-real transfer, and generalization to diverse object geometries. DexMove[0] contributes to this dialogue by demonstrating how tactile signals alone can guide sophisticated non-prehensile maneuvers, positioning itself among recent touch-centric methods that push the boundaries of what dexterous hands can achieve without visual input.

Claimed Contributions

DexMove framework for tactile-guided non-prehensile manipulation with dexterous hands

The authors introduce DexMove, a complete framework that enables dexterous robotic hands to perform non-prehensile manipulation tasks using tactile guidance. The framework integrates simulation-based trajectory generation with human-demonstrated tactile data to train policies for robust object repositioning.

Retrieved candidate papers compared: 10
Hybrid data synthesis pipeline combining simulation and human tactile demonstrations

The authors propose a novel data acquisition paradigm that combines scalable simulation for generating diverse wrist-finger trajectories with a wearable device equipped with vision-based tactile sensors to capture real-world multi-finger contact data from human demonstrations, addressing the challenge of obtaining large-scale contact-aware datasets.

Retrieved candidate papers compared: 10
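One practical detail implied by this hybrid pipeline is aligning two asynchronous streams: simulated wrist-finger trajectories sampled on one clock and tactile frames captured on another. The sketch below is a hypothetical illustration of such alignment, not the paper's implementation; the `Sample` schema, array dimensions, and `align_by_time` helper are invented here for concreteness.

```python
from dataclasses import dataclass
import numpy as np

# Hypothetical per-episode schema for a contact-aware training sample:
# trajectories come from simulation, tactile frames from human demonstrations.
@dataclass
class Sample:
    wrist_pose: np.ndarray      # (T, 7) position + quaternion, from simulation
    finger_joints: np.ndarray   # (T, J) joint angles, from simulation
    tactile: np.ndarray         # (T, F, H, W) per-finger tactile images, from demos

def align_by_time(sim_t, sim_vals, demo_t):
    """Resample simulated signals onto the tactile-demo timestamps.

    Applies per-dimension linear interpolation so both streams share
    one time base before being packed into a Sample.
    """
    return np.stack(
        [np.interp(demo_t, sim_t, sim_vals[:, d]) for d in range(sim_vals.shape[1])],
        axis=1,
    )
```

For example, a simulated signal logged at 0 s, 1 s, 2 s can be resampled at the tactile timestamps 0.5 s and 1.5 s before both modalities are stacked into one sample.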
Flow-based policy with TaFo-Net for synergistic wrist-finger control

The authors develop a flow-matching based policy architecture that jointly controls the wrist and fingers for non-prehensile manipulation, incorporating TaFo-Net to learn spatiotemporal inter-finger force representations from human demonstrations for coordinated multi-contact control.

Retrieved candidate papers compared: 2
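The report only names the policy's components, but the flow-matching objective underlying such policies is simple to state: regress a velocity field that carries noise to expert actions along straight-line interpolation paths, then integrate that field at inference time. The toy sketch below illustrates the objective only; it is not DexMove's architecture (the linear model, dimensions, conditioning variable, and training loop are all invented for this example, and a real policy would use a neural network conditioned on tactile features).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a conditioning vector c (standing in for tactile/pose features)
# maps linearly to a target wrist-finger action a1.
DIM_C, DIM_A = 4, 6
W_true = rng.normal(size=(DIM_A, DIM_C))

def sample_batch(n):
    c = rng.normal(size=(n, DIM_C))
    return c, c @ W_true.T  # "expert" actions conditioned on c

# Linear velocity model v(x_t, t, c) = [x_t, t, c, 1] @ theta.
theta = np.zeros((DIM_A + 1 + DIM_C + 1, DIM_A))

def features(x, t, c):
    return np.concatenate([x, t, c, np.ones((len(x), 1))], axis=1)

def fm_loss_and_grad(theta, c, a1):
    n = len(c)
    a0 = rng.normal(size=a1.shape)   # noise endpoint of the path
    t = rng.uniform(size=(n, 1))
    xt = (1 - t) * a0 + t * a1       # straight-line interpolation x_t
    target = a1 - a0                 # conditional flow velocity along the path
    phi = features(xt, t, c)
    err = phi @ theta - target
    loss = float(np.mean(err ** 2))
    grad = 2 * phi.T @ err / (n * DIM_A)
    return loss, grad

# Plain gradient descent on the flow-matching objective.
losses = []
for _ in range(2000):
    c, a1 = sample_batch(256)
    loss, grad = fm_loss_and_grad(theta, c, a1)
    theta -= 0.05 * grad
    losses.append(loss)

def sample_action(c, steps=50):
    """Euler-integrate the learned velocity field from noise to an action."""
    x = rng.normal(size=(len(c), DIM_A))
    for i in range(steps):
        t = np.full((len(x), 1), i / steps)
        x = x + (1.0 / steps) * features(x, t, c) @ theta
    return x
```

The same recipe scales to the setting the report describes by replacing the linear model with a network and the toy conditioning vector with learned tactile-force representations; the objective and the Euler sampler are unchanged.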

Core Task Comparisons

Comparisons with papers in the same taxonomy category

Contribution Analysis

Detailed comparisons for each claimed contribution

Contribution

DexMove framework for tactile-guided non-prehensile manipulation with dexterous hands

The authors introduce DexMove, a complete framework that enables dexterous robotic hands to perform non-prehensile manipulation tasks using tactile guidance. The framework integrates simulation-based trajectory generation with human-demonstrated tactile data to train policies for robust object repositioning.

Contribution

Hybrid data synthesis pipeline combining simulation and human tactile demonstrations

The authors propose a novel data acquisition paradigm that combines scalable simulation for generating diverse wrist-finger trajectories with a wearable device equipped with vision-based tactile sensors to capture real-world multi-finger contact data from human demonstrations, addressing the challenge of obtaining large-scale contact-aware datasets.

Contribution

Flow-based policy with TaFo-Net for synergistic wrist-finger control

The authors develop a flow-matching based policy architecture that jointly controls the wrist and fingers for non-prehensile manipulation, incorporating TaFo-Net to learn spatiotemporal inter-finger force representations from human demonstrations for coordinated multi-contact control.
