LMask: Learn to Solve Constrained Routing Problems with Lazy Masking

ICLR 2026 Conference Submission · Anonymous Authors
Keywords: Neural Combinatorial Optimization, Routing Problem, Deep Learning
Abstract:

Routing problems are canonical combinatorial optimization tasks with wide-ranging applications in logistics, transportation, and supply chain management. However, solving these problems becomes significantly more challenging when complex constraints are involved. In this paper, we propose LMask, a novel learning framework that uses dynamic masking to generate high-quality feasible solutions for constrained routing problems. LMask introduces the LazyMask decoding method, which lazily refines feasibility masks via a backtracking mechanism. In addition, it employs a refinement intensity embedding to encode the search trace into the model, mitigating representation ambiguities induced by backtracking. To further reduce sampling cost, LMask imposes a backtracking budget during decoding; the infeasibility this budget can introduce is counteracted during training by penalizing constraint violations in the loss function. We provide theoretical guarantees for the validity and probabilistic optimality of our approach. Extensive experiments on the traveling salesman problem with time windows (TSPTW) and the TSP with draft limits (TSPDL) demonstrate that LMask achieves state-of-the-art feasibility rates and solution quality, outperforming existing neural methods.

Disclaimer
This report is AI-generated using large language models and WisPaper (a scholarly search engine). It analyzes a paper's tasks and contributions against retrieved prior work. While the system identifies potential overlaps and novel directions, its coverage is not exhaustive and its judgments are approximate. These results are intended to assist human reviewers and should not be relied upon as a definitive verdict on novelty.
Note that some papers exist in multiple, slightly different versions (e.g., with different titles or URLs), and the system may retrieve several versions of the same underlying work. The current automated pipeline does not reliably align or distinguish these cases, so human reviewers will need to disambiguate them manually.
If you have any questions, please contact: mingzhang23@m.fudan.edu.cn

Overview

Overall Novelty Assessment

The paper proposes LMask, a neural constructive framework for constrained routing problems that employs dynamic masking with backtracking to generate feasible solutions. It resides in the 'Dynamic Masking and Feasibility Enforcement' leaf under 'Constraint Handling and Feasibility Mechanisms', which contains only two papers total (including this one). This places the work in a relatively sparse research direction within the broader taxonomy of fifty papers, suggesting that explicit feasibility enforcement through learned masking remains an underexplored approach compared to hybrid search methods or general-purpose architectures.

The taxonomy reveals that most constraint-handling research has gravitated toward hybrid neural-classical methods (e.g., Neural LNS, column generation) or problem-specific adaptations (e.g., time windows, pickup-delivery). The sibling category 'Constraint-Specific Neural Adaptations' contains two papers analyzing particular constraint types, while neighboring branches address training paradigms and architecture choices. LMask diverges from these directions by focusing on construction-time feasibility rather than post-hoc repair or constraint-specific encodings, positioning it at the intersection of pure neural construction and explicit constraint satisfaction.

Of the thirty candidates examined (ten per contribution), the LazyMask decoding algorithm shows the most overlap with prior work: two of its ten candidates appear to provide refutable prior art. The refinement intensity embedding and the theoretical guarantees show no clear refutation among their ten candidates each. This suggests the backtracking mechanism may have precedent within the limited search scope, while the embedding strategy and formal analysis appear more distinctive. Note that the analysis covers only top-K semantic matches and citation expansion, not an exhaustive literature review.

Given the sparse taxonomy leaf and limited search scope, the work appears to occupy a relatively novel position within constraint-aware neural routing. The backtracking component has some prior overlap among examined candidates, but the overall framework combining lazy masking, refinement embeddings, and theoretical guarantees represents a coherent contribution to an underexplored direction. The assessment is necessarily bounded by the thirty-candidate search scope.

Taxonomy

Core-task taxonomy papers: 50
Claimed contributions: 3
Contribution candidate papers compared: 30
Refutable papers: 2

Research Landscape Overview

Core task: solving constrained routing problems with neural constructive methods. The field has evolved into a rich landscape organized along several complementary dimensions. Neural Construction Architectures and Training Paradigms explore foundational encoder-decoder designs and learning strategies, while Constraint Handling and Feasibility Mechanisms address the critical challenge of ensuring solutions respect capacity, time windows, and other hard constraints through techniques such as dynamic masking and feasibility enforcement. Hybrid Neural-Classical Search Methods combine learned heuristics with traditional optimization tools such as large neighborhood search (Neural LNS[7]) or column generation (Neural Column Generation[2]), bridging data-driven and exact approaches. Problem Variant Specializations and Domain-Specific Applications tackle diverse settings ranging from pickup-and-delivery to electric vehicle routing (EV Routing Survey[21]), waste collection (Waste Collection VRP[27]), and even circuit design (Circuit Routing CNN[42]). Meanwhile, Scalability and Real-World Deployment and Specialized Optimization Techniques focus on making these methods practical at industrial scale and under real-world conditions (Real World Routing[28]).

Recent work reveals a tension between generality and constraint complexity. Many studies pursue broadly applicable architectures (Omni Generalizable[37], Generalizable Neural Solvers[49]), yet handling intricate constraint combinations remains challenging, as highlighted by Complex Constraints VRP[1] and by efforts on multiple time windows (Multiple Time Windows[14]) or heterogeneous fleets (Minmax Heterogeneous VRP[5]).

LMask[0] sits squarely within the Constraint Handling branch, specifically targeting Dynamic Masking and Feasibility Enforcement. Its emphasis on learned masking strategies to maintain feasibility aligns closely with Complex Constraints VRP[1], which also grapples with enforcing diverse hard constraints during neural construction. Compared to hybrid approaches such as Neural LNS[7] that iteratively refine solutions, LMask[0] focuses on preventing infeasible actions at construction time, offering a complementary perspective on how neural methods can respect problem structure without relying on post-hoc repair or classical subroutines.

Claimed Contributions

LazyMask decoding algorithm with backtracking mechanism

The authors propose a novel decoding algorithm that adaptively refines feasibility masks through backtracking rather than relying solely on one-pass forward construction. This mechanism enables efficient generation of feasible solutions for routing problems with complex constraints by dynamically adjusting the set of valid actions during route construction.

10 retrieved papers
Can Refute
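The idea described above can be pictured with a minimal sketch: construct a route step by step, and when no feasible continuation remains, undo the last decision, ban it in that step's mask, and re-decide. The function names, the greedy node choice, and the `feasible`/`score` interface are our own illustrative assumptions, not the paper's actual algorithm.

```python
def lazy_mask_decode(n, feasible, score, budget=50):
    """Construct a route over nodes 0..n-1, starting at depot 0.

    feasible(tour, j): cheap check that appending j keeps the partial
        tour feasible (e.g. a time-window or draft-limit test).
    score(tour, j):    model preference ("logit") for visiting j next.
    budget:            maximum number of backtracking steps allowed.
    """
    tour = [0]
    banned = [set() for _ in range(n + 1)]   # lazily refined masks, one per step
    backtracks = 0
    while len(tour) < n:
        step = len(tour)
        candidates = [j for j in range(n)
                      if j not in tour and j not in banned[step]
                      and feasible(tour, j)]
        if candidates:
            # greedy here; the actual method samples from a learned policy
            tour.append(max(candidates, key=lambda j: score(tour, j)))
        else:
            if len(tour) == 1 or backtracks >= budget:
                return None, backtracks       # give up within the budget
            bad = tour.pop()                  # undo the last decision and
            banned[len(tour)].add(bad)        # refine that step's mask
            for s in range(len(tour) + 1, n + 1):
                banned[s].clear()             # deeper masks are now stale
            backtracks += 1
    return tour, backtracks
```

On a toy time-window instance, a greedy first attempt can dead-end; the decoder then recovers by banning the offending choice at the step where it was made and re-deciding from that prefix, rather than restarting the whole rollout.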
Refinement intensity embedding (RIE)

The authors introduce a novel embedding technique that encodes information about the search trace (including backtracking history) into the model. This addresses representation ambiguities that arise when the model cannot distinguish between forward construction and backward correction states, thereby enhancing the model's learning capability.

10 retrieved papers
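One plausible way to realize such an embedding is sketched below: a per-node backtracking count is clipped into a small range and used to index a table of learned vectors that is added to the encoder's node embeddings, so states reached via backtracking look different from ordinary forward-construction states. The names, shapes, and clipping scheme here are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, max_intensity = 8, 4           # embedding width, clip value for counts
rie_table = rng.normal(size=(max_intensity + 1, d_model))  # learned in practice

def apply_rie(node_emb, backtrack_counts):
    """node_emb: (n, d_model) encoder outputs; backtrack_counts: (n,) ints.

    Adds a refinement-intensity vector to each node embedding, injecting
    the search trace (how often each node was involved in backtracking).
    """
    idx = np.clip(backtrack_counts, 0, max_intensity)
    return node_emb + rie_table[idx]

node_emb = rng.normal(size=(5, d_model))
counts = np.array([0, 0, 3, 1, 9])      # node 4's count is clipped to 4
out = apply_rie(node_emb, counts)
```

Clipping bounds the table size while still distinguishing "untouched" nodes from heavily revisited ones, which is the ambiguity the contribution claims to resolve.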
Theoretical guarantees for validity and probabilistic optimality

The authors establish rigorous mathematical foundations demonstrating that their LazyMask algorithm always generates feasible solutions and assigns non-zero probability to all feasible solutions. They also prove that their probabilistic model with entropy regularization effectively approximates the target distribution, providing performance guarantees for solution quality.

10 retrieved papers
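The distribution-approximation claim can be made concrete with one standard formalization (our reading, not necessarily the paper's exact theorem): over distributions supported on the feasible set, an entropy-regularized objective is minimized by a Boltzmann distribution, which assigns nonzero probability to every feasible solution and concentrates on optima as the temperature vanishes.

```latex
% Illustrative entropy-regularized formulation; the paper's precise
% statement may differ. Let \mathcal{F} be the feasible set and c the cost.
\min_{p}\; \mathbb{E}_{x \sim p}[c(x)] - \lambda H(p)
\quad\Longrightarrow\quad
p^{*}_{\lambda}(x) = \frac{e^{-c(x)/\lambda}}{\sum_{y \in \mathcal{F}} e^{-c(y)/\lambda}},
\qquad p^{*}_{\lambda}(x) > 0 \;\; \forall x \in \mathcal{F},
```

with $p^{*}_{\lambda}$ concentrating on the minimizers of $c$ as $\lambda \to 0$, which is the usual sense in which entropy regularization yields a probabilistic optimality guarantee.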

Core Task Comparisons

Comparisons with papers in the same taxonomy category

Contribution Analysis

Detailed comparisons for each claimed contribution

Contribution

LazyMask decoding algorithm with backtracking mechanism


Contribution

Refinement intensity embedding (RIE)


Contribution

Theoretical guarantees for validity and probabilistic optimality
