Transduction is All You Need for Structured Data Workflows

ICLR 2026 Conference Submission
Anonymous Authors
Keywords: Large Language Models, Agentics, Agent, Structured Data, Software
Abstract:

This paper introduces Agentics, a functional agentic AI framework for building LLM-based structured data workflow pipelines. Designed for both research and practical applications, Agentics offers a new data-centric paradigm in which agents are embedded within data types, enabling logical transduction between structured states. This design shifts the focus toward principled data modeling, providing a declarative language where data types are directly exposed to large language models and composed through transductions triggered by type connections. We present a range of structured data workflow tasks and empirical evidence demonstrating the effectiveness of this approach, including data wrangling, text-to-SQL parsing, and domain-specific multiple-choice question answering.

Disclaimer
This report is AI-generated using large language models and WisPaper (a scholarly search engine). It analyzes a paper's tasks and contributions against retrieved prior work. While the system identifies potential overlaps and novel directions, its coverage is not exhaustive and its judgments are approximate. These results are intended to assist human reviewers and should not be relied upon as a definitive verdict on novelty.
Note that some papers exist in multiple, slightly different versions (e.g., with different titles or URLs). The system may retrieve several versions of the same underlying work. The current automated pipeline does not reliably align or distinguish these cases, so human reviewers will need to disambiguate them manually.
If you have any questions, please contact: mingzhang23@m.fudan.edu.cn

Overview

Overall Novelty Assessment

The paper introduces Agentics, a functional framework embedding agents within data types to enable logical transduction between structured states in LLM-based pipelines. According to the taxonomy, this work occupies the 'Data-Centric Agentic Pipelines with Type Composition' leaf under 'Functional Agentic Frameworks for Structured Data Transformation'. Notably, this leaf contains only the original paper itself with no sibling papers, indicating a relatively sparse research direction within the examined literature. The broader parent branch also appears limited, suggesting this type-driven functional approach represents an emerging or underexplored paradigm.

The taxonomy reveals two neighboring branches: 'Schema-Based Symbolic Integration with LLM Methods' and 'Domain-Specific Structured Workflow Applications'. The schema-based branch includes dialogue management and business logic automation systems that combine symbolic methods with LLM pattern transduction, while the domain-specific branch focuses on physics-aware multi-agent RAG pipelines. The taxonomy explicitly excludes symbolic pattern matching without type-driven transduction from the functional frameworks branch, positioning Agentics as distinct from schema-centric approaches that foreground predefined ontologies rather than flexible type composition and functional programming principles.

Among thirty candidates examined, none were found to clearly refute any of the three core contributions: the Agentics framework itself, the Logical Transduction Algebra, and the asynchronous map-reduce programming model. Each contribution was evaluated against ten candidates with zero refutable matches identified. This suggests that within the limited search scope, the specific combination of functional type composition, embedded agents in data types, and declarative transduction mechanisms appears relatively novel. However, the small candidate pool and sparse taxonomy structure indicate this assessment reflects top-thirty semantic matches rather than exhaustive field coverage.

The analysis indicates the work occupies a sparsely populated research direction within the examined literature, with no direct siblings in its taxonomy leaf and limited prior work overlap across all contributions. The positioning between schema-driven and domain-specific approaches suggests potential novelty in applying functional programming principles to LLM-based data pipelines, though the thirty-candidate search scope leaves open questions about broader field coverage and related work in adjacent communities not captured by semantic search.

Taxonomy

Core-task Taxonomy Papers: 3
Claimed Contributions: 3
Contribution Candidate Papers Compared: 30
Refutable Papers: 0

Research Landscape Overview

Core task: Building LLM-based structured data workflow pipelines through logical transduction. The field organizes around three main branches that reflect different emphases in combining large language models with structured data processing. The first branch, Functional Agentic Frameworks for Structured Data Transformation, focuses on compositional and type-safe pipeline architectures where agents orchestrate data flows using functional programming principles. The second branch, Schema-Based Symbolic Integration with LLM Methods, emphasizes the interplay between formal schemas and neural language understanding, often leveraging symbolic reasoning to guide or constrain LLM outputs. The third branch, Domain-Specific Structured Workflow Applications, targets concrete use cases such as dialogue management, scientific data processing, or business logic enforcement, where domain constraints shape the design of LLM-driven pipelines. Together, these branches illustrate a spectrum from general-purpose frameworks to schema-aware integration strategies and specialized application contexts.

Within this landscape, a handful of works highlight contrasting design philosophies and open questions about how best to marry symbolic structure with neural flexibility. For instance, Schema Dialogue Management[2] demonstrates schema-driven orchestration in conversational settings, while Privacy Business Logic[3] explores constraint enforcement in business workflows, both illustrating domain-specific adaptations. Transduction Structured Data[0] sits squarely within the Data-Centric Agentic Pipelines with Type Composition cluster, emphasizing logical transduction as a compositional mechanism for type-safe data transformations. Compared to schema-centric approaches like Schema Dialogue Management[2], which foreground predefined ontologies, Transduction Structured Data[0] appears to prioritize functional composition and type inference as organizing principles. This positioning suggests ongoing exploration of whether rigid schemas or flexible type systems better support robust, maintainable LLM-based data pipelines across diverse application domains.

Claimed Contributions

Agentics functional agentic AI framework

The authors propose Agentics, a framework that embeds agents within data types and enables logical transduction between structured states. This data-centric paradigm shifts focus toward principled data modeling, where data types are exposed to LLMs and composed through transductions triggered by type connections.
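The report does not reproduce the framework's API, but the data-centric idea can be sketched in plain Python. In the sketch below, all names (`transduce`, the `Review` and `Sentiment` types, the toy agent) are illustrative assumptions, not the paper's actual interface, and a rule-based function stands in for the LLM call that would normally populate the target type's fields.

```python
from dataclasses import dataclass, fields

# Hypothetical sketch of "agents embedded in data types": a transduction
# fills the fields of a target type from the structured state of a source.

@dataclass
class Review:
    text: str

@dataclass
class Sentiment:
    label: str

def transduce(source, target_type, agent):
    """Logical transduction: ask an agent to produce values for every
    field of `target_type`, given the source state, then instantiate it."""
    values = agent(source, [f.name for f in fields(target_type)])
    return target_type(**values)

def toy_sentiment_agent(source, field_names):
    # Rule-based stand-in for an LLM that returns schema-constrained output.
    label = "positive" if "great" in source.text else "negative"
    return {"label": label}

result = transduce(Review(text="great product"), Sentiment, toy_sentiment_agent)
print(result.label)  # positive
```

The point of the sketch is that the pipeline is declared by connecting types (`Review` to `Sentiment`), with the agent invoked as a side effect of that type connection.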

10 retrieved papers
Logical Transduction Algebra (LTA)

The authors develop a typed, compositional calculus called Logical Transduction Algebra that provides formal semantics for LLM-powered pipelines. LTA treats agents as stateless transducers operating over well-defined data types, enabling modularity, parallelism, and schema-constrained transduction.
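The algebraic claim can be illustrated without the paper's formalism: if an agent is a stateless function between well-defined types, pipelines compose like ordinary typed functions. The snippet below is a minimal sketch under that reading; the names and operators are assumptions, not LTA's actual notation.

```python
from typing import Callable, List, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

def compose(f: Callable[[A], B], g: Callable[[B], C]) -> Callable[[A], C]:
    """Sequential composition of two stateless transducers: (g . f)."""
    return lambda a: g(f(a))

# Two toy transducers over well-defined types: str -> List[str] -> int.
tokenize: Callable[[str], List[str]] = lambda s: s.split()
count: Callable[[List[str]], int] = len

pipeline = compose(tokenize, count)
print(pipeline("transduction is all you need"))  # 5
```

Statelessness is what makes this safe: because each transducer depends only on its typed input, composition order is the whole semantics of the pipeline, which is the property the report says enables modularity and parallelism.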

10 retrieved papers
Asynchronous map-reduce programming model

The framework provides higher-order operators (aMap and Reduce) that enable asynchronous execution of transductions. Because transductions are stateless, aMap can execute independently on each state in parallel, while Reduce aggregates states synchronously, supporting scalable structured data workflows.
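A minimal sketch of this execution model, using Python's asyncio: the operator names below (`amap`, the toy `square` transduction) are hypothetical stand-ins for the framework's `aMap` and `Reduce`, and a no-op `sleep` stands in for an asynchronous LLM call.

```python
import asyncio
from functools import reduce

async def amap(transduction, states):
    """Apply an async, stateless transduction to every state in parallel."""
    return await asyncio.gather(*(transduction(s) for s in states))

async def square(state: int) -> int:
    await asyncio.sleep(0)  # stands in for an async LLM call
    return state * state

async def main():
    mapped = await amap(square, [1, 2, 3, 4])   # asynchronous map phase
    total = reduce(lambda a, b: a + b, mapped)  # synchronous aggregation
    return total

print(asyncio.run(main()))  # 30
```

Because `square` carries no shared state, `amap` can launch one task per input state and gather the results in order, while the reduce step runs once all states are available, mirroring the parallel-map, synchronous-reduce split described above.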

10 retrieved papers

Core Task Comparisons

Comparisons with papers in the same taxonomy category

Within the taxonomy built over the current TopK core-task papers, the original paper is assigned to a leaf with no direct siblings and no cousin branches under the same grandparent topic. In this retrieved landscape, it appears structurally isolated, which is one partial signal of novelty, but still constrained by search coverage and taxonomy granularity.

Contribution Analysis

Detailed comparisons for each claimed contribution

Contribution

Agentics functional agentic AI framework

The authors propose Agentics, a framework that embeds agents within data types and enables logical transduction between structured states. This data-centric paradigm shifts focus toward principled data modeling, where data types are exposed to LLMs and composed through transductions triggered by type connections.

Contribution

Logical Transduction Algebra (LTA)

The authors develop a typed, compositional calculus called Logical Transduction Algebra that provides formal semantics for LLM-powered pipelines. LTA treats agents as stateless transducers operating over well-defined data types, enabling modularity, parallelism, and schema-constrained transduction.

Contribution

Asynchronous map-reduce programming model

The framework provides higher-order operators (aMap and Reduce) that enable asynchronous execution of transductions. Because transductions are stateless, aMap can execute independently on each state in parallel, while Reduce aggregates states synchronously, supporting scalable structured data workflows.

Transduction is All You Need for Structured Data Workflows | Novelty Validation