Automatic Stage Lighting Control: Is it a Rule-Driven Process or Generative Task?
Overview
Taxonomy
Research Landscape Overview
Claimed Contributions
The authors reconceptualize Automatic Stage Lighting Control (ASLC) as an art-content generation task rather than a classification-and-mapping problem. In this view, lighting control is a creative process learned directly from the work of professional lighting engineers rather than encoded in predefined rules.
The authors introduce Skip-BART, an adapted BART model that takes music audio as input and generates light hue and value as output. The framework incorporates a novel skip-connection module that strengthens the correspondence between music and light at the level of fine-grained frame grids, along with pre-training and transfer-learning mechanisms.
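The shape of such a model can be sketched as a BART-style encoder-decoder whose decoder re-injects the frame-aligned audio features before the output heads. This is a minimal illustration only: all dimensions, layer counts, bin counts, and the exact fusion point of the skip connection are assumptions, not the paper's reported design.

```python
import torch
import torch.nn as nn

class SkipBARTSketch(nn.Module):
    """BART-style encoder-decoder for music-to-light generation.
    Hyperparameters and the skip-connection placement are illustrative
    assumptions, not the values used by the Skip-BART paper."""

    def __init__(self, d_audio=64, d_model=128, n_hue=360, n_val=256):
        super().__init__()
        self.audio_proj = nn.Linear(d_audio, d_model)  # per-frame audio features -> model dim
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2)
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2)
        self.hue_embed = nn.Embedding(n_hue, d_model)  # previously generated hue bins
        self.val_embed = nn.Embedding(n_val, d_model)  # previously generated value bins
        self.hue_head = nn.Linear(d_model, n_hue)
        self.val_head = nn.Linear(d_model, n_val)

    def forward(self, audio_frames, prev_hue, prev_val):
        # audio_frames: (B, T, d_audio); prev_hue, prev_val: (B, T) integer bins
        enc_in = self.audio_proj(audio_frames)
        memory = self.encoder(enc_in)
        tgt = self.hue_embed(prev_hue) + self.val_embed(prev_val)
        T = tgt.size(1)
        causal = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        hidden = self.decoder(tgt, memory, tgt_mask=causal)
        # Skip connection: re-inject the frame-aligned audio features so each
        # output frame stays tied to its own slice of the music.
        hidden = hidden + enc_in
        return self.hue_head(hidden), self.val_head(hidden)
```

At each frame the decoder conditions on the lights generated so far (via the causal mask) while the skip connection keeps the per-frame music features available directly at the output, which is one plausible reading of "enhancing the relationship between music and light within fine-grained frame grids."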
The authors create the first stage-lighting dataset, RPMC-L2 (Rock, Punk, Metal, and Core - Livehouse Lighting), using an automatic label-generation method applied to video data. The dataset addresses the scarcity of training data in ASLC and supports both model training and evaluation.
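The idea of automatic label generation from video can be illustrated by collapsing each frame to a discrete (hue, value) pair via HSV. This is a sketch under simple assumptions (mean frame color, uniform bins); the actual RPMC-L2 pipeline is not specified here.

```python
import colorsys
import numpy as np

def frame_light_label(rgb_frame, n_hue=360, n_val=256):
    """Collapse one video frame (H, W, 3 uint8 RGB) to a discrete
    (hue_bin, value_bin) label. Averaging the whole frame to a single
    color is an illustrative simplification, not the paper's method."""
    mean_rgb = rgb_frame.reshape(-1, 3).mean(axis=0) / 255.0
    h, _, v = colorsys.rgb_to_hsv(*mean_rgb)  # h, v in [0, 1]
    return int(h * n_hue) % n_hue, min(int(v * n_val), n_val - 1)
```

Run per video frame at a fixed rate, this yields one lighting label per frame grid, aligned with the music audio extracted from the same video, which is the kind of frame-level supervision the contribution describes.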
Core Task Comparisons
Comparisons with papers in the same taxonomy category
Contribution Analysis
Detailed comparisons for each claimed contribution
Framing ASLC as a generative task rather than rule-driven classification
The authors reconceptualize Automatic Stage Lighting Control (ASLC) as an art-content generation task rather than a classification-and-mapping problem. In this view, lighting control is a creative process learned directly from the work of professional lighting engineers rather than encoded in predefined rules.
[13] Preliminary Study of Effects of AI Digital Generation on Theater Lighting Design
[15] Cross-Modal Metrics for Capturing Correspondences Between Music Audio and Stage Lighting Signals
[17] Agentic VJ System - Real-time Visual Generation with Multi-modal Agents for Live Performances
[47] MetaMGC: a music generation framework for concerts in metaverse
[48] Generative theatre of totality
[49] AI-Driven Music Playlist Generation
[50] Algorithms and Light
Skip-BART: end-to-end deep learning framework with skip-connection mechanism
The authors introduce Skip-BART, an adapted BART model that takes music audio as input and generates light hue and value as output. The framework incorporates a novel skip-connection module that strengthens the correspondence between music and light at the level of fine-grained frame grids, along with pre-training and transfer-learning mechanisms.
RPMC-L2: first stage lighting dataset with automatic label generation
The authors create the first stage-lighting dataset, RPMC-L2 (Rock, Punk, Metal, and Core - Livehouse Lighting), using an automatic label-generation method applied to video data. The dataset addresses the scarcity of training data in ASLC and supports both model training and evaluation.