Automatically discovering mathematical expressions that precisely describe natural phenomena is a challenging problem, and Symbolic Regression (SR) is one of the most widely used techniques for it. Mainstream SR algorithms search for an optimal symbolic tree, but the growing complexity of the tree structure often limits their performance. Inspired by neural networks, symbolic networks have emerged as a promising new paradigm. However, existing symbolic networks still face challenges: the binary nonlinear operators {×, ÷} cannot be naturally extended to the multivariate case, and training with a fixed architecture often leads to higher complexity and overfitting. In this work, we propose a Unified Symbolic Network (UniSymNet) that unifies the nonlinear binary operators into nested unary operators, thereby transforming them into multivariate operators. We rigorously prove that the proposed UniSymNet achieves lower complexity and stronger expressivity. Unlike conventional neural network training, we design a bi-level optimization framework: the outer level pre-trains a Transformer with a sparse label-encoding scheme to guide the selection of the UniSymNet structure, while the inner level employs objective-specific strategies to optimize the network parameters. This allows the UniSymNet structure to adapt flexibly to different data, reducing expression complexity. UniSymNet is evaluated on the low-dimensional Standard Benchmarks and the high-dimensional SRBench, and achieves a high symbolic solution rate, high fitting accuracy, and relatively low expression complexity.
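The unification of {×, ÷} into nested unary operators can be illustrated with the classical exp–log identity. The sketch below is our own illustration of that identity, not the paper's implementation, and it assumes strictly positive inputs (handling signs and zeros requires extra care, e.g. via |x| and a sign track):

```python
import math

def mul_via_unary(*xs):
    # Product of positive inputs using only the unary operators exp and log:
    #   x1 * x2 * ... * xn = exp(log(x1) + log(x2) + ... + log(xn))
    # The binary operator x thus extends naturally to arbitrarily many inputs.
    return math.exp(sum(math.log(x) for x in xs))

def div_via_unary(x, y):
    # Division uses the same trick with a sign flip on the log term:
    #   x / y = exp(log(x) - log(y))
    return math.exp(math.log(x) - math.log(y))
```

Because the inner sum is an ordinary weighted linear combination, such nested unary operators fit directly into a feed-forward network layer, which is what makes the multivariate extension natural.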
