This paper proposes a multi-feature collaborative computational neural architecture search (MFCC-NAS) method for classifying continuous time series signals. It is designed to increase the value density of the signals, reduce the subjectivity inherent in manual model architecture design, and lower the computational cost of evaluating candidate architectures. We first design a representation of sample richness and a metric of each sample's contribution, which together enrich the features of the samples and provide a basis of dense, high-value data. We then develop an MFCC-NAS method that efficiently constructs a model for classifying continuous time series signals by combining a cell-based search space designed for global–local feature separation, an architecture search strategy for the collaborative computation of global and local features, and a low-cost, robust cascading strategy for evaluating architecture performance. Finally, we design a multi-domain collaborative fusion mechanism that fully integrates convolutional visual features from different spatial domains to obtain a comprehensive feature representation of the samples. We evaluated the proposed method through comparative and generalization experiments on a welding-defect dataset and the Yaseen dataset. After search times of 1.52 h and 1.21 h, respectively, the resulting models achieved classification accuracies above 98% on both datasets while maintaining a compact parameter size and short inference time. These results collectively demonstrate the effectiveness and strong generalization capability of the proposed MFCC-NAS method.
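To make the "global–local feature separation" idea concrete, the sketch below shows one possible candidate cell for a 1-D signal: a local branch with a small-kernel convolution and a global branch with a dilated convolution, fused by channel concatenation. This is a minimal illustrative sketch only; the paper does not publish code, so the class name `GlobalLocalCell`, the branch operations, and the channel split are all hypothetical choices, not the authors' actual search-space primitives.

```python
# Hypothetical sketch of one global-local cell; not the authors' implementation.
import torch
import torch.nn as nn

class GlobalLocalCell(nn.Module):
    """One candidate cell: a local branch (small-kernel conv) and a global
    branch (dilated conv with a wide receptive field), fused by concatenation."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        half = out_ch // 2
        # Local branch: fine-grained, short-range patterns in the signal.
        self.local = nn.Sequential(
            nn.Conv1d(in_ch, half, kernel_size=3, padding=1),
            nn.BatchNorm1d(half), nn.ReLU(),
        )
        # Global branch: dilation widens the receptive field to capture
        # long-range context, one plausible reading of a "global" feature.
        self.global_ = nn.Sequential(
            nn.Conv1d(in_ch, half, kernel_size=3, padding=4, dilation=4),
            nn.BatchNorm1d(half), nn.ReLU(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time); both branches preserve the time length,
        # and their outputs are concatenated along the channel axis.
        return torch.cat([self.local(x), self.global_(x)], dim=1)

if __name__ == "__main__":
    cell = GlobalLocalCell(in_ch=1, out_ch=32)
    signal = torch.randn(8, 1, 1024)  # a batch of 1-D time series
    print(cell(signal).shape)         # torch.Size([8, 32, 1024])
```

In an actual cell-based NAS, a search strategy would choose among several such branch operations per edge; the fixed pair shown here is only meant to illustrate how global and local paths can be computed in parallel and fused.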