Paper Title

Less Is More: Fast Multivariate Time Series Forecasting with Light Sampling-oriented MLP Structures

Paper Authors

Tianping Zhang, Yizhuo Zhang, Wei Cao, Jiang Bian, Xiaohan Yi, Shun Zheng, Jian Li

Paper Abstract

Multivariate time series forecasting has seen widely ranging applications in various domains, including finance, traffic, energy, and healthcare. To capture the sophisticated temporal patterns, plenty of research studies designed complex neural network architectures based on many variants of RNNs, GNNs, and Transformers. However, complex models are often computationally expensive and thus face a severe challenge in training and inference efficiency when applied to large-scale real-world datasets. In this paper, we introduce LightTS, a light deep learning architecture merely based on simple MLP-based structures. The key idea of LightTS is to apply an MLP-based structure on top of two delicate down-sampling strategies, including interval sampling and continuous sampling, inspired by a crucial fact that down-sampling time series often preserves the majority of its information. We conduct extensive experiments on eight widely used benchmark datasets. Compared with the existing state-of-the-art methods, LightTS demonstrates better performance on five of them and comparable performance on the rest. Moreover, LightTS is highly efficient. It uses less than 5% FLOPS compared with previous SOTA methods on the largest benchmark dataset. In addition, LightTS is robust and has a much smaller variance in forecasting accuracy than previous SOTA methods in long sequence forecasting tasks.
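To make the two down-sampling strategies in the abstract concrete, below is a minimal sketch of what interval sampling and continuous sampling could look like. It is not the authors' implementation: the function names, the `num_subseq` parameter, and the assumption that the sequence length is divisible by `num_subseq` are illustrative only; in LightTS, MLP-based structures are then applied on top of such sub-sequences.

```python
import numpy as np

def interval_sample(x, num_subseq):
    """Interval (strided) down-sampling: the i-th sub-sequence takes every
    num_subseq-th point starting at offset i (illustrative sketch, not the
    paper's code). Assumes len(x) is divisible by num_subseq.

    x: array of shape (T,). Returns shape (num_subseq, T // num_subseq).
    """
    T = x.shape[0]
    # reshape to (T // num_subseq, num_subseq), then transpose so that
    # row i collects the points at offsets i, i + num_subseq, i + 2*num_subseq, ...
    return x.reshape(T // num_subseq, num_subseq).T

def continuous_sample(x, num_subseq):
    """Continuous (chunked) down-sampling: the series is cut into contiguous
    chunks (illustrative sketch). Returns shape (num_subseq, T // num_subseq).
    """
    T = x.shape[0]
    return x.reshape(num_subseq, T // num_subseq)

if __name__ == "__main__":
    x = np.arange(12)
    print(interval_sample(x, 4))    # rows: [0 4 8], [1 5 9], [2 6 10], [3 7 11]
    print(continuous_sample(x, 4))  # rows: [0 1 2], [3 4 5], [6 7 8], [9 10 11]
```

One plausible reading of the abstract's claim that "down-sampling time series often preserves the majority of its information" is that strided sub-sequences keep the series' overall shape at a coarser resolution, while contiguous chunks keep short-term local patterns, so little is lost by processing the sub-sequences instead of the full sequence.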
