Enhancing Transformer-based models for Long Sequence Time Series Forecasting via Structured Matrix

Authors: Zhicheng Zhang and 4 other authors

Abstract: Transformer-based models for long sequence time series forecasting have recently demonstrated promising results. The self-attention mechanism, the core component of these models, shows great potential for capturing various dependencies among data points. Despite these advances, improving the efficiency of self-attention remains an open problem, and existing optimization methods face challenges of applicability and scalability for the design of future long sequence time series forecasting models. Hence, in this article, we propose a novel architectural framework that enhances Transformer-based models by replacing the self-attention and feed-forward layers with Surrogate Attention Blocks (SAB) and Surrogate Feed-Forward Neural Network Blocks (SFB). This substitution reduces both time and space complexity while preserving the layers' expressive power and architectural advantages, and we fully demonstrate its equivalence. Extensive experiments with 10 Transformer-based models across five distinct time series tasks show an average performance improvement of 12.4%, alongside a 61.3% reduction in parameter count.
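The abstract does not specify how SAB and SFB are constructed internally, so the following is only a minimal sketch of the general idea: replacing the quadratic self-attention and the wide feed-forward layer with cheaper structured-matrix surrogates. The low-rank token-mixing choice, the class names, and all dimensions below are illustrative assumptions, not the paper's actual design.

```python
import torch
import torch.nn as nn


class SurrogateAttentionBlock(nn.Module):
    """Hypothetical attention surrogate: mixes tokens along the sequence
    dimension with a learned low-rank structured matrix (U @ V), giving
    O(L*r) token-mixing cost instead of O(L^2) pairwise attention."""

    def __init__(self, seq_len: int, rank: int = 16):
        super().__init__()
        # Low-rank factors approximating a dense seq_len x seq_len mixer.
        self.U = nn.Parameter(torch.randn(seq_len, rank) / rank ** 0.5)
        self.V = nn.Parameter(torch.randn(rank, seq_len) / rank ** 0.5)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); mix along the sequence axis.
        return torch.einsum("lr,rm,bmd->bld", self.U, self.V, x)


class SurrogateFeedForwardBlock(nn.Module):
    """Hypothetical FFN surrogate: a narrow bottleneck MLP in place of the
    usual 4x-expansion feed-forward layer, trading width for parameters."""

    def __init__(self, d_model: int, bottleneck: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_model, bottleneck),
            nn.GELU(),
            nn.Linear(bottleneck, d_model),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


# Usage sketch: these blocks would slot into a Transformer encoder layer
# where self-attention and the feed-forward network normally sit.
x = torch.randn(4, 96, 64)             # (batch, seq_len, d_model)
y = SurrogateAttentionBlock(96)(x)     # token mixing in O(L*r)
y = SurrogateFeedForwardBlock(64)(y)   # channel mixing with fewer parameters
```

The key design point the abstract emphasizes is that the replacement preserves the expressive power of the original layers while cutting both time and space complexity; the low-rank factorization above is just one structured-matrix family that has this flavor.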

Submission history

From: Zhicheng Zhang
[v1] Tue, 21 May 2024 02:37:47 UTC (9,122 KB)
[v2] Wed, 22 May 2024 12:12:15 UTC (1 KB) (withdrawn)
[v3] Thu, 24 Oct 2024 01:52:17 UTC (4,282 KB)
[v4] Mon, 16 Dec 2024 13:47:34 UTC (6,909 KB)
