Enhance time-series forecasting performance with one line of code.
This repo is the official implementation of the paper FreDF: Learning to Forecast in the Frequency Domain.
We provide the running scripts to reproduce our experiments in /scripts, covering three mainstream tasks: long-term forecasting, short-term forecasting, and imputation. We also provide scripts to reproduce the baselines, which are mostly inherited from the comprehensive Time-Series-Library benchmark.
🤗 If you find FreDF useful, please star this repo to help others discover it, and kindly cite FreDF in your publications if it helps your research. This means a lot to our open-source work. Thank you!
🚩News (2024.12) FreDF has been accepted as a poster at ICLR 2025: [paper] [slide] [Video]
🚩News (2024.2) A blog post in Chinese introducing this work is available.
🚩News (2023.12) The implementation of FreDF is released, with scripts for three tasks.
We maintain an updated leaderboard for time series analysis models, with a special focus on learning objectives. As of December 2024, the top-performing models across different tasks are:
| Model<br>Ranking | Long-term<br>Forecasting | Short-term<br>Forecasting | Imputation |
|---|---|---|---|
| 🥇 1st | FreDF + iTrans. | FreDF + FreTS | FreDF + iTrans. |
Note: We will keep updating this leaderboard. If you have proposed an advanced model, please send us your paper/code link or open a pull request; we will add it to this repo and update the leaderboard as soon as possible.
Models compared in this leaderboard. ☑ means the model's code is already included in this repo.
- ☑ iTransformer - iTransformer: Inverted Transformers Are Effective for Time Series Forecasting [arXiv 2023] [Code].
- ☑ PatchTST - A Time Series is Worth 64 Words: Long-term Forecasting with Transformers [ICLR 2023] [Code].
- ☑ TimesNet - TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis [ICLR 2023] [Code].
- ☑ DLinear - Are Transformers Effective for Time Series Forecasting? [AAAI 2023] [Code].
- ☑ FEDformer - FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting [ICML 2022] [Code].
- ☑ Autoformer - Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting [NeurIPS 2021] [Code].
- ☑ Transformer - Attention is All You Need [NeurIPS 2017] [Code].
- ☑ TiDE - Long-term Forecasting with TiDE: Time-series Dense Encoder [arXiv 2023] [Code].
- ☑ Crossformer - Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting [ICLR 2023][Code].
- Implement FreDF by adapting the following snippet in your training pipeline:

```python
# Canonical temporal loss: mean squared error in the time domain
loss_tmp = ((outputs - batch_y) ** 2).mean()
# Proposed frequency loss: mean absolute error between the rFFTs of
# predictions and labels along the time axis (dim=1)
loss_feq = (torch.fft.rfft(outputs, dim=1) - torch.fft.rfft(batch_y, dim=1)).abs().mean()
```

Note: the frequency loss can be used individually or fused with the temporal loss using tuned relative weights. Both variants yield performance gains; see the ablation study in our paper.
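The fused objective described above can be sketched as a self-contained function. The weight `alpha`, the function name `fredf_loss`, and the tensor shapes are illustrative assumptions, not the repo's exact API; the paper tunes the relative weight per setting.

```python
import torch


def fredf_loss(outputs: torch.Tensor, batch_y: torch.Tensor, alpha: float = 0.5) -> torch.Tensor:
    """Blend the temporal MSE with the frequency-domain MAE.

    `alpha` is a hypothetical relative weight: 0 recovers the purely
    temporal loss, 1 the purely frequency loss.
    """
    # Temporal loss: mean squared error in the time domain
    loss_tmp = ((outputs - batch_y) ** 2).mean()
    # Frequency loss: magnitude of the complex difference between rFFTs
    # taken along the time axis (dim=1), averaged over all entries
    diff = torch.fft.rfft(outputs, dim=1) - torch.fft.rfft(batch_y, dim=1)
    loss_feq = diff.abs().mean()
    return (1 - alpha) * loss_tmp + alpha * loss_feq


# Toy usage: batch of 8 series, horizon 96, 7 channels
pred = torch.randn(8, 96, 7)
true = torch.randn(8, 96, 7)
loss = fredf_loss(pred, true, alpha=0.5)
loss.backward() if pred.requires_grad else None  # backprop as usual when attached to a model
```

Setting `alpha=1.0` reproduces the standalone frequency loss from the snippet above.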
- Install Python 3.8 and PyTorch 1.8. For convenience, execute the following command.
pip install -r requirements.txt
- Prepare the data. You can obtain the well-preprocessed datasets from [Google Drive] or [Baidu Drive], then place the downloaded data in the folder `./dataset`. Here is a summary of supported datasets.
- Train and evaluate the model. We provide experiment scripts for all benchmarks under the folder `./scripts/`. You can reproduce the experiment results as in the following examples:
# long-term forecast
bash ./scripts/fredf_exp/ltf_overall/ETTh1_script/iTransformer.sh
# short-term forecast
bash ./scripts/fredf_exp/stf_overall/FreTS_M4.sh
# imputation
bash ./scripts/fredf_exp/imp_autoencoder/ETTh1_script/iTransformer.sh
- Apply FreDF to your own model:
  - Add the model file to the folder `./models`. You can follow `./models/iTransformer.py`.
  - Include the newly added model in the `Exp_Basic.model_dict` of `./exp/exp_basic.py`.
  - Create the corresponding scripts under the folder `./scripts`. You can follow `./scripts/fredf_exp/ltf_overall/ETTh1_script/iTransformer.sh`.
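The registration step can be sketched as below. `MyModel` and the dictionary contents are hypothetical stand-ins for illustration; in the repo, the real mapping lives in `Exp_Basic.model_dict` inside `./exp/exp_basic.py`, and your model file would mirror `./models/iTransformer.py`.

```python
import torch
import torch.nn as nn


# Hypothetical minimal forecasting model; a real entry would live at
# ./models/MyModel.py and follow the interface of ./models/iTransformer.py.
class MyModel(nn.Module):
    def __init__(self, seq_len: int = 96, pred_len: int = 96):
        super().__init__()
        # Project the time axis from seq_len to pred_len, channel-wise
        self.proj = nn.Linear(seq_len, pred_len)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [batch, seq_len, channels] -> [batch, pred_len, channels]
        return self.proj(x.transpose(1, 2)).transpose(1, 2)


# Illustrative registry mimicking Exp_Basic.model_dict: the experiment
# class looks the model class up by name and instantiates it.
model_dict = {
    "MyModel": MyModel,
}

model = model_dict["MyModel"](seq_len=96, pred_len=192)
out = model(torch.randn(4, 96, 7))  # out has shape [4, 192, 7]
```

Once registered this way, the training scripts select the model via its name string, so no other pipeline code needs to change.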
The paper introducing FreDF was published at ICLR 2025. If you use FreDF in your work, please consider citing it as below and 🌟starring this repository to help others notice this library. 🤗
@inproceedings{wang2025fredf,
title = {FreDF: Learning to Forecast in the Frequency Domain},
author = {Wang, Hao and Pan, Licheng and Chen, Zhichao and Yang, Degui and Zhang, Sen and Yang, Yifei and Liu, Xinggao and Li, Haoxuan and Tao, Dacheng},
booktitle = {ICLR},
year = {2025},
}
This library is constructed mainly on top of the following repo, reusing its training/evaluation pipelines and baseline model implementations:
- Time-Series-Library: https://github.com/thuml/Time-Series-Library.
All the experiment datasets are public, and we obtain them from the following links:
- Long-term Forecasting and Imputation: https://github.com/thuml/Autoformer.
- Short-term Forecasting: https://github.com/ServiceNow/N-BEATS.