swift

An Autoregressive Consistency Model for Efficient Weather Forecasting


[animation] Q700 6h forecast initialized 2020-08-22T06

Setup

Dataset: downsampled ERA5 data from WeatherBench2 at 1.40625° spatial resolution (128 × 256 pixels) is required, stored as per-sample h5 files. See the paper for data specifics.
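
To sanity-check the data layout after preprocessing, a file can be inspected with h5py. This is a minimal sketch: the path below is a placeholder, and the dataset keys are whatever the preprocessing produced, not a schema documented here.

# inspect one per-sample h5 file (path is a placeholder)
import h5py

def show(name, obj):
    # print shape and dtype for every dataset in the file
    if isinstance(obj, h5py.Dataset):
        print(f"{name}: shape={obj.shape} dtype={obj.dtype}")

with h5py.File("data/era5/sample.h5", "r") as f:
    f.visititems(show)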

Checkpoints: model weights and configs for our Swift and Diffusion models are available on Hugging Face, along with sample data.
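
The weights can also be pulled programmatically with huggingface_hub; the repo_id below is a placeholder, so check the Hugging Face page for the actual repository name.

# download weights and configs (repo_id is a placeholder)
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="<org>/swift")
print(f"downloaded to {local_dir}")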

Environment: get started by cloning the repo and installing swift into a virtual environment.

# load conda env
module load frameworks

# create venv and install library
cd swift
python3 -m venv venv --system-site-packages
source venv/bin/activate
python3 -m pip install --require-virtualenv -e ".[dev]"
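
As a quick sanity check, the swift package should now be importable from within the venv (assuming PyTorch is pulled in as a dependency):

# verify the editable install from within the venv
import swift
import torch

print("swift:", swift.__file__)
print("torch:", torch.__version__)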

Training

Bash scripts to submit distributed PBS jobs can be found under scripts/, with scripts/chain-resume.sh being the entry point to chain together and resume training runs on Aurora. These call train.py with the Hydra experiment configs (where hyperparameters are set) from the configs directory. For pretraining, we can do:

# Diffusion (trigflow)
bash chain-resume.sh -s 0 -n 5 -b 1 -e era5-swinv2-1.4-trigflow
# Swift-B (scm)
bash chain-resume.sh -s 0 -n 7 -b 1 -e era5-swinv2-1.4-scm
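
To preview what an experiment config resolves to without submitting a job, Hydra's compose API can be used. This is a sketch only: the top-level config name and the experiment override below are assumptions about the configs layout, not verified against the repo.

# print the resolved config (config_name and override are assumptions)
from hydra import compose, initialize
from omegaconf import OmegaConf

with initialize(version_base=None, config_path="configs"):
    cfg = compose(config_name="config", overrides=["experiment=era5-swinv2-1.4-scm"])
    print(OmegaConf.to_yaml(cfg))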

For finetuning, we need to modify scripts/aurora-general.sh to include the finetune=multistep Hydra argument and set the desired intervals in the multistep.yaml config, e.g.,

finetune:
  intervals: [
    {steps: 1, kimg: 1500},
    # {steps: 2, kimg: 1500},
    # {steps: 3, kimg: 1000},
    # {steps: 4, kimg: 500},
    # {steps: 8, kimg: 500},
  ]
  name: multistep
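
Each interval entry presumably finetunes with steps-step rollouts for the given kimg (thousands of samples, by the usual convention); uncommenting the later entries extends the curriculum to longer rollouts. Purely as an illustration of the schedule's bookkeeping, the full curriculum above amounts to:

# tally the full multistep curriculum (illustrative only)
intervals = [
    {"steps": 1, "kimg": 1500},
    {"steps": 2, "kimg": 1500},
    {"steps": 3, "kimg": 1000},
    {"steps": 4, "kimg": 500},
    {"steps": 8, "kimg": 500},
]
for i in intervals:
    print(f"{i['steps']}-step rollouts for {i['kimg']} kimg")
print(f"total: {sum(i['kimg'] for i in intervals)} kimg")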

Thereafter, we can resume from the correct, subsequent resume ID:

# Swift
bash chain-resume.sh -s 8 -n 1 -b 1 -e era5-swinv2-1.4-scm

Inference

For simplicity, we run inference on one or more compute nodes by calling generate.py within our virtual environment. It's important to first initialize ezpz. For example, to generate 12 members with 64 initial conditions for 15 days at 6h intervals:

# init venv
module load frameworks
source venv/bin/activate

source <(curl -s https://raw.githubusercontent.com/saforem2/ezpz/refs/heads/main/src/ezpz/bin/utils.sh)
ezpz_setup_env

# run generation
launch python -m swift.generate \
  --input results/era5-swinv2-1.4-scm/011 \
  --checkpoint checkpoint-020000 \
  --members 12 \
  --steps 60 \
  --samples 64 \
  --interval 6
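
As a check on the arguments: 60 autoregressive steps at a 6h interval spans 360 hours, i.e. the 15-day forecast, and 12 members over 64 initial conditions gives 768 trajectories in total.

# sanity-check the generation arguments above
members, samples, steps, interval_h = 12, 64, 60, 6

print(f"forecast length: {steps * interval_h // 24} days")  # 15
print(f"total trajectories: {members * samples}")           # 768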

BibTeX

If you find this repo useful in your research, please consider citing our paper:

@misc{stock2025swift,
  title         = {Swift: An Autoregressive Consistency Model for Efficient Weather Forecasting},
  author        = {Stock, Jason and Arcomano, Troy and Kotamarthi, Rao},
  year          = {2025},
  eprint        = {2509.25631},
  archivePrefix = {arXiv},
  primaryClass  = {cs.LG},
  url           = {https://arxiv.org/abs/2509.25631}
}