
Hasaformer

The implementation of Hybrid Adaptive Self-Attention coupled with Differential Excitation for Time Series Anomaly Detection.

==========================================================

The code will be released after the paper is accepted.

==========================================================

Introduction

None

Model Overview

(figure: Hasaformer model architecture)

Getting Started

To clone this repo:

git clone https://github.com/qiumiao30/Hasaformer.git && cd Hasaformer

1. Get the data

2. Install dependencies (virtualenv recommended)

  • python>=3.7
  • torch>=1.9
pip install -r requirements.txt
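If you use a virtualenv, a typical setup looks like this (the environment name .venv is only an example):

python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt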

3. Preprocess the dataset

python data_preprocess.py --dataset $dataset_name$

$dataset_name$ is one of SWaT, WADI, SMD, PSM, etc.

for example:

python data_preprocess.py --dataset swat
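As a rough sketch of what a preprocessing step like this usually does (the real data_preprocess.py may differ; all paths and column names here are hypothetical):

import numpy as np
import pandas as pd

# Hypothetical file layout: raw CSVs, with a binary "label" column in the test split.
train = pd.read_csv("data/swat/train.csv")
test = pd.read_csv("data/swat/test.csv")
labels = test.pop("label").values  # 0 = normal, 1 = anomaly

# Min-max scale each feature, using statistics from the training split only.
lo, hi = train.min(), train.max()
train = (train - lo) / (hi - lo + 1e-8)
test = (test - lo) / (hi - lo + 1e-8)

np.save("processed/swat_train.npy", train.values)
np.save("processed/swat_test.npy", test.values)
np.save("processed/swat_labels.npy", labels)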

4. Params

  • --dataset : dataset name, default "swat".
  • --lookback : sliding window size, default 10.
  • --normalize : whether to normalize the data, default True.
  • --epochs : number of training epochs, default 10.
  • --bs : batch size, default 256.
  • --init_lr : initial learning rate, default 1e-3.
  • --val_split : validation split ratio, default 0.1.
  • --dropout : dropout rate, default 0.3.
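As an illustrative sketch only (train.py may declare its options differently), the flags above map naturally onto argparse, and --lookback determines how many past time steps form one input window:

import argparse

import numpy as np

parser = argparse.ArgumentParser(description="Hasaformer training options (sketch)")
parser.add_argument("--dataset", type=str, default="swat")
parser.add_argument("--lookback", type=int, default=10, help="sliding window size")
# argparse's type=bool is a pitfall (bool("False") is True), so parse explicitly.
parser.add_argument("--normalize", type=lambda s: str(s).lower() == "true", default=True)
parser.add_argument("--epochs", type=int, default=10)
parser.add_argument("--bs", type=int, default=256, help="batch size")
parser.add_argument("--init_lr", type=float, default=1e-3)
parser.add_argument("--val_split", type=float, default=0.1)
parser.add_argument("--dropout", type=float, default=0.3)
args = parser.parse_args()

def make_windows(x, lookback):
    # (T, n_features) -> (T - lookback + 1, lookback, n_features) sliding windows
    return np.stack([x[i : i + lookback] for i in range(len(x) - lookback + 1)])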

5. Run

python train.py --$param$ $value$ --$param$ $value$ ...
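
for example, using the parameters listed above:

python train.py --dataset swat --lookback 10 --bs 256 --init_lr 1e-3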

6. Visualization

(figure: visualization of anomaly detection results)