/DAFA-LS

Official repository of DAFA-LS, a dataset of satellite image time series for the task of archaeological looting detection.

Primary language: Python · License: MIT

Detecting Looted Archaeological Sites from Satellite Image Time Series

Elliot Vincent, Mehraïl Saroufim, Jonathan Chemla,
Yves Ubelmann, Philippe Marquis, Jean Ponce, Mathieu Aubry

Official PyTorch implementation of Detecting Looted Archaeological Sites from Satellite Image Time Series. Check out our webpage for more details!

We introduce the DAFA Looted Sites dataset (DAFA-LS), a labeled multi-temporal remote sensing dataset containing 55,480 images acquired monthly over 8 years across 675 Afghan archaeological sites, including 135 sites looted during the acquisition period. DAFA-LS is an interesting playground to assess the performance of satellite image time series (SITS) classification methods on a real and important use case.
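A quick back-of-the-envelope check makes these statistics concrete. The figures below come from the description above; the per-site average is approximate, since not every site has an image for every month of the 8-year window:

```python
# Sanity check of the dataset statistics quoted above.
n_images = 55_480       # total images in DAFA-LS
n_sites = 675           # Afghan archaeological sites
n_looted = 135          # sites looted during the acquisition period
months_covered = 8 * 12  # monthly acquisitions over 8 years

avg_series_length = n_images / n_sites  # average time-series length per site
looted_fraction = n_looted / n_sites    # positive-class ratio

print(f"~{avg_series_length:.0f} images per site "
      f"(out of at most {months_covered} monthly acquisitions)")
print(f"{looted_fraction:.0%} of sites are looted")
```

So each site comes with roughly 82 monthly images, and the looted/intact split is an imbalanced 1:4.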


If you find this code useful, don't forget to star the repo ⭐.

Installation ⚙️

1. Clone the repository in recursive mode

git clone git@github.com:ElliotVincent/DAFA-LS.git --recursive

2. Download the datasets

You can download the datasets using the code below or by following this link (426M).

cd DAFA-LS
mkdir datasets
cd datasets
gdown 16v7_AcRwNeRhCacmQuX2477VYs51f4fU
unzip DAFA_LS.zip

3. Download pretrained weights for DOFA [1]

cd ..
mkdir weights
cd weights
wget "https://huggingface.co/XShadow/DOFA/resolve/main/DOFA_ViT_base_e100.pth?download=true" -O DOFA_ViT_base_e100.pth

4. Create and activate virtual environment

python3 -m venv dafals
source dafals/bin/activate
python3 -m pip install -r requirements.txt
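After installing, a quick import check catches a broken environment before a long training run. This is an optional sketch; `torch` is the only dependency this README names explicitly, so extend the list with entries from `requirements.txt` as needed:

```python
from importlib.util import find_spec

def missing_packages(names):
    """Return the subset of `names` that cannot be found by the importer."""
    return [n for n in names if find_spec(n) is None]

# torch is the one dependency named in this README; add others as needed.
# missing = missing_packages(["torch"])
# if missing: print("missing packages:", missing)
```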

This implementation uses PyTorch.

How to use 🚀

If you are using the repository for the first time, please create a results folder:

mkdir results

Now you can run the following command, replacing <config_name> with one of resnet [2], dofa [1], ltae [3], tempcnn [4], duplo [5], transformer [6], utae [7], tsvit_cls [8], tsvit_seg [8], pse_ltae [9] or dofa_ltae [1,3], and <exp_name> with an experiment name of your choice. Output files will be located at results/<exp_name>/.

PYTHONPATH=$PYTHONPATH:./src python src/trainer.py -t <exp_name> -c <config_name>.yaml
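To benchmark every baseline in one go, a small driver can wrap the command above. This is a convenience sketch, not part of the repository: the `bench_` experiment prefix is arbitrary, while the config names and the `PYTHONPATH` handling mirror the command above:

```python
import os
import subprocess

# Config names listed above, one per baseline in the bibliography.
CONFIGS = ["resnet", "dofa", "ltae", "tempcnn", "duplo", "transformer",
           "utae", "tsvit_cls", "tsvit_seg", "pse_ltae", "dofa_ltae"]

def build_commands(exp_prefix="bench"):
    """One trainer invocation per config, mirroring the command above."""
    return [["python", "src/trainer.py",
             "-t", f"{exp_prefix}_{name}",
             "-c", f"{name}.yaml"] for name in CONFIGS]

def run_all(exp_prefix="bench"):
    """Launch each run sequentially, with ./src on PYTHONPATH."""
    env = dict(os.environ)
    env["PYTHONPATH"] = env.get("PYTHONPATH", "") + ":./src"
    for cmd in build_commands(exp_prefix):
        subprocess.run(cmd, check=True, env=env)
```

Each run writes to its own `results/<exp_prefix>_<config_name>/` folder, so the sweeps do not clobber each other.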

Citing

If you use our work in your project, please cite:

@article{vincent2024detecting,
    title = {Detecting Looted Archaeological Sites from Satellite Image Time Series},
    author = {Vincent, Elliot and Saroufim, Mehraïl and Chemla, Jonathan and Ubelmann, Yves and Marquis, Philippe and Ponce, Jean and Aubry, Mathieu},
    journal = {arXiv},
    year = {2024},
  }

And if you use our dataset, please give proper attribution to Planet Labs:

@article{planet2024planet,
    author={{Planet Team}},
    title={{Planet Application Program Interface: In Space for Life on Earth (San Francisco, CA)}},
    journal={\url{https://api.planet.com}},
    year={2024}
}

Bibliography

[1] Z. Xiong et al. Neural plasticity-inspired foundation model for observing the Earth crossing modalities. (2024)
[2] K. He et al. Deep residual learning for image recognition. (2016)
[3] V. S. F. Garnot et al. Lightweight temporal self-attention for classifying satellite images time series. (2020)
[4] C. Pelletier et al. Temporal convolutional neural network for the classification of satellite image time series. (2019)
[5] R. Interdonato et al. Duplo: A dual view point deep learning architecture for time series classification. (2019)
[6] M. Rußwurm et al. Self-attention for raw optical satellite time series classification. (2020)
[7] V. S. F. Garnot et al. Panoptic segmentation of satellite image time series with convolutional temporal attention networks. (2021)
[8] M. Tarasiou et al. ViTs for SITS: Vision transformers for satellite image time series. (2023)
[9] V. S. F. Garnot et al. Satellite image time series classification with pixel-set encoders and temporal self-attention. (2020)