Spatial Nonstationarity Mitigation with Vision Transformer

πŸ“– Table of Contents

  1. ➀ About The Project
  2. ➀ Prerequisites
  3. ➀ Folder Structure
  4. ➀ Dataset
  5. ➀ Roadmap
  6. ➀ Acknowledgements
  7. ➀ Contributors

-----------------------------------------------------

πŸ“ About The Project

  • Spatial nonstationarity, ubiquitous in spatial settings, is generally not accounted for, let alone mitigated, in deep learning (DL) applications for spatial phenomena.

  • We design a workflow to explore the impacts of nonstationarity on DL prediction performance:

    1. Train the DL model with stationary SGS realizations of the property of interest (we use porosity as an example), labeled with the variogram range.
    2. Test the DL prediction performance on nonstationary realizations to investigate the impacts of nonstationarity.
  • The benchmark results are obtained by training a convolutional neural network (CNN) model, which is commonly used for computer vision (CV) tasks due to its strength in learning spatial hierarchies of features.

  • We then explore Vision Transformer (ViT) and Swin Transformer (SwinT) models for spatial nonstationarity mitigation. The original ViT and SwinT architectures are modified for the predictive (regression) tasks.

  • We find that self-attention networks can effectively help mitigate spatial nonstationarity.

-----------------------------------------------------

🍴 Prerequisites

Made with Python
Made with Jupyter

The following open source packages are mainly used in this project:

  • Numpy
  • Pandas
  • Matplotlib
  • Scikit-Learn
  • GSLIB
  • TensorFlow
  • Keras
  • PyTorch

Please also install any other required packages that are missing (see detailed package uses in the Jupyter notebooks).

-----------------------------------------------------

🌡 Folder Structure

code
.
β”œβ”€β”€ Data Preparation.ipynb
β”œβ”€β”€ train_CNN.ipynb
β”œβ”€β”€ train_vision_transformers.ipynb  
β”œβ”€β”€ SwinT.py    
β”œβ”€β”€ ViT.py

-----------------------------------------------------

πŸ’Ύ Dataset

Data Preparation.ipynb walks through the generation of the training and testing data.

  • All data are generated using sequential Gaussian simulation (SGS).

  • This notebook shows how to generate SGS realizations using GSLIB-related packages. A minimal sketch of the expected data format follows this list.
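
The sketch below illustrates only the expected data format (2D realizations labeled with their variogram range). The realization generator here is a random stand-in, not SGS, and the grid size, sample count, and range span are assumptions for illustration.

```python
import numpy as np

# Hypothetical stand-in for the SGS generator: the actual realizations come from
# GSLIB-related packages in Data Preparation.ipynb. This only mimics the array
# format (one 2D porosity-like field per sample), not the geostatistics.
def fake_realization(nx=64, ny=64, seed=None):
    rng = np.random.default_rng(seed)
    return rng.standard_normal((ny, nx)).astype("float32")

n_samples = 100                                    # assumed dataset size
ranges = np.random.uniform(10.0, 60.0, n_samples)  # assumed variogram-range span (grid cells)

# Images: (N, 64, 64, 1); labels: (N,) variogram ranges used as regression targets.
X = np.stack([fake_realization(seed=i) for i in range(n_samples)])[..., np.newaxis]
y = ranges.astype("float32")
print(X.shape, y.shape)                            # (100, 64, 64, 1) (100,)
```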

-----------------------------------------------------

🎯 Roadmap

The goals of this project include the following:

  1. Prepare the training data >>> stationary SGS realizations (see Data Preparation.ipynb).

  2. Train the CNN model (see train_CNN.ipynb) and the ViT and SwinT models (see train_vision_transformers.ipynb) with the training data. After the models are fully trained, test their prediction performance with the testing data (nonstationary realizations).

    • The CNN model is implemented with TensorFlow. train_CNN.ipynb shows how to create your own CNN model and train it with your training data; a minimal Keras-style sketch is shown after this list.

    • The vision transformers are implemented using PyTorch. train_vision_transformers.ipynb demonstrates how to load the ViT/SwinT architectures (ViT.py & SwinT.py) and how to train the ViT/SwinT models; a generic training-loop sketch is shown after this list. To visualize the training progress, you can couple the training loop with a TensorBoard summary writer or a wandb logger, as you prefer.

    • Generally, DL models need to be trained on a large amount of training data. For easier demonstration, we randomly create a single sample (training data size = 1) for both training and validation. In practical usage, you should train the models with the data generated in Data Preparation.ipynb.
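
As a reference for the CNN benchmark, here is a minimal Keras regression sketch. The layer widths, input shape, and hyperparameters are illustrative assumptions; the actual architecture lives in train_CNN.ipynb.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Minimal CNN regressor sketch (illustrative only): the exact architecture,
# input size, and hyperparameters in train_CNN.ipynb may differ.
def build_cnn(input_shape=(64, 64, 1)):
    model = keras.Sequential([
        keras.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1),            # predict the variogram range (regression)
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

model = build_cnn()
model.summary()
# model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=50)
```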

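And here is a generic PyTorch training loop for the regression task. The stand-in model and hyperparameters are assumptions; in practice, instantiate the architectures from ViT.py / SwinT.py as shown in train_vision_transformers.ipynb (their class names and constructor arguments may differ from this example).

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Tiny stand-in model: replace with the ViT/SwinT architecture loaded from
# ViT.py / SwinT.py when training for real.
model = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 128), nn.ReLU(), nn.Linear(128, 1))

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Dummy single-sample data, mirroring the "training data size = 1" demo above.
images = torch.randn(1, 1, 64, 64)   # (N, C, H, W)
labels = torch.randn(1, 1)           # variogram-range targets
loader = DataLoader(TensorDataset(images, labels), batch_size=1)

for epoch in range(10):
    model.train()
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss={loss.item():.4f}")
    # Optionally log the loss with torch.utils.tensorboard.SummaryWriter or wandb.
```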
-----------------------------------------------------

πŸ“œ Acknowledgements

This work is supported by the Digital Reservoir Characterization Technology (DIRECT) Industry Affiliate Program at the University of Texas at Austin.

-----------------------------------------------------

πŸ“œ Contributors

πŸ‘©β€πŸŽ“: Lei Liu
      Email: leiliu@utexas.edu
      GitHub: @LeiLiu

πŸ‘¨β€πŸ’»: Javier E. Santos
      Email: jesantos@lanl.gov
      GitHub: @je-santos

πŸ‘©β€πŸ«: MaΕ‘a ProdanoviΔ‡
      Email: masha@utexas.edu
      Personal Website: @ProdanoviΔ‡

πŸ‘¨β€πŸ«: Michael J. Pyrcz
      Email: mpyrcz@austin.utexas.edu
      GitHub: @GeostatsGuy