This repository provides the code necessary to reproduce the results from my Master's thesis report. For any additional questions, send me an email: fadel.seydou@gmail.com.
- Open a terminal on Linux or an Anaconda command prompt on Windows
- Clone this repository
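  For example (the clone URL is a placeholder; use the one shown on this repository's page):
  ```
  git clone <repository-url>
  ```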
- Move to the repository:
  ```
  cd ./hierarchicalclassification
  ```
- Create a virtual environment:
  ```
  python -m venv .venv
  ```
- Activate the virtual environment
  - For Linux:
    ```
    source .venv/bin/activate
    ```
  - For Windows:
    ```
    .venv\Scripts\activate.bat
    ```
- Install dependencies:
  ```
  pip install -r requirements.txt
  ```
- Open the Jupyter notebook `./src/data_exploration.ipynb`. It explains the general workflow and the hypotheses behind the data preparation.
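  One way to launch it from the repository root (a sketch, assuming Jupyter is installed with the requirements):
  ```
  jupyter notebook ./src/data_exploration.ipynb
  ```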
- Download the following files and move them to `./data`
- Move to the directory `./src`
- Run
  ```
  python datapreparation.py
  ```
  to create the files needed for downloading the data
- Initialize the Google Earth Engine API: more here
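  If the `earthengine-api` package is available (an assumption; it may already be listed in requirements.txt), authentication is typically done through its CLI:
  ```
  earthengine authenticate
  ```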
- Run
  ```
  bash download_data.sh
  ```
- In the terminal, move to `./src`
- Create a free account on Weights & Biases to log training metrics
- Initialize Weights & Biases in this working directory as follows:
  ```
  wandb login
  ```
  and provide your API key.
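  (The key can typically be found in your Weights & Biases account settings, e.g. at https://wandb.ai/authorize.)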
- Open `train.sh` and update the parameters. All parameters are explained in `args.py`
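  For illustration only, a script like `train.sh` usually wraps the training entry point (assumed here to be `train.py`) with command-line flags; the flag names below are hypothetical, and the real ones are documented in `args.py`:
  ```
  # Hypothetical sketch; see args.py for the actual parameter names
  python train.py \
      --epochs 50 \
      --batch_size 32 \
      --learning_rate 1e-3
  ```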
- Run
  ```
  bash train.sh
  ```