Note: This is a work in progress.

This project seeks to compress SNPs (single-nucleotide polymorphisms) using deep neural networks.

TODO:
- Generalize the data-read function to include xarray metadata.
- Reformat the dataset into .zarr (started); see the conversion sketch after this list.
- Create code for running the forward pass with metadata; see the forward-pass sketch after this list.
- Get Doug a txt file with chr 1, 2, and 6 as an N part.
- Run the model to test speed.
- Check the profiler.
- Run without the profiler.
- Set up a sweep.
- Run the sweep.
- Select the best model.
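
For the .zarr conversion task, here is a minimal sketch of what src/data/plink_to_dask.py could do, assuming pandas-plink is used to read the PLINK files. The input path, output path, and chunk sizes are illustrative assumptions, not the project's actual settings.

```python
"""Sketch: read PLINK .bed/.bim/.fam into a dask-backed xarray.DataArray
and write it to a .zarr store. Paths and chunk sizes are placeholders."""
from pandas_plink import read_plink1_bin

# read_plink1_bin returns a dask-backed xarray.DataArray with dims
# ("sample", "variant"); the .bim/.fam metadata (chrom, pos, alleles,
# sample IDs) is attached as coordinates
G = read_plink1_bin("data/raw/chr1.bed", verbose=False)

# rechunk so each zarr chunk holds a manageable samples x variants block
G = G.chunk({"sample": 1024, "variant": 8192})

# wrap in a Dataset so the coordinate metadata travels with the genotypes;
# string coordinates are stored with zarr's variable-length string codec
G.to_dataset(name="genotype").to_zarr("data/interim/chr1.zarr", mode="w")
```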
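And a minimal sketch of a forward pass over the .zarr store that keeps the xarray metadata next to the tensors. The toy autoencoder, batch size, and coordinate names (e.g. chrom) are assumptions, not the project's actual model.

```python
"""Sketch: load a batch from the .zarr genotypes, run it through a toy
autoencoder, and keep the xarray metadata available alongside the tensors."""
import torch
import torch.nn as nn
import xarray as xr

ds = xr.open_zarr("data/interim/chr1.zarr")        # lazily opens the store
block = ds["genotype"].isel(sample=slice(0, 256))  # one batch of samples
x = torch.from_numpy(block.values).float()         # shape (256, n_variants)
x = torch.nan_to_num(x)                            # missing genotypes -> 0

n_variants = x.shape[1]
model = nn.Sequential(                             # toy autoencoder: compress
    nn.Linear(n_variants, 128), nn.ReLU(),         # variants to a 128-d code
    nn.Linear(128, n_variants),                    # and reconstruct them
)
recon = model(x)

# the metadata never leaves `block`, so it can be carried with predictions
print(recon.shape, block.coords["chrom"].values[:5])
```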
Project organization:

```
├── LICENSE
├── README.md          <- The top-level README for developers using this project.
├── data
│   └── interim        <- The .zarr files created using src/data/plink_to_dask.py.
│
├── models             <- Trained and serialized models, model predictions, or model summaries.
│
├── references         <- Data dictionaries, manuals, and all other explanatory materials.
│
├── reports            <- Generated analysis as HTML, PDF, LaTeX, etc.
│   ├── slurm-output   <- Slurm .out files.
│   └── figures        <- Generated graphics and figures to be used in reporting.
│
├── requirements.txt   <- The requirements file for reproducing the analysis environment.
│
└── src                <- Source code for use in this project.
    ├── __init__.py    <- Makes src a Python module.
    │
    ├── data           <- Scripts to download or generate data.
    │
    └── models         <- Scripts to train models and then use trained models
        │                 to make predictions.
        ├── predict_model.py
        └── train_model.py
```