Repository for the automatic inference of models from raw floor plans
- Python >= 3.10.5
- Google Cloud CLI
- Install the dev requirements: `pip install -r requirements.txt`
- Log in to gcloud to make sure you have access to the DVC remote repository: `gcloud auth application-default login`
- Pull the data (optional): `dvc pull` (these steps are combined in the sketch below)
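Put together, a first-time setup of the steps above looks like this:

```bash
# One-time local setup
pip install -r requirements.txt          # dev requirements
gcloud auth application-default login    # grants access to the DVC remote
dvc pull                                 # optional: pull the DVC-tracked data
```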
- Place the GitHub deploy key in `docker/secrets/github.key` and make sure it has the right permissions: `chmod 400 docker/secrets/github.key` (see the sketch after this list)
- Place the gcloud service account credentials in `docker/secrets/gce_service_account_credentials.json`
- You'll need a file named after the variable `ML_IMAGES_BUCKET_CREDENTIALS_FILE` set in `docker/.env`. This file should contain the credentials for the service account that has access to the images bucket.
- You'll need the file `microsoft-identity-association.json` placed in `demo/ui/` (ask a fellow developer for this file). It is needed to authenticate with Auth0; otherwise the build will fail.
- Download the necessary resources: `make update_resources`
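As a rough sketch, the secrets setup above could look like this (the source paths for the key and credentials are placeholders you need to supply yourself):

```bash
# Place the deploy key and service account credentials (source paths are placeholders)
mkdir -p docker/secrets
cp /path/to/your/github_deploy_key docker/secrets/github.key
chmod 400 docker/secrets/github.key
cp /path/to/your/service_account.json docker/secrets/gce_service_account_credentials.json
# Also provide the images-bucket credentials file named by ML_IMAGES_BUCKET_CREDENTIALS_FILE
# in docker/.env, and place microsoft-identity-association.json under demo/ui/.

# Fetch the remaining resources
make update_resources
```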
- Install Node 16.18.0, for example with nvm: `nvm use 16.18.0`
- Install the dev requirements: `make install_demo_ui_requirements`
- Run it locally: `make run_demo_ui_locally` (these steps are combined in the sketch below)
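Combined, running the demo UI locally looks like this (assuming nvm is already installed and the Node version is available):

```bash
# Use the pinned Node version, install the UI requirements, then start the UI
# nvm install 16.18.0   # only needed if the version is not installed yet
nvm use 16.18.0
make install_demo_ui_requirements
make run_demo_ui_locally
```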
- Download the necessary resources: `make update_resources`
- Build and run the workers, api and router images (the router may need to be commented out if you are running the UI locally), as sketched below: `make docker_build` and then `make docker_up`
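Put together, with the caveat about the router noted above:

```bash
# Fetch resources, then build and start the docker services
make update_resources
make docker_build
make docker_up
```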
Authentication is set up with Auth0. For authentication to work, copy the `.env.sample` file located in the `demo/ui` directory to `.env` and fill it with the proper values.
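For example (a sketch; the actual variable names and values come from `demo/ui/.env.sample` and are not reproduced here):

```bash
# Create the env file from the sample, then fill in the Auth0 values by hand
cp demo/ui/.env.sample demo/ui/.env
"${EDITOR:-vi}" demo/ui/.env
```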
Basic tests can be run with `make tests_demo_ui`.
- Make your changes to the entrypoint, etc.
- Bump `DVC_IMAGE_VERSION` in `.env`
- Run `make dvc_detectron_docker_push` (see the sketch below)
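A rough sketch of the release flow (the version bump itself is done by hand; the values are illustrative):

```bash
# 1. Edit the entrypoint etc. as needed
# 2. Bump DVC_IMAGE_VERSION in .env (e.g. 0.1.0 -> 0.1.1, illustrative values)
# 3. Build and push the updated image
make dvc_detectron_docker_push
```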
Run the makefile recipe `make remote_training` and you will be prompted for the relevant parameters.
Alternatively, you can use the Vertex AI interface by creating a new training job; choose the `dvc_detectron` image as the custom container.
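For the makefile route, the invocation is simply:

```bash
# Interactive remote training; the recipe prompts for the relevant parameters
make remote_training
```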
Once the training is completed, a new branch will be created (with a currently quite cryptic branch name) and the experiment will be available on DagsHub. For changing parameters there are two options:
- Change the config of the model under `conf/detectron2/remote.yaml`
- Commit your changes and push to GitHub
- On Vertex AI, run with the flag `--train <COMMIT_HASH>`

On Vertex AI, you can provide the parameters you want to override via the `-S` flag; see Hydra Parameter Override and the DVC documentation. For example:

--train_detectron <BASE_COMMIT_HASH>
-S "conf/detectron2/remote.yaml:SOLVER.MAX_ITER=100"
-S "conf/detectron2/remote.yaml:+INPUT.MAX_SIZE_TRAIN=800"
-S "conf/detectron2/remote.yaml:~DATALOADER.SAMPLER_TRAIN"
- Change the config under `conf/dataset/default.yaml`
- Commit your changes and push to GitHub
- On Vertex AI / locally, run with the flag `--dataset <COMMIT_HASH>` (see below for reading off the commit hash)
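To get the value to pass as `<COMMIT_HASH>` / `<BASE_COMMIT_HASH>`, you can read it off the commit you just pushed:

```bash
# Hash of the currently checked-out commit (the one you pushed)
git rev-parse HEAD
git rev-parse --short HEAD   # short form, if a short hash is accepted
```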
In order to run the GitHub Actions, a few secrets need to be set:

`ML_IMAGES_SA_BASE64`: base64-encoded service account credentials for the ML images project. To generate it, take the service account file (JSON) and run:
cat <FILE_NAME>.json | base64 -w 0 > temp.base_64
gh secret set ML_IMAGES_SA_BASE64 < temp.base_64
rm temp.base_64
This uses the GitHub CLI, but you can also set the secret manually on the GitHub secrets page.
`DEMO_UI_ENV_BASE64`: similar to the previous step, we need the `.env` file located in the `demo/ui` directory. Then we can run:
cat demo/ui/.env | base64 -w 0 > temp.base_64
gh secret set DEMO_UI_ENV_BASE64 < temp.base_64
rm temp.base_64
`MICROSOFT_IDENTITY_BASE64`: similar to the previous steps, we need the `microsoft-identity-association.json` file located in the `demo/ui` directory. Then we can run:
cat demo/ui/microsoft-identity-association.json | base64 -w 0 > temp.base_64
gh secret set MICROSOFT_IDENTITY_BASE64 < temp.base_64
rm temp.base_64
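Since all three secrets follow the same encode-and-set pattern, a small helper can set them in one go (a sketch using the same `base64` and `gh` commands as above; the service account path is a placeholder):

```bash
#!/usr/bin/env bash
set -euo pipefail

# Encode a file as base64 and store it as a GitHub Actions secret
set_b64_secret() {
  base64 -w 0 "$2" | gh secret set "$1"
}

set_b64_secret ML_IMAGES_SA_BASE64 /path/to/service_account.json   # placeholder path
set_b64_secret DEMO_UI_ENV_BASE64 demo/ui/.env
set_b64_secret MICROSOFT_IDENTITY_BASE64 demo/ui/microsoft-identity-association.json
```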