This setup runs Metaflow locally while using S3 as the datastore, so you can see flow artifacts in the Metaflow UI.
- Rename the file `.metaflow/config.template.json` to `.metaflow/config.json`. Inside the file, change the bucket name.
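As a reference, an S3-backed local setup typically uses configuration keys like the ones below. The bucket name and service URL here are placeholders, and the exact contents of `config.template.json` in this repo may differ:

```json
{
    "METAFLOW_DEFAULT_DATASTORE": "s3",
    "METAFLOW_DATASTORE_SYSROOT_S3": "s3://your-bucket-name/metaflow",
    "METAFLOW_DEFAULT_METADATA": "service",
    "METAFLOW_SERVICE_URL": "http://localhost:8080"
}
```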
- Be sure you have your AWS credentials at `~/.aws`. They will be copied inside the `metaflow-service` Docker container.
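If you don't have credentials set up yet, the standard AWS shared credentials file at `~/.aws/credentials` looks like this (the values below are placeholders):

```ini
[default]
aws_access_key_id = <YOUR_ACCESS_KEY_ID>
aws_secret_access_key = <YOUR_SECRET_ACCESS_KEY>
```

Running `aws configure` generates this file for you.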
- Clone netflix/metaflow-service and run it using Docker Compose:

```shell
git clone https://github.com/Netflix/metaflow-service.git
cd metaflow-service
AWS_PROFILE=default docker compose -f docker-compose.development.yml up
```
- Clone netflix/metaflow-ui and run it using Docker:

```shell
git clone https://github.com/Netflix/metaflow-ui.git
cd metaflow-ui
docker build --tag metaflow-ui:latest .
docker run -p 3000:3000 -e METAFLOW_SERVICE=http://localhost:8083/ metaflow-ui:latest
```

The Metaflow UI will be available at http://localhost:3000/.
- Run the flow:

```shell
python flows/minimum_flow.py run
```
Metaflow reads its configuration from these locations:

| File | Purpose |
| --- | --- |
| `~/.metaflowconfig/config.json` | Global config |
| `./.metaflow/config.json` | Project overrides |
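The precedence can be illustrated with a simple dictionary merge: project-level keys override global ones. This is an illustration of the override behavior, not Metaflow's actual loader code:

```python
def merge_config(global_cfg: dict, project_cfg: dict) -> dict:
    """Merge two config dicts; project-level keys win.

    Illustrative sketch of config precedence, not Metaflow's
    actual loader implementation.
    """
    merged = dict(global_cfg)      # start from the global config
    merged.update(project_cfg)     # project overrides take priority
    return merged


merged = merge_config(
    # Hypothetical global config (~/.metaflowconfig/config.json)
    {"METAFLOW_DEFAULT_DATASTORE": "local", "METAFLOW_DEFAULT_METADATA": "service"},
    # Hypothetical project override (./.metaflow/config.json)
    {"METAFLOW_DEFAULT_DATASTORE": "s3"},
)
# The project's "s3" datastore wins; the global metadata setting is kept.
```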