Machine learning often has to run on cloud or remote servers, so I put together this setup for quickly initializing such machines and packaged it all in Docker.
- First, clone this repository to your working machine and change into its directory. (See GitHub's guide on connecting with SSH if you haven't set that up.)
$ git clone git@github.com:AlekseevDanil/remote-ml.git
$ cd remote-ml
- Before starting the second step, make sure you have Docker and Docker Compose installed. Then proceed with the following commands.
You need to create an .env file that will hold all the environment variables we need.
$ touch .env
- Next, add the following to the .env file, changing the values to your own. Make sure the ports you specify are free.
POSTGRES_USER="airflow"
POSTGRES_PASSWORD="airflow"
POSTGRES_DB="airflow"
POSTGRES_HOST="postgres"
POSTGRES_PORT=5432
AIRFLOW_PORT=8080
JUPYTER_PORT=6060
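A quick way to confirm that the ports you picked are actually free is a plain-bash check (a sketch: it relies on bash's built-in `/dev/tcp`, where a failed connection means nothing is listening on that port):

```shell
# Reports whether anything is listening on the given local port.
# Uses bash's /dev/tcp pseudo-device; no extra tools required.
check_port() {
  if (echo > "/dev/tcp/127.0.0.1/$1") 2>/dev/null; then
    echo "port $1 is in use"
  else
    echo "port $1 is free"
  fi
}
check_port 8080   # AIRFLOW_PORT
check_port 6060   # JUPYTER_PORT
```

If a port comes back as "in use", pick another value in .env before continuing.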
- Now bring up the Docker containers.
--build rebuilds the images from their Dockerfiles
-d runs everything in the background (detached) instead of streaming logs to the terminal
$ docker-compose up --build -d
- The final installation step is to create a user for Airflow.
Open a shell inside the airflow container and run the user-creation command (substitute your own details):
$ docker exec -it airflow bash
$ airflow users create \
--username admin \
--password 12345 \
--firstname Admin \
--lastname Admin \
--role Admin \
--email admin@admin.com
$ exit
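The same step can also be scripted without an interactive shell. A minimal sketch, assuming the container is named `airflow` as above; the helper name `create_airflow_admin` is just for illustration:

```shell
# Creates an Airflow admin user with one non-interactive command from the host.
create_airflow_admin() {
  docker exec airflow airflow users create \
    --username "$1" \
    --password "$2" \
    --firstname Admin \
    --lastname Admin \
    --role Admin \
    --email "$3"
}
# Usage (after the containers are up):
#   create_airflow_admin admin 12345 admin@admin.com
```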
Congratulations to everyone who has made it this far! 🎉
If all the commands ran without errors, you can now open your_host:6060 in a browser and see the JupyterLab welcome screen.
To get the access token, run the command below in a terminal; near the top of the output you will see login URLs that contain the token we need.
$ docker logs jupyterlab
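If scrolling through the logs is inconvenient, the token can be pulled out directly. A sketch, assuming the container is named `jupyterlab` as above and that Jupyter prints URLs of the form `http://127.0.0.1:8888/lab?token=...`:

```shell
# Extracts the first token=... fragment from whatever is piped in.
extract_token() { grep -oE 'token=[0-9a-f]+' | head -n 1; }
# Usage (after the container is up):
#   docker logs jupyterlab 2>&1 | extract_token
```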
Copy the token value from one of those URLs (the part highlighted in the screenshot).
To open Airflow, go to your_host:8080 in your browser.
Then log in with the username and password you created earlier.