Just another small project - matching input records against results and storing the matches.
1) Install Docker + docker-compose
2) Prepare your database model
2.1) init_01_creation.sql : table creation scripts
2.2) init_02_dumping_data.sql : initial data-loading scripts
2.3) Place *.csv files in the sql_scripts folder (initial_sample_data.csv is provided as an example)
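As a rough sketch, the two init scripts could look like the following. The table name, columns, and CSV path here are hypothetical (the path assumes the scripts are mounted into the standard postgres init directory); adapt them to your own model:

```sql
-- init_01_creation.sql (hypothetical example; the table and columns
-- depend entirely on your own data model)
CREATE TABLE IF NOT EXISTS recordings (
    id    SERIAL PRIMARY KEY,
    title TEXT NOT NULL
);

-- init_02_dumping_data.sql could then load the CSV placed alongside it
-- (the /docker-entrypoint-initdb.d path is an assumption about where
-- the sql_scripts folder is mounted in the postgres container):
COPY recordings (id, title)
FROM '/docker-entrypoint-initdb.d/initial_sample_data.csv'
WITH (FORMAT csv, HEADER true);
```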
2.4) Write your SQLAlchemy models in the data/models folder
2.4.1) You can use the sqlacodegen tool to autogenerate them
example:
sqlacodegen postgresql://<user_name>:<password>@db:5432/<db_name>
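For reference, a generated model typically looks like the following. The class and columns here are made up purely for illustration; sqlacodegen emits whatever matches your actual schema:

```python
# Hypothetical shape of sqlacodegen output for a "recordings" table;
# your real classes and columns will mirror your own schema.
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Recording(Base):
    __tablename__ = "recordings"

    id = Column(Integer, primary_key=True)
    title = Column(String, nullable=False)
```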
3) Leave the *.csv files in the root folder (they will be copied into the image) #TODO use volumes in docker-compose
docker build . -t matcher:staging
Inside the docker-compose context, the services have direct access to the postgres instance. If you need to access it from the host, you will have to set the environment variables for the postgres connection:
export DB_USERNAME=user_example
export DB_PASSWORD=password_example
export DB_HOSTNAME=localhost
export DB_NAME=matchers
... or
export DB_HOST=postgresql://user_example:password_example@localhost:5432/matchers
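A minimal sketch of how the settings above could be resolved in Python. The precedence of DB_HOST over the individual variables, and the function name, are assumptions for illustration, not necessarily how matcher/main.py does it:

```python
import os

def database_url():
    # Assumption: a full DB_HOST URL, if set, takes precedence over the
    # individual DB_* pieces.
    url = os.environ.get("DB_HOST")
    if url:
        return url
    return "postgresql://{user}:{password}@{host}:5432/{name}".format(
        user=os.environ["DB_USERNAME"],
        password=os.environ["DB_PASSWORD"],
        host=os.environ["DB_HOSTNAME"],
        name=os.environ["DB_NAME"],
    )
```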
$ docker-compose up
(optional: only needed if you open ports in docker-compose)
$ docker exec -it matcherpy_matcher_1 bash
$ python matcher/main.py process_file <input_file>
$ python matcher/main.py process_file sound_recordings_input_report.csv
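The real matching logic lives in matcher/main.py; purely as an illustration of the general shape (all names and columns here are hypothetical), process_file boils down to something like:

```python
import csv
import io

# Hypothetical sketch: read an input report and keep the rows whose id
# already exists among known results. The real column names and matching
# rules come from the project's models, not from this example.
def match_rows(rows, known_ids):
    return [row for row in rows if row["id"] in known_ids]

# Usage with an in-memory CSV standing in for <input_file>:
report = io.StringIO("id,title\n1,Song A\n2,Song B\n")
matches = match_rows(csv.DictReader(report), known_ids={"2"})
```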
If you change your sql_scripts and no changes seem to take effect:
- Run
docker-compose down
- Try using the
--force-recreate
flag when running docker-compose up again.