foodsharing


About

Tools and packages used in this project:

  • Flask: a micro web framework written in Python
  • React: a JavaScript library for building user interfaces
  • Docker: a set of platform as a service products that uses OS-level virtualization to deliver software in packages called containers
  • Postgres: a free and open-source relational database management system
  • SQLAlchemy: an open-source SQL toolkit and object-relational mapper for Python
  • Flask-RESTX: a Flask extension for building REST APIs
  • PyTest: a Python testing framework
  • Jest: a JavaScript testing framework
  • Python Linting and Formatting: flake8, black, isort
  • JS Linting and Formatting: ESLint and Prettier
  • Authentication: JSON Web Tokens (JWT) via pyjwt, with password hashing via flask-bcrypt

Team

Setup

You need to install the following:

  • Python 3
  • Node.js
  • Docker
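A quick way to confirm the prerequisites are installed (a minimal sketch; it only checks that each tool is on your PATH):

```shell
# Check that each required tool is available on PATH.
for tool in python3 node docker; do
  if command -v "$tool" > /dev/null; then
    echo "$tool found"
  else
    echo "$tool missing"
  fi
done
```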

The application also relies on the following platform accounts and API keys:

  • Gmail account (to send email notifications to users)
  • Cloudinary (to manage photo uploads and downloads)
  • Google Cloud account (to use Google Maps)
  • Geoapify (to run geo searches via their API)

Load Mock Data

To load mock data from CSV files you need to be in the foodsharing directory. We prepared some CSV files in the app_dev directory (app_dev is also the database name).
After starting the Docker containers, look up the first three letters of the db container hash; you can use the command sudo docker ps -a | grep foodsharing_db.
Then run ./import_data.sh <first 3 letters of db container hash> <directory with csv files (same as db name)>, e.g. ./import_data.sh 797 app_dev
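The steps above can be sketched as a small shell snippet that extracts the short container-ID prefix automatically instead of copying it by hand (the container name and script come from the text; the hash itself varies per machine):

```shell
# Grab the container ID of the foodsharing_db container and keep its
# first three characters, then feed it to the import script.
CONTAINER_ID=$(sudo docker ps -a | grep foodsharing_db | awk '{print $1}')
SHORT_ID=${CONTAINER_ID:0:3}          # first 3 letters of the hash
./import_data.sh "$SHORT_ID" app_dev  # app_dev holds the CSV files
```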

Run

  1. Clone the repo: git clone https://github.com/jasiekg25/foodsharing.git
  2. Use the .env-template file to create your own .env file. To run the application locally you need to set these environment variables:
    • MAIL_USERNAME
    • MAIL_PASSWORD
    • CLOUDINARY_CLOUD_NAME
    • CLOUDINARY_API_KEY
    • CLOUDINARY_API_SECRET
    • REACT_APP_GOOGLE_API_KEY
    • REACT_APP_GEOAPIFY_API_KEY
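For reference, a filled-in .env might look like this (every value below is an illustrative placeholder, not a real credential):

```shell
# Example .env – replace each placeholder with your own credentials
MAIL_USERNAME=your.address@gmail.com
MAIL_PASSWORD=your-gmail-app-password
CLOUDINARY_CLOUD_NAME=your-cloud-name
CLOUDINARY_API_KEY=your-api-key
CLOUDINARY_API_SECRET=your-api-secret
REACT_APP_GOOGLE_API_KEY=your-google-maps-key
REACT_APP_GEOAPIFY_API_KEY=your-geoapify-key
```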
  3. Switch to the foodsharing folder and run docker-compose up -d (on Linux, prefix the command with sudo)
  4. Visit http://localhost:3007 to check the app (you can register a new user or use the sample test account: username test, email test@test.com, password test)
  5. Visit http://127.0.0.1:5001/docs/ to check API docs.

Other useful commands (on Linux, prefix each command with sudo):

$ docker-compose stop # stop containers
$ docker-compose down # stop and remove containers
$ docker-compose down -v # stop and remove containers and volumes

If something does not work, try:

$ docker-compose down -v
$ docker-compose up -d

Other docker commands:

$ docker image ls # check images
$ docker system prune -a --volumes # delete everything

Access the database via psql:

$ docker-compose exec db psql -U postgres
# \c app_dev
# select * from offers;
# select * from orders;
# \q
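The same queries can also be run non-interactively with psql's -d and -c flags (a sketch; the service, user, database, and table names come from the session above):

```shell
# Run a single query against the app_dev database without opening a psql shell.
docker-compose exec db psql -U postgres -d app_dev -c "select * from offers;"
```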

SSH to containers:

$ docker-compose exec backend /bin/sh
$ docker-compose exec backend-db /bin/sh

Tests

To run the tests, import the JSON file from backends/tests into Postman. Then find your collection, click the three-dot menu, and choose 'Run collection'. There you can choose which endpoints you want to test and run them. All the endpoints require a logged-in user, so you need to provide a valid user token in the collection variables.
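If you prefer the command line over the Postman UI, the same collection can be run with Postman's newman runner (a sketch: newman must be installed via npm, the collection file name is a placeholder, and the variable name must match whatever the collection actually reads for the token):

```shell
# Install the runner once, then execute the collection headlessly.
npm install -g newman
newman run backends/tests/<collection>.json --env-var "token=<your JWT>"
```

Environment variables take precedence over collection variables in Postman's variable resolution, so --env-var can supply the token the collection expects.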