Dockerized API, gateway and React-with-Redux architecture with hot reloading, including external state management for microservice data through packaged reducers, sagas, containers and components.
Demonstrate how microservices combined with their frontend services (reducers, sagas, actions, initial state) can help create highly reusable modules.
On a higher level, it proposes introducing boundaries into the redux state: not everything is necessarily UI state. Let API-data-related reducers manage their own state!
The idea is that you only need to connect the microservice's initial state, reducer and saga to the redux store, and can then use the containers provided by the "connector". This could be done by packaging example-app/src/components/TodoList and related components into a ready-to-use container (and components) in example-api/connector. These containers would already be "connected" to the API-related slice of the state.
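As a minimal sketch of this wiring: the connector package would export its own initial state and reducer, which the app mounts under a dedicated key next to its UI state. The `todosConnector` object and the `todos/ADD` action type below are hypothetical names, and `combineReducers` is hand-rolled here (a stand-in for redux's own) so the sketch runs without dependencies.

```javascript
// Stand-in for redux's combineReducers, so this sketch is self-contained.
function combineReducers(reducers) {
  return (state = {}, action) => {
    const next = {};
    for (const key of Object.keys(reducers)) {
      next[key] = reducers[key](state[key], action);
    }
    return next;
  };
}

// What the API's connector package might export: the slice it owns.
const todosConnector = {
  initialState: { items: [] },
  reducer(state = { items: [] }, action) {
    switch (action.type) {
      case 'todos/ADD':
        return { ...state, items: [...state.items, action.payload] };
      default:
        return state;
    }
  },
};

// UI state stays in its own slice; API data lives behind the boundary.
const rootReducer = combineReducers({
  ui: (state = { theme: 'light' }) => state,
  todos: todosConnector.reducer,
});

let state = rootReducer(undefined, { type: '@@INIT' });
state = rootReducer(state, { type: 'todos/ADD', payload: 'write docs' });
console.log(state.todos.items); // [ 'write docs' ]
```

The connector's containers would then be `connect`ed against the `todos` slice only, so the host app never needs to know the shape of the API data.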
Of course, this will only truly scale with multiple microservices and frontends.
- node
- docker
The example-app/docker-compose.yml
configuration references a development image that can be used as a starting point.
Under this configuration, when running the container, its example-app directory will be swapped out with the local example-app directory: this enables live-reloading.
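The swap relies on a bind mount. A compose service along these lines would produce that behavior (the image name, container path and ports here are illustrative, not the repository's actual values):

```yaml
# Illustrative fragment; names and paths are assumptions.
services:
  example-app:
    image: username/example-app:dev
    volumes:
      # Bind-mount the local source over the image's copy, so edits
      # on the host are picked up by the container's hot reloader.
      - ./:/usr/src/app/example-app
    ports:
      - "80:3000"
```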
cd example-app
docker-compose pull
docker-compose up
open http://localhost
When updating example-api/connector while this configuration is running, run the following commands:
# in directory example-api/connector
yarn run dev
# (only because default create-react-app doesn't watch stuff inside node_modules without ejecting)
# in directory example-app
docker-compose up -d --force-recreate example-app
Stopping everything (including cleaning up networks and images) can be done by running:
# in directory example-app:
docker-compose down
docker-compose pull
docker-compose up
open http://localhost
cd example-app
make release VERSION=0.3.2 BUILD_ENV=production
I didn't figure out how to scale example-api: each instance has its own data.json. Sharing it (with volumes?) hasn't worked yet (I am doing something wrong). This is why, right now, example-api has only one instance (replica)... and is therefore a single point of failure. But I suppose a real API service would use some kind of shared or distributed database instead.
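For reference, a shared named volume is the kind of thing one might try (untested sketch; the service name, image and path are assumptions). Note that on a multi-node swarm a plain named volume is local to each node, so replicas scheduled on different nodes would still see different copies of data.json, which may be why this approach alone doesn't suffice:

```yaml
# Untested sketch: one named volume mounted by every example-api replica.
services:
  example-api:
    image: username/example-api:latest
    volumes:
      - api-data:/usr/src/app/data   # data.json would live here
volumes:
  api-data:
```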
- Connect to your docker swarm
docker run --rm -ti -v /var/run/docker.sock:/var/run/docker.sock -e DOCKER_HOST dockercloud/client username/swarm
# ...
# Use your Docker ID credentials to authenticate:
# Username: username
# Password:
- Define docker host
export DOCKER_HOST=tcp://127.0.0.1:32768
- deploy
# in services-example
docker stack deploy -c docker-compose.yml services-example
- inspect
# in services-example
docker stack ps services-example
- shutdown
# in services-example
docker stack rm services-example