# EXPERIMENTAL vNext Auditing Bounded Context Mono Repository

The Auditing BC is responsible for maintaining an immutable record of all transactions that take place on the Switch.

See the Auditing section of the Reference Architecture documentation for context on these vNext implementation guidelines.

## Packages

The Auditing BC consists of the following packages:
- `auditing-svc`: Auditing Service (see its README)
- `client-lib`: Auditing BC Client Library (see its README)
- `public-types-lib`: Auditing BC Public Types Library (see its README)
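To illustrate the role of the client library, here is a minimal sketch of an append-only audit trail. All names below (`AuditEntry`, `InMemoryAuditTrail`, the field names) are hypothetical illustrations, not the actual `client-lib` API; the real types live in `public-types-lib`.

```typescript
// Hypothetical sketch of an append-only audit store, illustrating the
// Auditing BC's core guarantee: records can be added and read, never mutated.
// None of these names come from the actual client-lib API.

interface AuditEntry {
    readonly id: number;
    readonly actionTimestamp: number;
    readonly sourceBcName: string;   // bounded context that emitted the entry
    readonly actionType: string;     // e.g. "TRANSFER_PREPARED" (example value)
    readonly actionSuccessful: boolean;
}

class InMemoryAuditTrail {
    private readonly entries: AuditEntry[] = [];
    private nextId = 1;

    append(e: Omit<AuditEntry, "id">): AuditEntry {
        const entry: AuditEntry = { id: this.nextId++, ...e };
        this.entries.push(entry);
        return entry;
    }

    // Read-only view: callers get a frozen copy, not the backing array.
    list(): readonly AuditEntry[] {
        return Object.freeze([...this.entries]);
    }
}

const trail = new InMemoryAuditTrail();
trail.append({
    actionTimestamp: Date.now(),
    sourceBcName: "transfers-bc",
    actionType: "TRANSFER_PREPARED",
    actionSuccessful: true,
});
console.log(trail.list().length); // 1
```

In the real system the entries are published to Kafka and persisted in Elasticsearch rather than held in memory; this sketch only shows the append-only contract.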
## Install

More information on how to install NVM: https://github.com/nvm-sh/nvm

```shell
nvm install
nvm use
npm install
npm run build
```

To start the Auditing Service:

```shell
npm run start:auditing-svc
```
## Startup supporting services

Use https://github.com/mojaloop/platform-shared-tools/tree/main/packages/deployment/docker-compose-infra

To start up Kafka, MongoDB, Elasticsearch and Kibana, follow the steps below (executed in `docker-compose-infra/`):
- Create a sub-directory called `exec` inside the `docker-compose-infra` (this) directory, and navigate to that directory:

```shell
mkdir exec
cd exec
```
- Create the following directories as sub-directories of the `docker-compose-infra/exec` directory: `certs`, `esdata01`, `kibanadata` and `logs`:

```shell
mkdir {certs,esdata01,kibanadata,logs}
```
- Copy the `.env.sample` to the exec dir:

```shell
cp ../.env.sample ./.env
```
- Review the contents of the `.env` file.
- Ensure `vm.max_map_count` is set to at least `262144`. Example of applying the property on a live system:

```shell
sysctl -w vm.max_map_count=262144 # might require sudo
```
- Start the docker containers using `docker-compose up` (in the `exec` dir):

```shell
docker-compose -f ../docker-compose-infra.yml --env-file ./.env up -d
```

To view the logs of the infrastructure containers, run:

```shell
docker-compose -f ../docker-compose-infra.yml --env-file ./.env logs -f
```

To stop the infrastructure containers, run:

```shell
docker-compose -f ../docker-compose-infra.yml --env-file ./.env stop
```
Once started, the services will be available via localhost. Use the credentials set in the `.env` file.
- ElasticSearch API - https://localhost:9200/
- Kibana - http://localhost:5601
- Kafka Broker - localhost:9092
- Zookeeper - localhost:2181
- RedPanda Kafka Console - http://localhost:8080
- MongoDB - mongodb://localhost:27017
- Mongo Express Console - http://localhost:8081
### Create the Elasticsearch index mappings

Once Elasticsearch has started, upload the data mappings for the logs and audits indexes using the commands below. This must be executed once after setting up a new Elasticsearch instance, or whenever the indexes are updated.

Execute this in the directory containing the `es_mappings_logging.json` and `es_mappings_auditing.json` files. When asked, enter the password for the `elastic` user set in the `.env` file.

```shell
# Create the logging index
curl -i --insecure -X PUT "https://localhost:9200/ml-logging/" -u "elastic" -H "Content-Type: application/json" --data-binary "@es_mappings_logging.json"

# Create the auditing index
curl -i --insecure -X PUT "https://localhost:9200/ml-auditing/" -u "elastic" -H "Content-Type: application/json" --data-binary "@es_mappings_auditing.json"
```
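For reference, the files uploaded by the curl commands above are Elasticsearch explicit-mapping request bodies. The sketch below shows the general shape of such a body; the field names are illustrative only, not the actual contents of `es_mappings_auditing.json` (which lives in the auditing-bc repository).

```typescript
// Illustrative shape of an Elasticsearch explicit-mapping request body,
// as sent by the curl PUT above. Field names are examples only; the real
// definitions live in es_mappings_auditing.json.
const exampleAuditIndexMapping = {
    mappings: {
        properties: {
            actionTimestamp:  { type: "date" },
            sourceBcName:     { type: "keyword" },
            actionType:       { type: "keyword" },
            actionSuccessful: { type: "boolean" },
        },
    },
};

// The curl command posts exactly this kind of JSON document as its body:
console.log(JSON.stringify(exampleAuditIndexMapping));
```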
NOTE: The master/source for the mappings files is in the respective repositories: logging-bc and auditing-bc.

See the Elasticsearch documentation on explicit mappings and mapping types:
- https://www.elastic.co/guide/en/elasticsearch/reference/8.1/explicit-mapping.html
- https://www.elastic.co/guide/en/elasticsearch/reference/8.1/mapping-types.html
### Import the Kibana objects

Once the mappings are installed, import the prebuilt Kibana objects for the DataView and the search:

- Open Kibana (login with the credentials in the `.env` file).
- Navigate to (top left burger icon) -> Management / Stack Management -> Kibana / Saved Objects, or go directly to: http://localhost:5601/app/management/kibana/objects
- Use the Import button on the top right to import the file `kibana-objects.ndjson` located in the `docker-compose-infra` directory (this one).
- Go to (top left burger icon) -> Analytics / Discover, and then use the Open option on the top right to open the imported "MojaloopDefaultLogView" view.
### Monitor Kafka events

Download the Kafka clients from https://kafka.apache.org/downloads.html, then run:

```shell
./kafka-console-consumer.sh --topic nodejs-rdkafka-svc-integration-test-log-bc-topic --from-beginning --bootstrap-server localhost:9092
```
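The messages seen on that topic are JSON payloads. As a hypothetical illustration of handling them programmatically (the real services use a Kafka client library; this sketch only shows the parse step, and the `RawKafkaMessage` shape is an assumption, not a real client type):

```typescript
// Hypothetical parse step for messages read off the Kafka topic above.
// The envelope shape is illustrative; real payload types are defined in
// public-types-lib.
interface RawKafkaMessage { value: string } // what a consumer callback might receive

function parseAuditMessage(msg: RawKafkaMessage): { ok: boolean; payload?: unknown } {
    try {
        return { ok: true, payload: JSON.parse(msg.value) };
    } catch {
        // malformed messages are reported as failures, never crash the consumer
        return { ok: false };
    }
}

console.log(parseAuditMessage({ value: '{"actionType":"TEST"}' }).ok); // true
```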
### Shutdown

To stop the infrastructure containers and remove their volumes:

```shell
docker-compose down -v
```
## Run

```shell
npm run start:auditing-svc
```

See the README.md file of each service for more Environment Variable Configuration options.
## Tests

### Unit tests

```shell
npm run test:unit
```

### Integration tests

Requires the integration test pre-requisites (the supporting services above):

```shell
npm run test:integration
```

### Run all tests at once

```shell
npm run test
```

### Coverage

After running the unit and/or integration tests:

```shell
npm run posttest
```

You can then consult the HTML report in:

```
coverage/lcov-report/index.html
```
## Auditing dependencies

We use `npm audit` to check dependencies for known node vulnerabilities.

To start a new resolution process, run:

```shell
npm run audit:fix
```

You can check whether the CI will pass based on the current dependencies with:

```shell
npm run audit:check
```
## Pre-commit checks

Execute the pre-commit checks locally; these are also executed with every commit and in the default CI/CD pipeline. Make sure they pass before committing any code:

```shell
npm run pre_commit_check
```
## CI/CD

As part of our CI/CD process, we use CircleCI. The CircleCI workflow automates publishing changed packages to the npm registry and building Docker images for select packages before publishing them to DockerHub. It also handles versioning, tagging commits, and pushing changes back to the repository.

The process includes five phases:

1. **Setup**: initializes the environment, loads common functions, and retrieves the commits and git change history since the last successful CI build.
2. **Detecting Changed Packages**.
3. **Publishing Changed Packages to NPM**.
4. **Building Docker Images and Publishing to DockerHub**.
5. **Pushing Commits to Git**.

All code is automatically linted, built, and unit tested by the CircleCI pipelines, and unit test results are kept for all runs. All libraries are automatically published to npm.js, and all Docker images are published to Docker Hub.
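The "Detecting Changed Packages" phase amounts to mapping the file paths changed since the last successful build onto the mono-repo's package directories. A simplified sketch of that idea (this is an illustration, not the actual CircleCI script; the `packages/` layout is assumed from the package list above):

```typescript
// Simplified sketch of "detect changed packages": given file paths changed
// since the last successful CI build (e.g. from `git diff --name-only`),
// derive the set of packages that need republishing.
function detectChangedPackages(changedFiles: string[]): string[] {
    const packages = new Set<string>();
    for (const file of changedFiles) {
        // A package is "changed" if any file under packages/<name>/ changed.
        const match = /^packages\/([^/]+)\//.exec(file);
        if (match) packages.add(match[1]);
    }
    return [...packages].sort();
}

console.log(detectChangedPackages([
    "packages/auditing-svc/src/index.ts",
    "packages/client-lib/package.json",
    "README.md", // root-level files map to no package
]));
// prints: [ 'auditing-svc', 'client-lib' ]
```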
## Documentation

The following documentation provides insight into the Auditing Bounded Context:

- Reference Architecture - https://mojaloop.github.io/reference-architecture-doc/boundedContexts/auditing/
- MIRO Board - https://miro.com/app/board/o9J_lJyA1TA=/
- Work Sessions - https://docs.google.com/document/d/1Nm6B_tSR1mOM0LEzxZ9uQnGwXkruBeYB2slgYK1Kflo/edit#heading=h.6w64vxvw6er4
## Troubleshoot

If you see the OpenSSL error:

```
error:25066067:DSO support routines:dlfcn_load:could not load the shared library
```

the fix (per https://github.com/mojaloop/security-bc.git) is:

```shell
export OPENSSL_CONF=/dev/null
```