Import and convert data to GOB format and publish the results on the GOB Message Broker.
Importing data consists of the following steps:
1. Set up a connection to be able to access the external data
   * External data can be stored in files, databases, APIs, ...
2. Use the connection to read the required external data
   * External data can be formatted as CSV, XML, StUF, a database format, ...
3. Convert the external data to GOB format
4. Publish the results
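The four steps above can be sketched as a minimal pipeline. All names below are illustrative only; the actual implementation lives in the `gobimport` package and differs in detail:

```python
# Minimal sketch of the import pipeline described above.
# Function and field names are hypothetical, not the gobimport API.

def connect(source_spec):
    # Step 1: set up a connection (file handle, DB cursor, API client, ...)
    return {"type": source_spec["type"], "location": source_spec["location"]}

def read(connection):
    # Step 2: read raw rows from the external source (CSV, XML, StUF, ...)
    return [{"id": "1", "naam": "Amsterdam"}]

def convert(rows, mapping):
    # Step 3: rename external fields to GOB attribute names
    return [{mapping[key]: value for key, value in row.items()} for row in rows]

def publish(entities):
    # Step 4: hand the result to the message broker (stubbed here)
    return {"contents": entities, "count": len(entities)}

connection = connect({"type": "csv", "location": "example.csv"})
rows = read(connection)
entities = convert(rows, {"id": "identificatie", "naam": "name"})
message = publish(entities)
```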
A running GOB infrastructure is required to run this component.
In order to encrypt secure data you need to define two environment variables:

- SECURE_SALT
- SECURE_PASSWORD

Both are shared with the GOB API (symmetrical encryption): GOB Import is responsible for the encryption and GOB API uses the same secrets for decryption.
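The point of sharing SECURE_SALT and SECURE_PASSWORD is that both sides can derive the same symmetric key. The sketch below illustrates that idea with a standard key-derivation function and a toy XOR cipher; it is not the actual GOB encryption scheme:

```python
import hashlib

# Illustration only: a shared salt/password pair yields the same key on
# both sides (GOB Import encrypts, GOB API decrypts). NOT the real scheme.

def derive_key(password: str, salt: str) -> bytes:
    # PBKDF2 turns the two shared secrets into a symmetric key
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt.encode(), 100_000)

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: XOR with a repeated key (never use in production)
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key_import = derive_key("example-password", "example-salt")  # GOB Import side
key_api = derive_key("example-password", "example-salt")     # GOB API side

ciphertext = xor_stream(b"bsn:123456789", key_import)  # encrypted by GOB Import
plaintext = xor_stream(ciphertext, key_api)            # decrypted by GOB API
```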
To run GOB Import with Docker you need:

- docker compose >= 1.25
- Docker CE >= 18.09
Connect to the Neuron database with credentials retrieved with the help of this guide.
When running the project in Docker, edit the .env file and set the NRBIN_* parameters to the values retrieved in the previous guide. Set the NRBIN host to host.docker.internal:
NRBIN_DATABASE_HOST=host.docker.internal
Forward the port using SSH:
ssh -L 0.0.0.0:1521:$DB:1521 $HOST
Finally, build and start the Docker containers:
docker compose build
docker compose up -d
Run the test suite in Docker:

docker compose -f src/.jenkins/test/docker-compose.yml build
docker compose -f src/.jenkins/test/docker-compose.yml run --rm test
To run GOB Import locally you need:

- Python >= 3.9
Create a virtual environment:
python3 -m venv venv
source venv/bin/activate
pip install -r src/requirements.txt
Or activate the previously created virtual environment:
source venv/bin/activate
Optional: set the environment variables if GOB Import should connect to secure data sources:
export $(cat .env | xargs) # Copy from .env.example if missing
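The `export $(cat .env | xargs)` one-liner expands each KEY=value pair from .env into the shell environment. A rough Python equivalent of that behaviour is sketched below (a simplified parser; quoting and multi-line values are out of scope, as they are for the shell one-liner):

```python
import os

def load_env(text: str) -> None:
    # Simplified .env loader: one KEY=value per line, '#' starts a comment
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        os.environ[key.strip()] = value.strip()

# Example content; real values come from your local .env file
load_env("SECURE_SALT=example-salt\n# comment\nSECURE_PASSWORD=example-password\n")
```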
Start the service:
cd src
python -m gobimport
Run the tests:
cd src
sh test.sh
Imports are triggered by the GOB-Workflow module. See the GOB-Workflow README for more details.
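A trigger from GOB-Workflow is, in essence, a message on the broker naming what to import. The shape below is purely hypothetical (the real message contract is defined by GOB-Workflow); it only shows that such a trigger serialises and round-trips as JSON:

```python
import json

# Hypothetical import-trigger message; all field names are illustrative.
trigger = {
    "header": {
        "catalogue": "example_catalogue",
        "collection": "example_collection",
        "application": "ExampleSource",
    },
}

payload = json.dumps(trigger)    # what would travel over the message broker
received = json.loads(payload)   # what the import handler would receive
```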