This GitHub Action can be used to insert rows from a JSON file into a Google BigQuery table.
It does not do any schema validation of the rows - BigQuery will return a list of errors if the inserts fail.
name: "Insert rows to BigQuery"
on:
  pull_request: {}
  push:
    branches: ["main"]
jobs:
  deploy_schemas:
    runs-on: ubuntu-latest
    name: Insert rows to BigQuery
    steps:
      # To use this repository's private action,
      # you must check out the repository
      - name: Checkout
        uses: actions/checkout@v2.3.4
      - name: Deploy schemas to BigQuery
        uses: Atom-Learning/bigquery-upload-action
        with:
          gcp_project: 'gcp-us-project'
          dataset_id: 'dataset-id'
          table_id: 'table-id'
          bq_rows_as_json_path: 'bq_rows.json'
          credentials: ${{ secrets.GCP_SERVICE_ACCOUNT }}
gcp_project
The full name of the GCP project you want to deploy to.
Example: gcp-us-project

dataset_id
The dataset containing the table you want to insert the rows into.
Example: best_dataset

table_id
The table you want to insert the rows into.
Example: awesome_table

bq_rows_as_json_path
The path to the JSON file containing the rows you want to insert.
Example: rows.json
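The exact layout expected in the rows file is not documented here; assuming the action streams the file's contents to BigQuery row by row, a plausible shape is a JSON array of row objects whose keys match the table's column names (the column names below are purely illustrative):

```json
[
  {"user_id": 1, "event": "signup", "created_at": "2021-06-01T12:00:00Z"},
  {"user_id": 2, "event": "login", "created_at": "2021-06-01T12:05:00Z"}
]
```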
credentials
Google Service Account with permission to create objects in the specified project. Can be stored as a repository secret.
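For reference, an equivalent insert can be reproduced locally with the official google-cloud-bigquery client. This is a sketch of the general technique, not the action's actual implementation; `load_rows` and the assumed array-of-objects file layout are assumptions for illustration.

```python
import json
from typing import Any, Dict, List


def load_rows(path: str) -> List[Dict[str, Any]]:
    """Load rows from a JSON file; expects a JSON array of row objects."""
    with open(path) as f:
        rows = json.load(f)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of row objects")
    return rows


def insert_rows(project: str, dataset_id: str, table_id: str,
                rows: List[Dict[str, Any]]) -> None:
    # Requires GCP credentials in the environment and
    # `pip install google-cloud-bigquery`.
    from google.cloud import bigquery

    client = bigquery.Client(project=project)
    table_ref = f"{project}.{dataset_id}.{table_id}"
    # insert_rows_json streams the rows and returns a list of
    # per-row errors; an empty list means every row was accepted.
    errors = client.insert_rows_json(table_ref, rows)
    if errors:
        raise RuntimeError(f"BigQuery insert errors: {errors}")
```

As with the action itself, no schema validation happens client-side: BigQuery reports per-row errors in the returned list.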
See the Contributing Guide for additional information.
To execute tests locally (requires that docker and docker-compose are installed):

$ docker-compose run test
To validate the changes:
- Start Docker locally.
- Inside docker/ run
$ docker-compose up
- In the main directory, run:
$ docker-compose run test
If all of that works, push a new version tag by bumping the number after the `.` (e.g. if the current version is 1.1, push 1.2), then run the job pointing to the new version to verify it works.
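Assuming releases are plain git tags (an assumption about this repository's release process), the bump described above might look like:

```shell
# Hypothetical release bump: increment the number after the "."
CURRENT=1.1
MAJOR=${CURRENT%.*}
MINOR=${CURRENT#*.}
NEXT="$MAJOR.$((MINOR + 1))"
echo "next version: $NEXT"   # prints "next version: 1.2"
# Tag and publish so workflows can reference the new version:
# git tag "$NEXT" && git push origin "$NEXT"
```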
This GitHub Action was written by Wojciech Chmiel, based on a fork of https://github.com/jashparekh/bigquery-action