- What is BigFlow?
- Getting started
- Installing BigFlow
- Help me
- BigFlow tutorial
- CLI
- Configuration
- Project structure and build
- Deployment
- Workflow & Job
- Starter
- Technologies
- Logging
- Development
BigFlow is a Python framework for data processing pipelines on GCP.
The main features are:
- Dockerized deployment environment
- Powerful CLI
- Automated build, deployment, versioning and configuration
- Unified project structure
- Support for the major data processing technologies — Dataproc (Apache Spark), Dataflow (Apache Beam) and BigQuery
- Project starter
Start by installing BigFlow on your local machine. Next, go through the BigFlow tutorial.
Prerequisites. Before you start, make sure you have the following software installed:
- Python == 3.7
- Google Cloud SDK
- Docker Engine
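You can check all three from a terminal (the version flags below are the standard ones for each tool):

```shell
# Verify the prerequisites are installed and on your PATH
python --version   # expect Python 3.7.x
gcloud --version
docker --version
```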
You can install the `bigflow` package globally, but we recommend installing it locally with venv, in your project's folder:
python -m venv .bigflow_env
source .bigflow_env/bin/activate
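To confirm the environment is active, check which interpreter your shell now resolves:

```shell
# Should point inside the virtual environment, not at the system Python
which python   # expect a path ending in .bigflow_env/bin/python
```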
Install the `bigflow` PIP package:
pip install bigflow[bigquery,dataflow,dataproc,log]
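The bracketed extras pull in the dependencies for each supported technology. If you only use some of them, you can install a subset, for example:

```shell
# Quotes keep the brackets from being interpreted by some shells (e.g. zsh)
pip install 'bigflow[bigquery]'
```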
Test it:
bigflow -h
Read more about BigFlow CLI.
To interact with GCP you need to set a default project and log in:
gcloud config set project <your-gcp-project-id>
gcloud auth application-default login
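You can verify that both settings took effect:

```shell
# Confirm the active account and the default project
gcloud auth list
gcloud config get-value project
```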
Finally, check that Docker is running:
docker info
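With the prerequisites in place, a quick way to try the project starter and the automated build (a sketch: `start-project` prompts for a project name, and the build relies on the Docker daemon you just checked):

```shell
bigflow start-project    # scaffolds a sample project interactively
cd <your-project-name>   # the name you entered in the prompt
bigflow build            # builds the pip package, Docker image and DAG files
```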
You can ask questions on our Gitter channel or on Stack Overflow.