Google Cloud Dataflow SDK for Java

Google Cloud Dataflow provides a simple, powerful programming model for building both batch and streaming parallel data processing pipelines. This repository hosts the open-sourced Cloud Dataflow SDK for Java, which can be used to run pipelines against the Google Cloud Dataflow Service.

General usage of Google Cloud Dataflow does not require use of this repository. Instead:

  1. depend directly on a specific version of the SDK in the Maven Central Repository by adding the following dependency to your project in a development environment like Eclipse or a build tool like Apache Maven:

     <dependency>
       <groupId>com.google.cloud.dataflow</groupId>
       <artifactId>google-cloud-dataflow-java-sdk-all</artifactId>
       <version>version_number</version>
     </dependency>
    
  2. download the example pipelines from the separate DataflowJavaSDK-examples repository.

However, if you'd like to contribute to the SDK, write your own PipelineRunner, or just dig in for the fun of it, please stay with us here!

Status

Both the SDK and the Dataflow Service are generally available, open to all developers, and considered stable and fully qualified for production use.

Overview

The key concepts in this programming model are:

  • PCollection: represents a collection of data, which could be bounded or unbounded in size.
  • PTransform: represents a computation that transforms input PCollections into output PCollections.
  • Pipeline: manages a directed acyclic graph of PTransforms and PCollections that is ready for execution.
  • PipelineRunner: specifies where and how the pipeline should execute.
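
To make these concepts concrete, here is a minimal, illustrative sketch of a pipeline that counts the occurrences of each distinct line in a text file and writes the results back out. The file paths are placeholders, and the snippet assumes the SDK dependency shown above is available on the classpath:

    import com.google.cloud.dataflow.sdk.Pipeline;
    import com.google.cloud.dataflow.sdk.io.TextIO;
    import com.google.cloud.dataflow.sdk.options.PipelineOptions;
    import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
    import com.google.cloud.dataflow.sdk.transforms.Count;
    import com.google.cloud.dataflow.sdk.transforms.DoFn;
    import com.google.cloud.dataflow.sdk.transforms.ParDo;
    import com.google.cloud.dataflow.sdk.values.KV;

    public class MinimalLineCount {
      public static void main(String[] args) {
        // PipelineOptions carry the PipelineRunner choice and its configuration,
        // e.g. --runner=DirectPipelineRunner for local execution.
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();

        // A Pipeline manages the directed acyclic graph of PTransforms and PCollections.
        Pipeline p = Pipeline.create(options);

        p.apply(TextIO.Read.from("gs://my-bucket/input.txt"))    // PCollection<String>
         .apply(Count.<String>perElement())                      // PCollection<KV<String, Long>>
         .apply(ParDo.of(new DoFn<KV<String, Long>, String>() {  // format each count as text
           @Override
           public void processElement(ProcessContext c) {
             c.output(c.element().getKey() + ": " + c.element().getValue());
           }
         }))
         .apply(TextIO.Write.to("gs://my-bucket/output"));

        // The configured PipelineRunner executes the graph.
        p.run();
      }
    }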

We provide three PipelineRunners:

  1. The DirectPipelineRunner runs the pipeline on your local machine.
  2. The DataflowPipelineRunner submits the pipeline to the Dataflow Service, where it runs using managed resources in the Google Cloud Platform (GCP).
  3. The BlockingDataflowPipelineRunner submits the pipeline to the Dataflow Service via the DataflowPipelineRunner and then prints messages about the job status until the execution is complete.
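
The runner and its settings are supplied through PipelineOptions, either on the command line (for example, --runner=BlockingDataflowPipelineRunner together with --project and --stagingLocation) or programmatically. The sketch below shows the programmatic route; the project id and staging bucket are placeholders:

    import com.google.cloud.dataflow.sdk.Pipeline;
    import com.google.cloud.dataflow.sdk.options.DataflowPipelineOptions;
    import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
    import com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner;

    public class RunnerSelectionSketch {
      public static Pipeline createPipeline() {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.create().as(DataflowPipelineOptions.class);
        options.setRunner(BlockingDataflowPipelineRunner.class);  // wait for the job to finish
        options.setProject("my-gcp-project");                     // placeholder GCP project id
        options.setStagingLocation("gs://my-bucket/staging");     // placeholder GCS staging path
        return Pipeline.create(options);
      }
    }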

The SDK is built to be extensible and support additional execution environments beyond local execution and the Google Cloud Dataflow Service. In partnership with Cloudera, you can run Dataflow pipelines on an Apache Spark backend using the SparkPipelineRunner. Additionally, you can run Dataflow pipelines on an Apache Flink backend using the FlinkPipelineRunner.

Getting Started

This repository consists of the following parts:

  • The sdk module provides a set of basic Java APIs to program against.
  • The examples module provides a few samples to get started. We recommend starting with the WordCount example.
  • The contrib directory hosts community-contributed Dataflow modules.

The following command will build both the sdk and example modules and install them in your local Maven repository:

mvn clean install

You can speed up the build and install process by using the following options:

  1. To skip execution of the unit tests, run:

    mvn install -DskipTests

  2. While iterating on a specific module, use the following command to compile and reinstall it. For example, to reinstall the examples module, run:

    mvn install -pl examples

Be careful, however: this command will use the most recently installed SDK from the local repository (or Maven Central) even if you have made local changes to the SDK that you have not reinstalled.
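
If you have also modified the sdk module, Maven's standard --also-make flag (-am) rebuilds a module's local dependencies in the same run, for example:

    mvn install -pl examples -am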

If you are using the Eclipse integrated development environment (IDE), the Cloud Dataflow Plugin for Eclipse provides tools to create and execute Dataflow pipelines both locally and on the Dataflow Service.

After building and installing, you can execute the WordCount and other example pipelines by following the instructions in this README.
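
For instance, one common way to launch the WordCount example locally is through the Maven exec plugin; the arguments below are illustrative and the output path is a placeholder:

    mvn compile exec:java -pl examples \
        -Dexec.mainClass=com.google.cloud.dataflow.examples.WordCount \
        -Dexec.args="--runner=DirectPipelineRunner --output=/tmp/wordcount/"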

Contact Us

We welcome all usage-related questions on Stack Overflow tagged with google-cloud-dataflow.

Please use the issue tracker on GitHub to report bugs, or to submit comments and questions regarding SDK development.

More Information