Eclipse Zenoh-Flow
Zenoh-Flow provides a zenoh-based dataflow programming framework for computations that span from the cloud to the device.
Description
Zenoh-Flow allows users to declare a dataflow graph, via a YAML file, and to use tags to express location affinity and requirements for the operators that make up the graph. When deploying the dataflow graph, Zenoh-Flow automatically deals with distribution by linking remote operators through zenoh.
A dataflow is composed of a set of sources (producing data), operators (computing over the data), and sinks (consuming the resulting data). These components are dynamically loaded at runtime.
Remote sources, operators, and sinks leverage zenoh to communicate in a transparent manner. In other terms, the dataflow graph retains location transparency and can be deployed in different ways depending on specific needs.
Zenoh-Flow provides several working examples that illustrate how to define operators, sources, and sinks, as well as how to declaratively define the dataflow graph by means of a YAML file.
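As an illustration, a minimal graph file might look like the following sketch. This is a hypothetical example: the field names (flow, sources, operators, sinks, links, id, uri) are assumptions for illustration, not the authoritative schema; refer to the files under zenoh-flow-examples/graphs/ for the real format.

```yaml
# Hypothetical sketch of a dataflow graph; field names are illustrative,
# see zenoh-flow-examples/graphs/ for the actual schema.
flow: fizz_buzz_pipeline
sources:
  - id: manual-source
    uri: file://./target/release/examples/libmanual_source.so
operators:
  - id: fizz
    uri: file://./target/release/examples/libexample_fizz.so
  - id: buzz
    uri: file://./target/release/examples/libexample_buzz.so
sinks:
  - id: generic-sink
    uri: file://./target/release/examples/libgeneric_sink.so
links:
  - from: manual-source
    to: fizz
  - from: fizz
    to: buzz
  - from: buzz
    to: generic-sink
```

The component names match the FizzBuzz pipeline built below; the uri values assume Linux .so artifacts and will differ on other platforms.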
How to build it
Install Rust and Cargo. Zenoh-Flow can be successfully compiled with Rust stable (>= 1.5.1), so no special configuration is required, except for certain examples.
To build Zenoh-Flow, run the following command after completing the steps above:
$ cargo build --release
How to run
Assuming that the previous steps completed successfully, you'll find the Zenoh-Flow runtime under target/release/runtime. This executable expects the following arguments:
- the path of the dataflow graph to execute: --graph-file zenoh-flow-examples/graphs/fizz_buzz_pipeline.yaml
- a name for the runtime: --runtime foo
The graph describes the different components composing the dataflow. The name of the runtime, although mandatory, is only used to "deploy" the graph across different "runtime instances" (see the related examples).
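For multi-runtime deployments, the graph file associates each component with a runtime name, which is then matched against the --runtime argument. A hypothetical fragment follows; the exact key name is an assumption, see the *-multi-runtime.yaml example graphs for the real one:

```yaml
# Hypothetical mapping of components to named runtimes; the actual key
# may differ, see the multi-runtime example graphs.
operators:
  - id: fizz
    runtime: foo   # deployed on the instance started with --runtime foo
  - id: buzz
    runtime: bar   # deployed on the instance started with --runtime bar
```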
Examples
FizzBuzz
First, compile the relevant examples:
cargo build --example manual-source --example example-fizz --example example-buzz --example generic-sink
This will create, depending on your OS, the libraries that the pipeline will fetch.
Single runtime
To run all components on the same Zenoh-Flow runtime:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/fizz_buzz_pipeline.yaml --runtime foo
Note: in this particular case the --runtime foo argument is discarded.
Multiple runtimes
In a first machine, run:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/fizz-buzz-multiple-runtimes.yaml --runtime foo
In a second machine, run:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/fizz-buzz-multiple-runtimes.yaml --runtime bar
OpenCV FaceDetection - Haarcascades
First, compile the relevant examples:
cargo build --example camera-source --example face-detection --example video-sink
This will create, depending on your OS, the libraries that the pipeline will fetch.
Single runtime
To run all components on the same Zenoh-Flow runtime:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/face_detection.yaml --runtime foo
Note: in this particular case the --runtime foo argument is discarded.
Multiple runtimes
In a first machine, run:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/face-detection-multi-runtime.yaml --runtime gigot
In a second machine, run:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/face-detection-multi-runtime.yaml --runtime nuc
In a third machine, run:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/face-detection-multi-runtime.yaml --runtime leia
OpenCV Object Detection - Deep Neural Network - CUDA powered
First, compile the relevant examples:
cargo build --example camera-source --example object-detection-dnn --example video-sink
This will create, depending on your OS, the libraries that the pipeline will fetch.
Then please update the files zenoh-flow-examples/graphs/dnn-object-detection.yaml and zenoh-flow-examples/graphs/dnn-object-detection-multi-runtime.yaml by changing the neural-network, network-weights, and network-classes entries to match the absolute paths of your neural network configuration.
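For instance, the three entries might look as follows. The key names come from the text above, while the paths and model files are placeholders you must replace with your own:

```yaml
# Placeholder paths; replace with the absolute paths of your model files.
neural-network: /home/user/models/yolov4.cfg
network-weights: /home/user/models/yolov4.weights
network-classes: /home/user/models/coco.names
```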
Single runtime
To run all components on the same Zenoh-Flow runtime:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/dnn-object-detection.yaml --runtime foo
Note: in this particular case the --runtime foo argument is discarded.
Multiple runtimes
In a first machine, run:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/dnn-object-detection-multi-runtime.yaml --runtime foo
In a second machine, run:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/dnn-object-detection-multi-runtime.yaml --runtime cuda
In a third machine, run:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/dnn-object-detection-multi-runtime.yaml --runtime bar
OpenCV Car Vision - Deep Neural Network - CUDA powered
This example takes a video file as input. To create one from a sequence of images, install ffmpeg and run the following command: ffmpeg -framerate 15 -pattern_type glob -i 'I1*.png' -c:v libx264 I1.mp4
First, compile the relevant examples:
cargo build --example video-file-source --example object-detection-dnn --example video-sink
This will create, depending on your OS, the libraries that the pipeline will fetch.
Then please edit the file zenoh-flow-examples/graphs/car-pipeline-multi-runtime.yaml by changing the neural-network, network-weights, and network-classes entries to match the absolute paths of your neural network configuration. You also need to edit the file entry in zenoh-flow-examples/graphs/car-pipeline-multi-runtime.yaml to match the absolute path of your video file.
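The video entry might look as follows. The key name and path are placeholders inferred from the text above, so check the graph file for the actual key:

```yaml
# Placeholder; replace with the absolute path of your video file.
file: /home/user/videos/I1.mp4
```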
Multiple runtimes
In a first machine, run:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/car-pipeline-multi-runtime.yaml --runtime gigot
In a second machine, run:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/car-pipeline-multi-runtime.yaml --runtime cuda
In a third machine, run:
./target/release/runtime --graph-file zenoh-flow-examples/graphs/car-pipeline-multi-runtime.yaml --runtime macbook