This is the official ROS driver for Zivid 3D cameras.
Note: The current version is 0.9.0. The API and behavior are still subject to change; the API will be stable from release 1.0.0.
Contents: Installation | Getting Started | Launching | Services | Topics | Configuration | Samples | FAQ
This driver supports Ubuntu 16.04 with ROS Kinetic and Ubuntu 18.04 with ROS Melodic. Follow the official ROS Wiki instructions to install ROS Kinetic Kame (for Ubuntu 16.04) or ROS Melodic Morenia (for Ubuntu 18.04).
Also install catkin and git:
sudo apt-get update
sudo apt-get install -y python-catkin-tools git
To use the ROS driver you need to download and install the "Zivid Core" package. Follow this guide to install Zivid for your version of Ubuntu. The "Zivid Studio" and "Zivid Tools" packages are not required by the ROS driver, but can be useful for verifying that your system has been set up correctly and that the camera is detected.
An OpenCL 1.2 compatible GPU with an OpenCL driver is required by the Zivid SDK. Follow this guide to install OpenCL drivers for your system.
A C++17 compiler is required.
Ubuntu 16.04:
sudo apt-get install -y software-properties-common
sudo add-apt-repository -y ppa:ubuntu-toolchain-r/test
sudo apt-get update
sudo apt-get install -y g++-8
Ubuntu 18.04:
sudo apt-get install -y g++
First, load the setup.bash script into the current session.
Ubuntu 16.04:
source /opt/ros/kinetic/setup.bash
Ubuntu 18.04:
source /opt/ros/melodic/setup.bash
Create the workspace and src directory:
mkdir -p ~/catkin_ws/src
Clone the Zivid ROS project into the src directory:
cd ~/catkin_ws/src
git clone https://github.com/zivid/zivid-ros.git
Install dependencies:
cd ~/catkin_ws
sudo apt-get update
rosdep update
rosdep install --from-paths src --ignore-src -r -y
Finally, build the driver.
Ubuntu 16.04:
catkin build -DCMAKE_CXX_COMPILER=/usr/bin/g++-8
Ubuntu 18.04:
catkin build
Connect the Zivid camera to a USB3 port on your PC. You can use the ZividListCameras command-line tool available in the "Zivid Tools" package to confirm that your system has been configured correctly and that the camera is discovered by your PC. You can also open Zivid Studio and connect to the camera. Close Zivid Studio before continuing with the rest of this guide.
Launch sample_capture_assistant.py to test that everything is working:
cd ~/catkin_ws && source devel/setup.bash
roslaunch zivid_samples sample.launch type:=sample_capture_assistant.py
This will start the zivid_camera driver node, the sample_capture_assistant.py node, as well as rviz to visualize the point cloud and the 2D color and depth images, and rqt_reconfigure to view or change capture settings.
The sample_capture_assistant.py node will first call the capture_assistant/suggest_settings service to find suggested capture settings for your particular scene, then call the capture service to capture using those settings. If everything is working, the point cloud, color image and depth image should be visible in rviz.
You can adjust the maximum capture time by changing the variable max_capture_time in sample_capture_assistant.py and re-launching the sample.
A more detailed description of the zivid_camera driver follows below. For sample code in C++ and Python, see the Samples section.
It is required to specify a namespace when starting the driver. All the services, topics and configurations will be pushed into this namespace.
To launch the driver, first start roscore in a separate terminal window:
roscore
In another terminal run:
cd ~/catkin_ws && source devel/setup.bash
Then, launch the driver either as a node or a nodelet. To launch as a node:
ROS_NAMESPACE=zivid_camera rosrun zivid_camera zivid_camera_node
To launch as a nodelet:
ROS_NAMESPACE=zivid_camera rosrun nodelet nodelet standalone zivid_camera/nodelet
The following parameters can be specified when starting the driver. All the parameters are optional and typically do not need to be set.
For example, to launch the driver with frame_id specified, append _frame_id:=zivid to the rosrun command:
ROS_NAMESPACE=zivid_camera rosrun zivid_camera zivid_camera_node _frame_id:=zivid
file_camera_path (string, default: "")
Specify the path to a file camera to use instead of a real Zivid camera. This can be used to develop without access to hardware. The file camera returns the same point cloud for every capture. Click here to download a file camera.
frame_id (string, default: "zivid_optical_frame")
Specify the frame_id used for all published images and point clouds.
num_capture_frames (int, default: 10)
Specify the number of dynamic_reconfigure capture/frame_<n> nodes that are created. This number defines the maximum number of frames that can be part of a 3D HDR capture. All capture/frame_<n> nodes are disabled (enabled=false) by default (see section Configuration). If you need to perform a 3D HDR capture with more than 10 enabled frames, increase this number; otherwise it can be left at the default. We do not recommend lowering this setting, especially if you are using the capture_assistant/suggest_settings service.
serial_number (string, default: "")
Specify the serial number of the Zivid camera to use. Important: When passing this value via the command line or rosparam, the serial number must be prefixed with a colon (:12345). This parameter is optional; by default the driver connects to the first available camera.
capture_assistant/suggest_settings (zivid_camera/CaptureAssistantSuggestSettings.srv)
Invoke this service to analyze your scene and find suggested settings based on your particular scene, camera distance, ambient lighting conditions, etc. The suggested settings are configured on this node and accessible via dynamic_reconfigure, see section Configuration. When this service has returned, you can invoke the capture service to trigger a 3D capture using these suggested settings.
This service has two parameters:
max_capture_time (duration):
Specify the maximum capture time for the settings suggested by the Capture Assistant. A longer capture time may be required to get good data for more challenging scenes. The minimum value is 0.2 seconds and the maximum value is 10.0 seconds.
ambient_light_frequency (uint8):
Possible values are AMBIENT_LIGHT_FREQUENCY_NONE, AMBIENT_LIGHT_FREQUENCY_50HZ and AMBIENT_LIGHT_FREQUENCY_60HZ. This parameter can be used to ensure that the suggested settings are compatible with the frequency of the ambient light in the scene. If ambient light is unproblematic, use AMBIENT_LIGHT_FREQUENCY_NONE for optimal performance. The default is AMBIENT_LIGHT_FREQUENCY_NONE.
See Sample Capture Assistant for code example.
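As an illustration, a call to this service from Python might look like the following minimal sketch. It is not the official sample; it assumes the driver runs under ROS_NAMESPACE=zivid_camera and that the request constants are exposed under the names listed above.

```python
#!/usr/bin/env python
# Minimal sketch (not the official sample): request suggested settings.
# Assumes ROS_NAMESPACE=zivid_camera for the driver node.
import rospy
from zivid_camera.srv import (
    CaptureAssistantSuggestSettings,
    CaptureAssistantSuggestSettingsRequest,
)

rospy.init_node("suggest_settings_sketch")

service = "/zivid_camera/capture_assistant/suggest_settings"
rospy.wait_for_service(service)
suggest = rospy.ServiceProxy(service, CaptureAssistantSuggestSettings)

# Ask for settings that keep the total capture time under 1.2 seconds.
suggest(
    max_capture_time=rospy.Duration.from_sec(1.2),
    ambient_light_frequency=CaptureAssistantSuggestSettingsRequest.AMBIENT_LIGHT_FREQUENCY_NONE,
)
# The suggested settings are now configured on the driver; a subsequent
# call to the capture service will use them.
```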
capture (zivid_camera/Capture.srv)
Invoke this service to trigger a 3D capture. See section Configuration for how to configure the 3D capture settings. The resulting point cloud is published on topic points, the color image is published on topic color/image_color, and the depth image is published on topic depth/image_raw. Camera calibration is published on topics color/camera_info and depth/camera_info.
See Sample Capture for code example.
capture_2d (zivid_camera/Capture2D.srv)
Invoke this service to trigger a 2D capture. See section Configuration for how to configure the 2D capture settings. The resulting 2D image is published on topic color/image_color. Note: a 2D RGB image is also published as part of a 3D capture, see capture.
See Sample Capture 2D for code example.
camera_info/model_name (zivid_camera/CameraInfoModelName.srv)
Returns the camera's model name.
camera_info/serial_number (zivid_camera/CameraInfoSerialNumber.srv)
Returns the camera's serial number.
is_connected (zivid_camera/IsConnected.srv)
Returns whether the camera is currently in the Connected state (from the perspective of the ROS driver). The connection status is updated by the driver every 10 seconds and before each capture service call. If the camera is not in the Connected state, the driver will attempt to re-connect when it detects that the camera is available. This can happen if the camera is power-cycled or the USB cable is unplugged and then replugged.
color/camera_info
Camera calibration and metadata.
color/image_color
Color/RGB image. For 3D captures (capture service) the image is encoded as "rgb8". For 2D captures (capture_2d service) the image is encoded as "rgba8", where the alpha channel is always 255.
depth/camera_info
Camera calibration and metadata.
depth/image_raw
Depth image. Each pixel contains the z-value (along the camera Z axis) in meters. The image is encoded as 32-bit float. Pixels where the z-value is missing are NaN.
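For example, once a depth image has been converted to a NumPy array (e.g. via cv_bridge, not shown here), the missing pixels can be masked out. A small self-contained sketch with synthetic data:

```python
import numpy as np

# Synthetic stand-in for a depth/image_raw frame: 32-bit float z-values
# in meters, with NaN where the camera found no data.
depth = np.array([[0.50, np.nan],
                  [0.70, 0.90]], dtype=np.float32)

valid = ~np.isnan(depth)             # True where a z-value was measured
num_valid = int(valid.sum())         # -> 3 valid pixels
mean_z = float(depth[valid].mean())  # mean depth over valid pixels, in meters
```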
points
Point cloud data. Each time capture is invoked, the resulting point cloud is published on this topic. The included point fields are x, y, z (in meters), c (contrast value), and r, g, b (colors). The output is in the camera's optical frame, where x is right, y is down and z is forward.
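A hedged sketch of consuming this topic from Python with the standard sensor_msgs.point_cloud2 helpers (the topic path assumes the driver runs under ROS_NAMESPACE=zivid_camera):

```python
#!/usr/bin/env python
# Illustrative sketch: subscribe to the points topic and iterate over the
# x, y, z fields. Assumes ROS_NAMESPACE=zivid_camera for the driver node.
import rospy
import sensor_msgs.point_cloud2 as pc2
from sensor_msgs.msg import PointCloud2

def on_points(msg):
    # skip_nans drops points where the camera found no data
    points = list(pc2.read_points(msg, field_names=("x", "y", "z"), skip_nans=True))
    rospy.loginfo("Received point cloud with %d valid points", len(points))

rospy.init_node("points_subscriber_sketch")
rospy.Subscriber("/zivid_camera/points", PointCloud2, on_points)
rospy.spin()
```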
The zivid_camera node supports both single-capture (2D and 3D) and HDR-capture (3D). 3D HDR-capture works by taking several individual captures (called frames) with different settings (for example different exposure times) and combining them into one high-quality point cloud. For more information about HDR capture, visit our knowledge base.
The capture settings available in the zivid_camera node match the settings in the Zivid API. To become more familiar with the different settings and what they do, see the API reference for the Settings and Settings2D classes, or use Zivid Studio.
The settings can be viewed and configured using dynamic_reconfigure. Use rqt_reconfigure to view/change the settings using a GUI:
rosrun rqt_reconfigure rqt_reconfigure
If you want to experiment with the capture settings, launch the Sample Capture sample, which will capture in a loop forever, and use rqt_reconfigure to adjust the settings.
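Settings can also be changed programmatically with the dynamic_reconfigure Python client. The sketch below is illustrative only; it assumes the driver runs under ROS_NAMESPACE=zivid_camera and uses parameter names from the tables in this section.

```python
#!/usr/bin/env python
# Illustrative sketch: configure capture settings without rqt_reconfigure.
# Assumes ROS_NAMESPACE=zivid_camera for the driver node.
import rospy
import dynamic_reconfigure.client

rospy.init_node("configure_settings_sketch")

# Enable frame 0 and set its exposure time (in microseconds) and iris.
frame_0 = dynamic_reconfigure.client.Client("/zivid_camera/capture/frame_0")
frame_0.update_configuration({"enabled": True, "exposure_time": 8333, "iris": 21})

# Enabling a second frame with different settings turns the next
# capture into a 3D HDR capture.
frame_1 = dynamic_reconfigure.client.Client("/zivid_camera/capture/frame_1")
frame_1.update_configuration({"enabled": True, "iris": 35})
```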
The available capture settings are organized into a hierarchy of configuration nodes. 3D settings are available under the /capture namespace, while 2D settings are available under /capture_2d.
    /capture
        /frame_0
            ...
        /frame_1
            ...
        ...
        /frame_9
            ...
        /general
            ...
    /capture_2d
        /frame_0
Note: The Capture Assistant feature can be used to find optimized 3D capture settings for your scene. Refer to service capture_assistant/suggest_settings.
Note for C++ users: The min, max and default values of the settings can change depending on which Zivid camera model you are using. Therefore you should not use the static __getMin()__, __getMax()__ and __getDefault()__ methods of the auto-generated C++ config classes (zivid_camera::CaptureGeneralConfig, zivid_camera::CaptureFrameConfig and zivid_camera::Capture2DFrameConfig). Instead, query the server for the default values using dynamic_reconfigure::Client<T>::getDefaultConfiguration(). See the C++ samples for how to do this.
capture/frame_<n>/ contains settings for an individual frame. By default <n> can be 0 to 9, for a total of 10 frames. The total number of frames can be configured using the launch parameter num_capture_frames (see section Launch Parameters above).
capture/frame_<n>/enabled controls if frame <n> will be included when the capture service is invoked. If only one frame is enabled the capture service performs a 3D single-capture. If more than one frame is enabled the capture service performs a 3D HDR-capture. By default enabled is false. In order to capture a point cloud at least one frame needs to be enabled.
Name | Type | Zivid API Setting | Note
---|---|---|---
capture/frame_<n>/bidirectional | bool | Zivid::Settings::Bidirectional |
capture/frame_<n>/brightness | double | Zivid::Settings::Brightness |
capture/frame_<n>/enabled | bool | |
capture/frame_<n>/exposure_time | int | Zivid::Settings::ExposureTime | Specified in microseconds (µs)
capture/frame_<n>/gain | double | Zivid::Settings::Gain |
capture/frame_<n>/iris | int | Zivid::Settings::Iris |
capture/general contains settings that apply to all frames in a 3D capture.
Name | Type | Zivid API Setting
---|---|---
capture/general/blue_balance | double | Zivid::Settings::BlueBalance
capture/general/filters_contrast_enabled | bool | Zivid::Settings::Filters::Contrast::Enabled
capture/general/filters_contrast_threshold | double | Zivid::Settings::Filters::Contrast::Threshold
capture/general/filters_gaussian_enabled | bool | Zivid::Settings::Filters::Gaussian::Enabled
capture/general/filters_gaussian_sigma | double | Zivid::Settings::Filters::Gaussian::Sigma
capture/general/filters_outlier_enabled | bool | Zivid::Settings::Filters::Outlier::Enabled
capture/general/filters_outlier_threshold | double | Zivid::Settings::Filters::Outlier::Threshold
capture/general/filters_reflection_enabled | bool | Zivid::Settings::Filters::Reflection::Enabled
capture/general/filters_saturated_enabled | bool | Zivid::Settings::Filters::Saturated::Enabled
capture/general/red_balance | double | Zivid::Settings::RedBalance
2D settings are available under capture_2d/frame_0/. To trigger a 2D capture, invoke the capture_2d service. Note that capture_2d/frame_0/enabled is false by default, and must be set to true before calling the capture_2d service; otherwise the service will return an error.
Name | Type | Zivid API Setting | Note
---|---|---|---
capture_2d/frame_0/brightness | double | Zivid::Settings2D::Brightness |
capture_2d/frame_0/enabled | bool | |
capture_2d/frame_0/exposure_time | int | Zivid::Settings2D::ExposureTime | Specified in microseconds (µs)
capture_2d/frame_0/gain | double | Zivid::Settings2D::Gain |
capture_2d/frame_0/iris | int | Zivid::Settings2D::Iris |
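Putting this together, a hedged Python sketch of a 2D capture follows. It assumes the 2D capture service type is named zivid_camera/Capture2D and that the driver runs under ROS_NAMESPACE=zivid_camera.

```python
#!/usr/bin/env python
# Illustrative sketch: enable the 2D frame, then trigger a 2D capture.
# Assumes ROS_NAMESPACE=zivid_camera and the Capture2D srv type name.
import rospy
import dynamic_reconfigure.client
from zivid_camera.srv import Capture2D  # assumed srv type name

rospy.init_node("capture_2d_sketch")

# capture_2d/frame_0/enabled is false by default; the service returns an
# error unless it is set to true first.
frame_0 = dynamic_reconfigure.client.Client("/zivid_camera/capture_2d/frame_0")
frame_0.update_configuration({"enabled": True})

rospy.wait_for_service("/zivid_camera/capture_2d")
capture_2d = rospy.ServiceProxy("/zivid_camera/capture_2d", Capture2D)
capture_2d()  # the resulting image is published on color/image_color
```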
The zivid_samples package contains samples in C++ and Python that demonstrate how to use the Zivid ROS driver. These samples can be used as a starting point for your project.
Sample Capture Assistant
This sample shows how to use the Capture Assistant to capture with suggested settings for your particular scene. It first calls the capture_assistant/suggest_settings service to get the suggested settings, then calls the capture service to invoke a 3D capture using those settings.
Using roslaunch (also launches roscore, zivid_camera, rviz and rqt_reconfigure):
roslaunch zivid_samples sample.launch type:=sample_capture_assistant_cpp
roslaunch zivid_samples sample.launch type:=sample_capture_assistant.py
Using rosrun (when roscore and zivid_camera are already running):
rosrun zivid_samples sample_capture_assistant_cpp
rosrun zivid_samples sample_capture_assistant.py
Sample Capture
This sample performs single 3D captures repeatedly. It shows how to configure the capture settings, how to subscribe to the points topic, and how to invoke the capture service.
Using roslaunch (also launches roscore, zivid_camera, rviz and rqt_reconfigure):
roslaunch zivid_samples sample.launch type:=sample_capture_cpp
roslaunch zivid_samples sample.launch type:=sample_capture.py
Using rosrun (when roscore and zivid_camera are already running):
rosrun zivid_samples sample_capture_cpp
rosrun zivid_samples sample_capture.py
Sample Capture 2D
This sample performs 2D captures repeatedly. It shows how to configure the 2D capture settings, how to subscribe to the color/image_color topic, and how to invoke the capture_2d service.
Using roslaunch (also launches roscore, zivid_camera, rviz and rqt_reconfigure):
roslaunch zivid_samples sample.launch type:=sample_capture_2d_cpp
roslaunch zivid_samples sample.launch type:=sample_capture_2d.py
Using rosrun (when roscore and zivid_camera are already running):
rosrun zivid_samples sample_capture_2d_cpp
rosrun zivid_samples sample_capture_2d.py
zivid_camera_with_settings.launch starts the zivid_camera node with 3D capture settings set to a 3-frame HDR. Modify the settings in this .launch file to match your scene.
roslaunch zivid_samples zivid_camera_with_settings.launch
To start the driver and visualize its output in rviz:
ROS_NAMESPACE=zivid_camera roslaunch zivid_camera visualize.launch
You can use multiple Zivid cameras simultaneously by starting one node per camera and specifying unique namespaces per node:
ROS_NAMESPACE=camera1 rosrun zivid_camera zivid_camera_node
ROS_NAMESPACE=camera2 rosrun zivid_camera zivid_camera_node
By default the zivid_camera node will connect to the first available/unused camera. We recommend that you first start the first node, wait for it to be ready (for example, by waiting for the capture service to be available), then start the second node. This avoids any race conditions where both nodes may try to connect to the same camera at the same time.
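The recommended sequencing can be scripted. This hedged sketch simply blocks until the first camera's capture service is available before you start the second node (the service path assumes the camera1 namespace used above):

```python
#!/usr/bin/env python
# Illustrative sketch: wait for the first driver node to become ready
# before launching the second one (launching itself is left to
# roslaunch/rosrun).
import rospy

rospy.init_node("wait_for_camera1")
rospy.wait_for_service("/camera1/capture")  # blocks until camera1 is ready
rospy.loginfo("camera1 is ready; safe to start the camera2 node now")
```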
This project comes with a set of unit and module tests to verify the provided functionality. To run the tests locally, first download and install the file camera used for testing:
wget -q https://www.zivid.com/software/ZividSampleData.zip
unzip ./ZividSampleData.zip
rm ./ZividSampleData.zip
sudo mkdir -p /usr/share/Zivid/data/
sudo cp ./MiscObjects.zdf /usr/share/Zivid/data/
rm ./MiscObjects.zdf
Then run the tests:
cd ~/catkin_ws && source devel/setup.bash
catkin run_tests && catkin_test_results ~/catkin_ws
The tests can also be run via Docker. See the Azure Pipelines configuration file for details.
The node logs extra information at log level debug, including the settings used when capturing. Enable debug logging to troubleshoot issues.
rosconsole set /<namespace>/zivid_camera ros.zivid_camera debug
For example, if ROS_NAMESPACE=zivid_camera:
rosconsole set /zivid_camera/zivid_camera ros.zivid_camera debug
To build the driver with compiler warnings enabled:
catkin build -DCOMPILER_WARNINGS=ON
This project is licensed under the BSD 3-clause license; see the LICENSE file for details.
Please report any issues or feature requests related to the ROS driver in the issue tracker. Visit Zivid Knowledge Base for general help on using Zivid 3D cameras. If you cannot find a solution to your issue, please contact support@zivid.com.
This FTP (Focused Technical Project) has received funding from the European Union's Horizon 2020 research and innovation programme under the project ROSIN with the grant agreement No 732287. For more information, visit rosin-project.eu.