Setting `proc_mask` once bricks the driver
Describe the bug
I launch the driver node like this:

`ros2 launch ouster_ros sensor.launch.xml sensor_hostname:=os-122307000738.local lidar_mode:=1024x20 viz:=false udp_profile_lidar:=RNG19_RFL8_SIG16_NIR16`

and get some data on the `/ouster/points` topic, albeit at a very low frequency (a different issue…). If, however, I stop the driver, add the `proc_mask` argument, and restart with e.g. `proc_mask:=PCL`, no point clouds are generated. If I then restart the driver with the default value `proc_mask:='IMG|PCL|IMU|SCAN'`, still nothing: the `/ouster/points` topic does not get published. It seems none of the publishers come up, yet nothing alarming is logged. Although the log says a reset service was created, `ros2 service call /ouster/reset std_srvs/srv/Empty` just waits for the service to become available.
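For completeness, a quick way to sanity-check this state (a sketch; it assumes the node name `/ouster/os_driver` from the log excerpt below, and that `proc_mask` is exposed as a node parameter under that name):

```bash
# List what the driver node actually advertises (publishers, services)
ros2 node info /ouster/os_driver

# Watch for messages on the points topic; in the broken state nothing arrives
ros2 topic hz /ouster/points

# Inspect the proc_mask value the node picked up (assumes a parameter of this name)
ros2 param get /ouster/os_driver proc_mask
```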
…
[os_driver-1] [INFO] [1727868077.595491890] [ouster.os_driver]: auto start requested
[os_driver-1] [INFO] [1727868078.595938256] [ouster.os_driver]: auto start initiated
[os_driver-1] [INFO] [1727868078.745839196] [ouster.os_driver]: Retrived sensor active config
[os_driver-1] [INFO] [1727868078.745948114] [ouster.os_driver]: Will send UDP data to 169.254.204.196
[os_driver-1] [INFO] [1727868078.745967214] [ouster.os_driver]: Contacting sensor os-122307000738.local ...
[os_driver-1] [INFO] [1727868078.841318538] [ouster.os_driver]: Sensor os-122307000738.local configured successfully
[os_driver-1] [INFO] [1727868078.841431937] [ouster.os_driver]: Starting sensor os-122307000738.local initialization... Using ports: 7502/7503
[os_driver-1] [2024-10-02 13:21:18.841] [ouster::sensor] [info] initializing sensor client: os-122307000738.local expecting lidar port/imu port: 7502/7503
[os_driver-1] [2024-10-02 13:21:19.033] [ouster::sensor] [info] parsing non-legacy metadata format
[os_driver-1] [INFO] [1727868079.036441063] [ouster.os_driver]: No metadata file was specified, using: os-122307000738-metadata.json
[os_driver-1] [INFO] [1727868079.036761554] [ouster.os_driver]: Wrote sensor metadata to os-122307000738-metadata.json
[os_driver-1] [INFO] [1727868079.036803072] [ouster.os_driver]: ouster client version: 0.11.1+483206f-release
[os_driver-1] product: OS-1-128, sn: 122307000738, firmware rev: v3.0.1
[os_driver-1] lidar mode: 1024x20, lidar udp profile: RNG19_RFL8_SIG16_NIR16
[os_driver-1] [INFO] [1727868079.037431122] [ouster.os_driver]: reset service created
[os_driver-1] [INFO] [1727868079.037862657] [ouster.os_driver]: get_metadata service created
[os_driver-1] [INFO] [1727868079.038084166] [ouster.os_driver]: get_config service created
[os_driver-1] [INFO] [1727868079.038302050] [ouster.os_driver]: set_config service created
A computer reboot solved this last time, but I would like to know what the problem is.
To Reproduce
See above; a condensed command sequence is sketched below.
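(The following just restates the commands from the description in order; stop the driver with Ctrl-C between steps.)

```bash
# 1. Launch without proc_mask -- points arrive, but slowly
ros2 launch ouster_ros sensor.launch.xml sensor_hostname:=os-122307000738.local \
    lidar_mode:=1024x20 viz:=false udp_profile_lidar:=RNG19_RFL8_SIG16_NIR16

# 2. Relaunch with proc_mask restricted to point clouds -- no points anymore
ros2 launch ouster_ros sensor.launch.xml sensor_hostname:=os-122307000738.local \
    lidar_mode:=1024x20 viz:=false udp_profile_lidar:=RNG19_RFL8_SIG16_NIR16 proc_mask:=PCL

# 3. Relaunch with the default mask -- still no points until the machine is rebooted
ros2 launch ouster_ros sensor.launch.xml sensor_hostname:=os-122307000738.local \
    lidar_mode:=1024x20 viz:=false udp_profile_lidar:=RNG19_RFL8_SIG16_NIR16 proc_mask:='IMG|PCL|IMU|SCAN'
```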
Platform (please complete the following information):
- Ouster Sensor? OS1-128 rev7
- Ouster Firmware Version? v3.0.1
- ROS version/distro? Humble
- Operating System? Ubuntu MATE 22.04
- Machine Architecture? x64
- git commit hash (if not the latest): 5dd8555
It seems I could work around this by killing the shell/tmux session the driver was running in, opening a new one, and sourcing the ROS and workspace setup files again; a sketch follows.
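(A minimal sketch of that workaround; the tmux session name and the workspace path `~/ros2_ws` are placeholders.)

```bash
# Kill the stale tmux session the driver was running in (session name is hypothetical)
tmux kill-session -t ouster

# In a fresh shell, re-source the ROS and workspace setup files
source /opt/ros/humble/setup.bash
source ~/ros2_ws/install/setup.bash

# Relaunch the driver as before
ros2 launch ouster_ros sensor.launch.xml sensor_hostname:=os-122307000738.local
```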
@themightyoarfish, as I noted earlier, I couldn't reproduce the problem. Could you confirm whether it persists with 0.13.1, and maybe give more hints that would help me reproduce the issue (if it still exists)? Thanks.
I will try it again under controlled circumstances next week.
Could this issue be related? (it doesn't seem like it)
Can confirm that with the current `ros2` branch head (fc911d1), this no longer seems to occur.
Awesome, thanks for the feedback.