micro-ROS/micro-ROS-Agent

The micro_ros_agent does not relay topic messages

wuhanstudio opened this issue · 22 comments

Describe the bug

Hi, I tried the micro-ros_subscriber_twist example using UDP on an STM32F407 (Ethernet), but the micro_ros_agent only relays messages to the MCU if I follow this order:

[Normal]
Step 1: Start the agent
Step 2: Start the program on MCU
Step 3: Publish the twist message (keep publishing)

Twist messages received.

If I swap step 2 and step 3, the MCU does not receive the messages:

[BUG]
Step 1: Start the agent
Step 2: Publish the twist message (keep publishing)
Step 3: Start the program on MCU

No Twist messages.

I'm wondering whether this is expected behavior or a bug.

Besides, the micro-ros_publisher.ino example cannot receive the int32 message in either scenario.
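For context, the subscriber on the MCU side follows the usual rclc pattern. Below is only a rough sketch (node and topic names are placeholders, error handling is skipped, and the platform-specific UDP transport setup is omitted):

#include <rcl/rcl.h>
#include <rclc/rclc.h>
#include <rclc/executor.h>
#include <geometry_msgs/msg/twist.h>

static rcl_subscription_t subscriber;
static geometry_msgs__msg__Twist twist_msg;

// Called by the executor whenever a new Twist sample is delivered.
static void on_twist(const void * msgin)
{
  const geometry_msgs__msg__Twist * msg = (const geometry_msgs__msg__Twist *)msgin;
  (void)msg;  // use msg->linear.x, msg->angular.z, ...
}

void micro_ros_sub_twist(void)
{
  rcl_allocator_t allocator = rcl_get_default_allocator();
  rclc_support_t support;
  rcl_node_t node;
  rclc_executor_t executor;

  // The UDP (or UART) transport is assumed to be registered before this point.
  rclc_support_init(&support, 0, NULL, &allocator);
  rclc_node_init_default(&node, "micro_ros_rtt_sub_twist_node", "", &support);
  rclc_subscription_init_default(
    &subscriber, &node,
    ROSIDL_GET_MSG_TYPE_SUPPORT(geometry_msgs, msg, Twist),
    "cmd_vel");

  rclc_executor_init(&executor, &support.context, 1, &allocator);
  rclc_executor_add_subscription(&executor, &subscriber, &twist_msg, &on_twist, ON_NEW_DATA);

  while (1) {
    // The callback above only fires while the executor is spinning.
    rclc_executor_spin_some(&executor, RCL_MS_TO_NS(100));
  }
}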

To Reproduce

Scenario 1 (BUG):

# Step 1: Start the agent
$ docker run -it -p 9999:9999/udp --privileged microros/micro-ros-agent:galactic udp4 -p 9999 -v 6

# Step 2: Publish the twist message
$ docker container exec -it sleepy_rubin /bin/bash
# Inside the container
$ ros2 run teleop_twist_keyboard teleop_twist_keyboard

# Step 3: Start the program on MCU
$ micro_ros_sub twist

# No twist message on MCU

Scenario 2 (Normal):

# Step 1: Start the agent
$ docker run -it -p 9999:9999/udp --privileged microros/micro-ros-agent:galactic udp4 -p 9999 -v 6

# Step 2: Start the program on MCU
$ micro_ros_sub twist

# Step 3: Publish the twist message
$ docker container exec -it sleepy_rubin /bin/bash
# Inside the container
$ ros2 run teleop_twist_keyboard teleop_twist_keyboard

# Normal twist message on MCU

Environment

  • ROS Version: Galactic
  • OS Version: Ubuntu Docker (micro-ros-agent:galactic)
  • MCU: STM32L475 (Wi-Fi, UDP)

This problem only occurs when using the UDP transport (UART is normal).

Logs
The micro_ros_agent did not relay sub_int32 messages to the MCU:

C:\Users\hw630>docker run -it -p 9999:9999/udp --privileged microros/micro-ros-agent:galactic udp4 -p 9999 -v 6
[1664030807.997278] info     | UDPv4AgentLinux.cpp | init                     | running...             | port: 9999
[1664030807.997471] info     | Root.cpp           | set_verbose_level        | logger setup           | verbose_level: 6
[1664030928.787084] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x00000000, len: 24, data:
0000: 80 00 00 00 00 01 10 00 58 52 43 45 01 00 01 0F 7C 73 2F 81 81 00 FC 01
[1664030928.787660] info     | Root.cpp           | create_client            | create                 | client_key: 0x7C732F81, session_id: 0x81
[1664030928.787799] info     | SessionManager.hpp | establish_session        | session established    | client_key: 0x7C732F81, address: 172.17.0.1:44710
[1664030928.788065] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x7C732F81, len: 19, data:
0000: 81 00 00 00 04 01 0B 00 00 00 58 52 43 45 01 00 01 0F 00
[1664030928.797360] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x7C732F81, len: 60, data:
0000: 81 80 00 00 01 07 34 00 00 0A 00 01 01 03 00 00 25 00 00 00 00 01 01 20 1D 00 00 00 6D 69 63 72
0020: 6F 5F 72 6F 73 5F 72 74 74 5F 73 75 62 5F 69 6E 74 33 32 5F 6E 6F 64 65 00 00 00 00
[1664030928.819027] info     | ProxyClient.cpp    | create_participant       | participant created    | client_key: 0x7C732F81, participant_id: 0x000(1)
[1664030928.819248] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x7C732F81, len: 14, data:
0000: 81 80 00 00 05 01 06 00 00 0A 00 01 00 00
[1664030928.819296] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x7C732F81, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 01 00 00 00 80
[1664030928.826875] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x7C732F81, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 01 00 00 00 80
[1664030928.830280] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x7C732F81, len: 92, data:
0000: 81 80 01 00 01 07 52 00 00 0B 00 02 02 03 00 00 44 00 00 00 1C 00 00 00 72 74 2F 6D 69 63 72 6F
0020: 5F 72 6F 73 5F 72 74 74 5F 73 75 62 73 63 72 69 62 65 72 00 00 01 00 00 1C 00 00 00 73 74 64 5F
0040: 6D 73 67 73 3A 3A 6D 73 67 3A 3A 64 64 73 5F 3A 3A 49 6E 74 33 32 5F 00 00 01 00 00
[1664030928.830632] info     | ProxyClient.cpp    | create_topic             | topic created          | client_key: 0x7C732F81, topic_id: 0x000(2), participant_id: 0x000(1)
[1664030928.830892] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x7C732F81, len: 14, data:
0000: 81 80 01 00 05 01 06 00 00 0B 00 02 00 00
[1664030928.830954] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x7C732F81, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 02 00 00 00 80
[1664030928.837399] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x7C732F81, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 02 00 00 00 80
[1664030928.837614] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x7C732F81, len: 24, data:
0000: 81 80 02 00 01 07 10 00 00 0C 00 04 04 03 00 00 02 00 00 00 00 00 00 01
[1664030928.838214] info     | ProxyClient.cpp    | create_subscriber        | subscriber created     | client_key: 0x7C732F81, subscriber_id: 0x000(4), participant_id: 0x000(1)
[1664030928.838680] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x7C732F81, len: 14, data:
0000: 81 80 02 00 05 01 06 00 00 0C 00 04 00 00
[1664030928.838854] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x7C732F81, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 03 00 00 00 80
[1664030928.846771] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x7C732F81, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 03 00 00 00 80
[1664030928.847067] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x7C732F81, len: 40, data:
0000: 81 80 03 00 01 07 1D 00 00 0D 00 06 06 03 00 00 0F 00 00 00 00 02 01 08 03 00 01 20 0A 00 00 00
0020: 00 00 00 00 04 00 00 00
[1664030928.849254] info     | ProxyClient.cpp    | create_datareader        | datareader created     | client_key: 0x7C732F81, datareader_id: 0x000(6), subscriber_id: 0x000(4)
[1664030928.849478] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x7C732F81, len: 14, data:
0000: 81 80 03 00 05 01 06 00 00 0D 00 06 00 00
[1664030928.849522] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x7C732F81, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 04 00 00 00 80
[1664030928.856036] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x7C732F81, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 04 00 00 00 80
[1664030928.962182] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x7C732F81, len: 13, data:
0000: 81 00 00 00 0B 01 05 00 00 00 03 00 80
[1664030928.962785] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x7C732F81, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 04 00 00 00 80

We will take a look

Thanks again for your help.

I tested micro-ROS on RT-Thread on Cortex-M3, M4, and M7 via UART, and it was very stable (pub, sub, and service examples).

Over UDP, however, the MCU could publish topics and host services, but failed to receive messages on subscribed topics (/int32). After running several tests, I noticed that the micro_ros_agent did not relay the subscribed messages to the MCU.

Once the UDP subscription example works, we are very close to full micro-ROS support on RT-Thread.

Best regards,
Han

Are you using Fast DDS as the middleware in ROS 2 Galactic? micro-ROS is not compatible with Cyclone DDS (the default middleware in Galactic).

https://github.com/ros2/rmw_fastrtps#getting-started

I just checked the RMW:

  • The MCU used the same RMW (rmw_microxrcedds) as Arduino (static library recompiled with gcc-10).
  • The PC used the docker container, and it used rmw_fastrtps_cpp.
root@1855c4f4e0b0:/uros_ws# echo $RMW_IMPLEMENTATION
rmw_fastrtps_cpp
root@1855c4f4e0b0:/uros_ws#

The implementation of UDP transport on RT-Thread is OK because we can publish topics and provide services.
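For reference, the RT-Thread port plugs its UDP socket code into micro-ROS through the custom transport API. This is only a minimal sketch of the registration, assuming the rmw_microros custom transport interface; the rt_udp_* callbacks and the agent address are hypothetical stand-ins for the actual RT-Thread implementation:

#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <rmw_microros/rmw_microros.h>

// Hypothetical callbacks implemented on top of the RT-Thread socket API.
bool   rt_udp_open(struct uxrCustomTransport * transport);
bool   rt_udp_close(struct uxrCustomTransport * transport);
size_t rt_udp_write(struct uxrCustomTransport * transport,
                    const uint8_t * buf, size_t len, uint8_t * err);
size_t rt_udp_read(struct uxrCustomTransport * transport,
                   uint8_t * buf, size_t len, int timeout, uint8_t * err);

void micro_ros_setup_udp_transport(void)
{
  static char agent_addr[] = "192.168.1.100:9999";  // placeholder agent address

  // framing = false: UDP is packet-oriented, so no stream framing is needed.
  rmw_uros_set_custom_transport(
    false,
    (void *)agent_addr,
    rt_udp_open,
    rt_udp_close,
    rt_udp_write,
    rt_udp_read);
}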

We can see the node (ros2 node list) and the topic (ros2 topic list) after the MCU subscribes to the topic, but after that the micro_ros_agent prints no further output (with -v 6 turned on), even though the PC keeps publishing to the topic (ros2 topic pub xxx).

Occasionally, the micro_ros_agent did print further output, and then the MCU could receive the subscribed message. Thus, I assume the problem is that the agent fails to relay topics (via UDP), but I don't know why the agent sometimes does not relay messages.

Are ROS 2 and the micro-ROS Agent running in the same computer?

Are ROS 2 and the micro-ROS Agent running in the same computer?

I did not run ROS 2 natively on the computer; the topics were published from inside the micro-ros-agent container as well.

Can you check if running both the agent and ROS 2 command in the very same machine (or inside the same docker) has the same problem?

Can you check if running both the agent and ROS 2 command in the very same machine (or inside the same docker) has the same problem?

Yes, the problem occurred when I used the same docker container for the micro-ros-agent and the topic publication.

image

And we had the same problem on another PC that did not use a container (the same PC ran the agent and the topic publication natively, without Docker).

image

Can you test the first use case running the docker with the flag --net=host?

edit:
use --net=host -v /dev/shm:/dev/shm

For reference, this is my docker command for ROS 2:

docker run -it --rm --net=host -v /dev/shm:/dev/shm --privileged -v /dev:/dev microros/micro-ros-agent:humble

Thank you, I'll try this; I'm on my way to the lab.

Previously, I used:

docker run -it -p 9999:9999/udp --privileged microros/micro-ros-agent:galactic udp4 -p 9999 -v 6

Hi, I tried --net=host, but got the same problem.

The same problem occurred on M3, M4, and M7, but only when UDP was used (UART was normal). Thus, I think the problem is related to UDP communication.

Since the MCU can set up and publish topics to the PC, I realized that it is the agent that does not relay messages to the MCU.
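(For completeness, the publishing direction that does work is just the standard rclc pattern; a minimal sketch with placeholder node/topic names, transport setup omitted:)

#include <rcl/rcl.h>
#include <rclc/rclc.h>
#include <std_msgs/msg/int32.h>

void micro_ros_pub_int32(void)
{
  rcl_allocator_t allocator = rcl_get_default_allocator();
  rclc_support_t support;
  rcl_node_t node;
  rcl_publisher_t publisher;
  std_msgs__msg__Int32 msg = { .data = 0 };

  rclc_support_init(&support, 0, NULL, &allocator);
  rclc_node_init_default(&node, "micro_ros_rtt_pub_int32_node", "", &support);
  rclc_publisher_init_default(
    &publisher, &node,
    ROSIDL_GET_MSG_TYPE_SUPPORT(std_msgs, msg, Int32),
    "micro_ros_rtt_publisher");

  while (1) {
    rcl_publish(&publisher, &msg, NULL);  // visible on the host with ros2 topic echo
    msg.data++;
    // sleep ~1 s here with the platform delay (e.g. rt_thread_mdelay on RT-Thread)
  }
}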

I recorded my terminal here:

UDP Sub Int32 (no message)

asciicast

UDP Pub Int32 (Success)

asciicast

Make sure that the docker where you are launching the ROS 2 commands is also initialized with --net=host -v /dev/shm:/dev/shm --privileged.

Also, make sure that you are using the latest micro-ROS Agent docker version.

In any case, this seems to be a ROS 2 / DDS network issue between the Fast DDS in the micro-ROS Agent and the Fast DDS in your ROS 2 installation, not a micro-ROS issue.

I have been testing your scenario and I have been able to:

  1. Start publishing from ROS 2, open the agent, open the micro-ROS subscriber and receive data
  2. Open the agent, open the micro-ROS subscriber, start publishing from ROS 2 and receive data

Make sure that the docker where you are launching the ROS 2 commands is also initialized with --net=host -v /dev/shm:/dev/shm --privileged.

I connected to the same container running the micro-ros-agent using docker container exec -it container_name /bin/bash, so they share the same initialization flags --net=host -v /dev/shm:/dev/shm --privileged.

Also, make sure that you are using the latest micro-ROS Agent docker version.

I used the docker image you pushed last month.

In any case, this seems to be a ROS 2 / DDS network issue between the Fast DDS in the micro-ROS Agent and the Fast DDS in your ROS 2 installation, not a micro-ROS issue.

I agree this is a network issue (UDP). The ROS 2 installation I used is inside the same container where the micro-ros-agent was started (docker container exec -it).

Thanks for your help. I need to do more testing to pinpoint the exact location of the network issue.

One more thing: can you provide the same videos you prepared, but using the flag -v6 in the agent instantiation?

Sure, I recorded the terminal with the flag -v6. (You can select, copy, and paste from the video on the website.)

asciicast

Full log:

(base) wuhanstudio@pop-os:~$ docker run -it --rm --net=host -v /dev/shm:/dev/shm --privileged -v /dev:/dev microros/micro-ros-agent:galactic udp4 --port 9999 -v6
[1664380470.783193] info     | UDPv4AgentLinux.cpp | init                     | running...             | port: 9999
[1664380470.783343] info     | Root.cpp           | set_verbose_level        | logger setup           | verbose_level: 6
[1664380474.625578] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x00000000, len: 24, data:
0000: 80 00 00 00 00 01 10 00 58 52 43 45 01 00 01 0F 79 0F E6 9C 81 00 FC 01
[1664380474.625678] info     | Root.cpp           | create_client            | create                 | client_key: 0x790FE69C, session_id: 0x81
[1664380474.625713] info     | SessionManager.hpp | establish_session        | session established    | client_key: 0x790FE69C, address: 192.168.199.180:704
[1664380474.625788] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x790FE69C, len: 19, data:
0000: 81 00 00 00 04 01 0B 00 00 00 58 52 43 45 01 00 01 0F 00
[1664380474.631055] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x790FE69C, len: 60, data:
0000: 81 80 00 00 01 07 34 00 00 0A 00 01 01 03 00 00 25 00 00 00 00 01 01 20 1D 00 00 00 6D 69 63 72
0020: 6F 5F 72 6F 73 5F 72 74 74 5F 73 75 62 5F 69 6E 74 33 32 5F 6E 6F 64 65 00 00 00 00
[1664380474.635493] info     | ProxyClient.cpp    | create_participant       | participant created    | client_key: 0x790FE69C, participant_id: 0x000(1)
[1664380474.635550] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x790FE69C, len: 14, data:
0000: 81 80 00 00 05 01 06 00 00 0A 00 01 00 00
[1664380474.635562] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x790FE69C, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 01 00 00 00 80
[1664380474.638698] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x790FE69C, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 01 00 00 00 80
[1664380474.642553] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x790FE69C, len: 92, data:
0000: 81 80 01 00 01 07 52 00 00 0B 00 02 02 03 00 00 44 00 00 00 1C 00 00 00 72 74 2F 6D 69 63 72 6F
0020: 5F 72 6F 73 5F 72 74 74 5F 73 75 62 73 63 72 69 62 65 72 00 00 01 00 00 1C 00 00 00 73 74 64 5F
0040: 6D 73 67 73 3A 3A 6D 73 67 3A 3A 64 64 73 5F 3A 3A 49 6E 74 33 32 5F 00 00 01 00 00
[1664380474.642797] info     | ProxyClient.cpp    | create_topic             | topic created          | client_key: 0x790FE69C, topic_id: 0x000(2), participant_id: 0x000(1)
[1664380474.642924] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x790FE69C, len: 14, data:
0000: 81 80 01 00 05 01 06 00 00 0B 00 02 00 00
[1664380474.643033] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x790FE69C, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 02 00 00 00 80
[1664380474.646354] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x790FE69C, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 02 00 00 00 80
[1664380474.646594] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x790FE69C, len: 24, data:
0000: 81 80 02 00 01 07 10 00 00 0C 00 04 04 03 00 00 02 00 00 00 00 00 00 01
[1664380474.646835] info     | ProxyClient.cpp    | create_subscriber        | subscriber created     | client_key: 0x790FE69C, subscriber_id: 0x000(4), participant_id: 0x000(1)
[1664380474.647028] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x790FE69C, len: 14, data:
0000: 81 80 02 00 05 01 06 00 00 0C 00 04 00 00
[1664380474.647075] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x790FE69C, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 03 00 00 00 80
[1664380474.650736] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x790FE69C, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 03 00 00 00 80
[1664380474.651467] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x790FE69C, len: 40, data:
0000: 81 80 03 00 01 07 1D 00 00 0D 00 06 06 03 00 00 0F 00 00 00 00 02 01 08 03 00 01 20 0A 00 00 00
0020: 00 00 00 00 04 00 00 00
[1664380474.652314] info     | ProxyClient.cpp    | create_datareader        | datareader created     | client_key: 0x790FE69C, datareader_id: 0x000(6), subscriber_id: 0x000(4)
[1664380474.652470] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x790FE69C, len: 14, data:
0000: 81 80 03 00 05 01 06 00 00 0D 00 06 00 00
[1664380474.652515] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x790FE69C, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 04 00 00 00 80
[1664380474.656175] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x790FE69C, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 04 00 00 00 80
[1664380474.762746] debug    | UDPv4AgentLinux.cpp | recv_message             | [==>> UDP <<==]        | client_key: 0x790FE69C, len: 13, data:
0000: 81 00 00 00 0B 01 05 00 00 00 03 00 80
[1664380474.763031] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x790FE69C, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 04 00 00 00 80                  

No new messages were received after the datareader was created, and the last message sent out by UDPv4AgentLinux.cpp was:

[1664380474.763031] debug    | UDPv4AgentLinux.cpp | send_message             | [** <<UDP>> **]        | client_key: 0x790FE69C, len: 13, data:
0000: 81 00 00 00 0A 01 05 00 04 00 00 00 80   

Does this mean the MCU failed to respond to the last message sent out by the agent?

Not at all; the communication between the client and the agent is correct. The point is that the DDS side of the agent (not related to this log) is not matching the DDS side of ROS 2.
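If it helps, the health of the client-agent link can also be double-checked from the MCU side with the ping utility; a minimal sketch, assuming the rmw_microros ping API is compiled into your client:

#include <stdbool.h>
#include <rmw/ret_types.h>
#include <rmw_microros/rmw_microros.h>

// Returns true if the agent answered a ping within 1 s (up to 3 attempts).
bool agent_reachable(void)
{
  return rmw_uros_ping_agent(1000, 3) == RMW_RET_OK;
}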

Thanks for confirming that the communication between the client and the agent is correct; I could not find documentation on the protocol between the agent and the client.

I'll make sure all ROS 2 installations use rmw_fastrtps_cpp.

root@1855c4f4e0b0:/uros_ws# echo $RMW_IMPLEMENTATION
rmw_fastrtps_cpp

The protocol between the client and the agent is XRCE-DDS and it is documented here: https://micro-xrce-dds.docs.eprosima.com/en/latest/

The protocol between the client and the agent is XRCE-DDS and it is documented here: https://micro-xrce-dds.docs.eprosima.com/en/latest/

Thank you. I'll check the DDS communication and try to pinpoint the cause of the problem.

I have the same issue: I cannot see any of the data topics (IMU etc.) coming from a PX4 Pixracer running the client. This is on Ubuntu 20.04 running ROS 2 Galactic, but with Fast RTPS. Everything is compiled from source, on a single host computer, with no virtual machine or docker container involved.

In one terminal on the host running the agent:

RMW_IMPLEMENTATION=rmw_fastrtps_cpp ros2 run micro_ros_agent micro_ros_agent serial --dev /dev/ttyUSB0 -b 921600

[1672512771.271733] info     | TermiosAgentLinux.cpp | init                     | running...             | fd: 3
[1672512771.274715] info     | Root.cpp           | set_verbose_level        | logger setup           | verbose_level: 4
[1672512771.921498] info     | Root.cpp           | create_client            | create                 | client_key: 0x00000001, session_id: 0x81
[1672512771.921738] info     | SessionManager.hpp | establish_session        | session established    | client_key: 0x00000001, address: 1
[1672512771.989536] info     | ProxyClient.cpp    | create_participant       | participant created    | client_key: 0x00000001, participant_id: 0x001(1)
[1672512771.995646] info     | ProxyClient.cpp    | create_topic             | topic created          | client_key: 0x00000001, topic_id: 0x3E8(2), participant_id: 0x001(1)
[1672512771.995883] info     | ProxyClient.cpp    | create_subscriber        | subscriber created     | client_key: 0x00000001, subscriber_id: 0x3E8(4), participant_id: 0x001(1)
[1672512771.996993] info     | ProxyClient.cpp    | create_datareader        | datareader created     | client_key: 0x00000001, datareader_id: 0x3E8(6), subscriber_id: 0x3E8(4)
[1672512772.002119] info     | ProxyClient.cpp    | create_topic             | topic created          | client_key: 0x00000001, topic_id: 0x3E9(2), participant_id: 0x001(1)
[1672512772.002378] info     | ProxyClient.cpp    | create_subscriber        | subscriber created     | client_key: 0x00000001, subscriber_id: 0x3E9(4), participant_id: 0x001(1)
[1672512772.003367] info     | ProxyClient.cpp    | create_datareader        | datareader created     | client_key: 0x00000001, datareader_id: 0x3E9(6), subscriber_id: 0x3E9(4)
.... cut some lines here for brevity .....
[1672512772.089737] info     | ProxyClient.cpp    | create_topic             | topic created          | client_key: 0x00000001, topic_id: 0x3F4(2), participant_id: 0x001(1)
[1672512772.089933] info     | ProxyClient.cpp    | create_subscriber        | subscriber created     | client_key: 0x00000001, subscriber_id: 0x3F4(4), participant_id: 0x001(1)
...

And in another terminal:

RMW_IMPLEMENTATION=rmw_fastrtps_cpp ros2 node info /px4_micro_xrce_dds

/px4_micro_xrce_dds
  Subscribers:
    /fmu/in/obstacle_distance: px4_msgs/msg/ObstacleDistance
    /fmu/in/offboard_control_mode: px4_msgs/msg/OffboardControlMode
    /fmu/in/onboard_computer_status: px4_msgs/msg/OnboardComputerStatus
    /fmu/in/sensor_optical_flow: px4_msgs/msg/SensorOpticalFlow
    /fmu/in/telemetry_status: px4_msgs/msg/TelemetryStatus
    /fmu/in/trajectory_setpoint: px4_msgs/msg/TrajectorySetpoint
    /fmu/in/vehicle_attitude_setpoint: px4_msgs/msg/VehicleAttitudeSetpoint
    /fmu/in/vehicle_command: px4_msgs/msg/VehicleCommand
    /fmu/in/vehicle_mocap_odometry: px4_msgs/msg/VehicleOdometry
    /fmu/in/vehicle_rates_setpoint: px4_msgs/msg/VehicleRatesSetpoint
    /fmu/in/vehicle_trajectory_bezier: px4_msgs/msg/VehicleTrajectoryBezier
    /fmu/in/vehicle_trajectory_waypoint: px4_msgs/msg/VehicleTrajectoryWaypoint
    /fmu/in/vehicle_visual_odometry: px4_msgs/msg/VehicleOdometry
  Publishers:

  Service Servers:

  Service Clients:

  Action Servers:

  Action Clients:
pengg commented

I have the same issue on Windows 10: ROS 2 Humble (Microsoft desktop version), a micro-ros-agent compiled and run in UDP mode, and an STM32 (Cortex-M7) micro-ROS MCU. It seems the agent can receive publications from the MCU but cannot relay ROS 2 messages to the MCU; moreover, ros2 node list and ros2 topic list cannot see the agent node and the MCU topics.
image
More information: my app ran pub and sub successfully for several hours, then suddenly the agent could not relay the ROS 2 published messages and did not recover until I restarted Windows 10 and ROS 2.
I will update my test information while trying to find a stable way to reproduce this, and I will keep following this thread.