Can this method publish images to ROS?
ByeTragedy opened this issue · 11 comments
My ROS subscriber uses image_transport and cv_bridge to receive images, but I cannot find similar tutorials or modules for publishing Unity camera images. Can you help me?
GitHub issues are really for bug reports rather than help requests. In the future I recommend asking on the Robotics Forum https://forum.unity.com/forums/robotics.623/
The basic flow is to have a camera that renders to a RenderTexture, convert that into a Texture2D, and then convert that into a ROS message. In the latest version of the ROS TCP Connector we have a few utility functions for converting a Unity Texture2D into various ROS image messages, depending on what kind of message you want to send: ToCompressedImageMsg_JPG(), ToCompressedImageMsg_PNG(), or ToImageMsg().
Once you have it converted to a message, you should just be able to call ROSConnection.Publish. Here's a tutorial that demonstrates a simple looping publisher: https://github.com/Unity-Technologies/Unity-Robotics-Hub/blob/main/tutorials/ros_unity_integration/publisher.md
Putting it all together, it would look something like this:
void SendImage(Camera sensorCamera, string topicName)
{
    var oldRT = RenderTexture.active;
    RenderTexture.active = sensorCamera.targetTexture;
    sensorCamera.Render();

    // Copy the pixels from the GPU into a texture so we can work with them.
    // For more efficiency you should reuse this texture, instead of creating and destroying it every time.
    Texture2D camText = new Texture2D(sensorCamera.targetTexture.width, sensorCamera.targetTexture.height);
    camText.ReadPixels(new Rect(0, 0, sensorCamera.targetTexture.width, sensorCamera.targetTexture.height), 0, 0);
    camText.Apply();
    RenderTexture.active = oldRT;

    // Encode the texture as an ImageMsg, and send to ROS
    ImageMsg imageMsg = camText.ToImageMsg(new HeaderMsg());
    ROSConnection.GetOrCreateInstance().Publish(topicName, imageMsg);
    // Texture2D has no Dispose(); Destroy it to avoid leaking one per call
    Object.Destroy(camText);
}
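One detail the snippet above leaves out: the topic has to be registered before the first Publish call. A minimal sketch of the Start() to put in the same MonoBehaviour (topicName here is whatever name you pass to SendImage; the full scripts later in this thread show the complete pattern):

void Start()
{
    // Register the topic once, before the first Publish call on it
    ROSConnection.GetOrCreateInstance().RegisterPublisher<ImageMsg>(topicName);
}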
OK, thank you very much!
@Hasan-atc -- You should be making a new MonoBehaviour script for the suggested approach, not modifying MessageExtensions.cs. In your own script attached to a GameObject in your scene, you'll want to pass in your own reference to the desired camera (sensorCamera). Additionally, the updated ToImageMsg() now expects a header message to be passed in when constructing the ImageMsg, as its error message states.
Hello @at669 -- I know I'm tiring you out, but I need your help to complete this job. I have a question for you. After creating our own C# file, can you tell me exactly what I need to add to it? I'm new to C#. For example, which Texture2D functions should I use after creating a new MonoBehaviour script? If you have the time, could you tell me exactly what I need to do, down to the smallest detail? I beg you. Please. :(
Hi @Hasan-atc
Thanks for trying out our robotics packages! Unfortunately we don’t have capacity to provide custom support for training. Please refer to the documentation on ros-tcp-connector here, and robotics tutorials on the hub. For requests for help with our robotics packages, please use the robotics forum. If you encounter any bugs in the future, please do come back and file a bug report!
I also recommend taking a look at learn.unity.com for general Unity training; you’ll find great resources there for Unity newbies. For requests for help with Unity in general, you can post questions on the Unity forum. Thanks!
For everyone coming here from their Google Search on how to send a camera image from Unity to ROS, this is a working solution.
- Set up the ROS-TCP-Connector as described in the tutorial
- Create a new camera in your scene
- Create a new Render Texture and make the camera render its image to it
- Create a new C# MonoBehaviour based on the code at the end of this comment
- Place the MonoBehaviour in your scene and provide a reference to the previously created Render Texture through the Inspector
- Launch your Unity scene; you can see that the camera image is published on the camera1 topic in ROS
View from Unity (I turned the camera 180° because the image ended up flipped in ROS)
View from rqt Image Viewer
Source Code:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using Unity.Robotics.ROSTCPConnector.MessageGeneration;
using RosMessageTypes.Sensor;
using RosMessageTypes.Std;

public class sendImageToROS : MonoBehaviour
{
    public RenderTexture renderTexture;
    ROSConnection ros;
    public string topicName = "camera1";

    // Publish the camera image every N seconds
    public float publishMessageFrequency = 0.5f;

    // Used to determine how much time has elapsed since the last message was published
    private float timeElapsed;

    // Start is called before the first frame update
    void Start()
    {
        // start the ROS connection and register the topic
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterPublisher<ImageMsg>(topicName);
    }

    // Update is called once per frame
    void Update()
    {
        timeElapsed += Time.deltaTime;
        if (timeElapsed > publishMessageFrequency)
        {
            // Copy the render texture's pixels from the GPU into a readable Texture2D
            Texture2D texture = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGBA32, false);
            var oldRT = RenderTexture.active;
            RenderTexture.active = renderTexture;
            texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0, false);
            RenderTexture.active = oldRT;

            // Build a grayscale ("mono8") image from the red channel only
            Color32[] pixels = texture.GetPixels32();
            byte[] bytes = new byte[pixels.Length];
            for (int i = pixels.Length - 1; i >= 0; i--)
            {
                bytes[i] = pixels[i].r;
            }

            // Finally send the message to server_endpoint.py running in ROS
            ImageMsg testMsg = new ImageMsg(new HeaderMsg(), (uint)renderTexture.height, (uint)renderTexture.width, "mono8", 0x00, (uint)renderTexture.width, bytes);
            ros.Publish(topicName, testMsg);

            Destroy(texture); // avoid leaking a Texture2D on every publish
            timeElapsed = 0;
        }
    }
}
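On the flipped image: Unity textures have their origin at the bottom-left, while ROS image data is conventionally stored top row first, so a straight copy comes out vertically mirrored. Instead of rotating the camera, you could flip the rows while filling the byte array. A minimal sketch of a drop-in replacement for the copy loop above (using the same pixels, bytes, and renderTexture variables):

// Flip vertically while copying: row 0 of the ROS image is the TOP row,
// but GetPixels32() returns rows bottom-up (Unity's texture origin is bottom-left).
int width = renderTexture.width;
int height = renderTexture.height;
for (int row = 0; row < height; row++)
{
    int srcRow = height - 1 - row; // read Unity rows from the bottom up
    for (int col = 0; col < width; col++)
    {
        bytes[row * width + col] = pixels[srcRow * width + col].r;
    }
}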
Here is my example of publishing a camera view; it is working on my machine. You need to assign a render texture to the camera.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using Unity.Robotics.ROSTCPConnector.MessageGeneration;
using RosMessageTypes.Sensor;
using RosMessageTypes.Std;

public class ros2_cam_publish : MonoBehaviour
{
    ROSConnection ros;

    public string topicName = "camera22";

    public float publishMessageFrequency = 0.5f;

    // Used to determine how much time has elapsed since the last message was published
    private float timeElapsed;

    public Camera inputCamImage;

    // Start is called before the first frame update
    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterPublisher<ImageMsg>(topicName);
    }

    // Update is called once per frame
    void Update()
    {
        timeElapsed += Time.deltaTime;
        if (timeElapsed > publishMessageFrequency)
        {
            SendImage(inputCamImage, topicName);
            timeElapsed = 0;
        }
    }

    void SendImage(Camera sensorCamera, string topicName)
    {
        var oldRT = RenderTexture.active;
        RenderTexture.active = sensorCamera.targetTexture;
        sensorCamera.Render();

        // Copy the pixels from the GPU into a texture so we can work with them.
        // For more efficiency you should reuse this texture, instead of creating and destroying it every time.
        Texture2D camText = new Texture2D(sensorCamera.targetTexture.width, sensorCamera.targetTexture.height);
        camText.ReadPixels(new Rect(0, 0, sensorCamera.targetTexture.width, sensorCamera.targetTexture.height), 0, 0);
        camText.Apply();
        RenderTexture.active = oldRT;

        // Encode the texture as an ImageMsg, and send to ROS
        ImageMsg imageMsg = camText.ToImageMsg(new HeaderMsg());
        ros.Publish(topicName, imageMsg);
        Destroy(camText); // Texture2D has no Dispose(); Destroy it to avoid leaking one per publish
    }
}
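If bandwidth matters, the same script should work with the compressed-image helpers mentioned earlier in the thread. A sketch of the lines that change, assuming ToCompressedImageMsg_JPG() takes a header the same way ToImageMsg() does:

// In Start(): register a CompressedImageMsg publisher instead
ros.RegisterPublisher<CompressedImageMsg>(topicName);

// In SendImage(): encode the texture as JPG instead of a raw ImageMsg
CompressedImageMsg compressedMsg = camText.ToCompressedImageMsg_JPG(new HeaderMsg());
ros.Publish(topicName, compressedMsg);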
@NotaLabs Thank you. You are godlike.
For anyone who wants to publish an RGB image, just change his code like below...

byte[] bytes = new byte[pixels.Length * 3];
for (int i = pixels.Length - 1; i >= 0; i--)
{
    bytes[3 * i] = pixels[i].b;
    bytes[3 * i + 1] = pixels[i].g;
    bytes[3 * i + 2] = pixels[i].r;
}

and change "mono8" to "bgr8". The 'step' also needs to be tripled:

ImageMsg testMsg = new ImageMsg(new HeaderMsg(), (uint)renderTexture.height, (uint)renderTexture.width, "bgr8", 0x00, (uint)renderTexture.width * 3, bytes);
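Since the Texture2D in that script is already RGBA32, another option is to skip the repacking loop entirely and publish the raw bytes with the "rgba8" encoding (a standard sensor_msgs encoding). A minimal sketch, using the same texture, renderTexture, ros, and topicName variables; note the image will still be vertically flipped, just like GetPixels32():

// RGBA32 raw bytes are already laid out r,g,b,a per pixel, matching "rgba8";
// step is then width * 4 bytes per row.
byte[] rawBytes = texture.GetRawTextureData();
ImageMsg rgbaMsg = new ImageMsg(new HeaderMsg(), (uint)renderTexture.height, (uint)renderTexture.width, "rgba8", 0x00, (uint)renderTexture.width * 4, rawBytes);
ros.Publish(topicName, rgbaMsg);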
Hi @NotaLabs, thank you. How did you manage to flip the image in RViz2?
Hi, I haven't used RViz2 since doing this project, but in my screenshot there are two orange arrows above the image preview. I guess I clicked those to rotate the image. (If that doesn't work, I don't know.)
I never managed to flip the image in the sense of mirroring, I believe; I guess that just happened somewhere in between.