lynnlilu/mmWave_reading

x,y coordinates

08Elia1994 opened this issue · 14 comments

Hello,
I have the IWR1443 and I'm trying to use the code you posted to get x, y, z coordinates.
I have just a question:

  • Are the x, y, z coordinates post-processed after acquisition from the device, or is the only processing the conversion from Q format?
    I ask because I'm trying to build a SLAM algorithm from the data the device reads, but I'm not sure the x, y, z values I get are correct.
    The SLAM algorithm requires the x, y, z coordinates without processing, so that it is possible to create a map.

Thank you so much for your help!

Hi,

Yes, the x, y, z coordinates are the post-processed ones, which you can check in parseTLV.m. If you want the raw data without conversion, you can modify lines 29, 30, and 31 of parseTLV.m to get them.

Best,
Li

Thanks for your rapid reply.
So, if the SLAM algorithm requires the x, y, z coordinates, I have to use the ones without conversion (without the division by 2^xyz_q_format that is done in lines 29, 30, 31)?

Yes, exactly.
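For reference, the Q-format conversion discussed above is just a division by a power of two. A minimal Python sketch (the function name and the Q9 example values are my own for illustration, not from parseTLV.m):

```python
def q_to_float(raw_value, xyz_q_format):
    # A Q-format fixed-point integer has xyz_q_format fractional bits,
    # so dividing by 2**xyz_q_format recovers the real-valued coordinate.
    return raw_value / (2 ** xyz_q_format)

# Example: a raw value of 512 in Q9 format is 512 / 2**9 = 1.0
print(q_to_float(512, 9))   # -> 1.0
print(q_to_float(-256, 9))  # -> -0.5
```

Removing that division in lines 29-31, as suggested above, is what yields the raw unconverted values.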

Thank you very much.

I've got another question:
If I collect the x, y, z coordinates from all the frames, I get this:

[image: matrix of collected x coordinates, one row per frame]

This is the collection of the x coordinates from all frames.
Each row represents one frame, so reading along a row gives the x coordinates of the objects the radar sees in that frame.
The strange thing is that the experiment was made by moving the device from one point toward a fixed obstacle at another position.
Before the NaN values that I added, it seems that each frame, or at least some of them, ends with the same data.
What I expect is that, as the frames progress, the coordinates become smaller because the obstacle is nearer than before.
So I'm not sure what the reference system of the device is, and how the device updates the coordinates while it moves.

Thanks for your incredible help,
Best regards

I am a little confused by your problem. Do you mean that you want to find a column that represents an object moving closer toward the mmWave radar?

If so, I have several suggestions:

  1. You can first check whether the y and z coordinates change during the movement of the device. If not, this means you have the right axis, and then you can examine the x coordinates.
  2. I guess that during the whole movement there are 16 objects detected by the mmWave radar, but some objects are not always in the field of view of the radar, so some fields are filled with NaN. Also, to my understanding, in different frames the same column does not always refer to the same object. For example, in your scenario there are 16 sensed objects, numbered 1, 2, ..., 16. In frame 1, column 1 may indicate object 1 and column 2 may indicate object 2, etc. But in frame 2, column 1 may indicate object 2 and column 2 may indicate object 4, etc. Such a relationship may not actually be predictable, so further algorithms are needed to handle it.
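Since columns cannot be trusted as stable object identities, one common (if simplistic) way to re-link detections across consecutive frames is greedy nearest-neighbor matching. A Python sketch, with a made-up 0.5 m gating threshold:

```python
import math

def associate(prev_points, curr_points, max_dist=0.5):
    """Greedily match each (x, y, z) point of the previous frame to its
    nearest unclaimed point in the current frame, skipping matches
    farther than max_dist (an assumed gating threshold in meters).
    Returns a list of (prev_index, curr_index) pairs."""
    pairs = []
    used = set()
    for i, p in enumerate(prev_points):
        best_j, best_d = None, max_dist
        for j, c in enumerate(curr_points):
            if j in used:
                continue
            d = math.dist(p, c)
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            used.add(best_j)
            pairs.append((i, best_j))
    return pairs
```

This is only a sketch: a real tracker would also handle appearing/disappearing objects and use the Doppler velocity to gate the matches.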

Hope these help.

Best,
Li

In the experiment it was the radar that moved toward an obstacle, but I think the final result is the same.
So you mean that there is no guaranteed correspondence between the same column of different frames?
Because if I plot the x, y, z coordinates, I see something similar to what I see in the mmWave Demo Visualizer, so I don't know how to move on.
Just to be sure: the radar is always positioned at (0, 0, 0), and from there the device calculates x, y, z, even if the radar moves?
Thank you

Yes, I think there is no guaranteed correspondence.
The radar cannot detect whether it is moving. So, from the radar's perspective, even when the moving one is the radar itself, it still sees the ambient objects as the moving ones.
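Since the radar always reports points relative to itself at (0, 0, 0), building a map from a moving radar means transforming each frame's points by the radar's current pose. A minimal 2D Python sketch, assuming the pose (position and heading) is known from somewhere else, e.g. odometry or the SLAM state estimate:

```python
import math

def radar_to_world(points, radar_x, radar_y, radar_yaw):
    """Transform 2D points from the radar frame to the world frame by
    rotating by the radar heading (radians) and translating by its
    position. The pose itself is assumed to be known externally."""
    c, s = math.cos(radar_yaw), math.sin(radar_yaw)
    world = []
    for px, py in points:
        world.append((radar_x + c * px - s * py,
                      radar_y + s * px + c * py))
    return world
```

For example, a point 1 m straight ahead of a radar sitting at (1, 2) with zero heading lands at world coordinates (2, 2). A full 3D version would use a rotation matrix or quaternion for the pose.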

Thank you for your precious support.
I've got another question, about the range-azimuth heatmap.
I would like to plot the data in MATLAB, so I read the Doxygen documentation and found where the data I want are and how they are constructed.
My question is: since each cell contains the real and imaginary parts of the data, how can I use it to plot?
I tried taking the absolute value, but then I don't know how to represent it in a MATLAB graph, because I have a matrix of points that I want to plot with different colors according to the value.
Which data should I use to plot the graph?
I hope my question is clear; if not, I will reformulate it.
Best regards

I am also not quite sure how the heatmap can be plotted the way the visualizer does it; as you can see, the code for heatmap extraction is quite simple compared with the rest.

Since in my subsequent work I switched to the DCA1000EVM for real-time data capturing, these parts of the code were never completed. You may turn to the TI E2E forum for further support.

Sorry for the inconvenience.
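For what it's worth, the usual generic approach (my assumption, not necessarily what the visualizer does) is to collapse each complex cell to its magnitude and plot the resulting real matrix as a color-mapped image. A NumPy sketch with made-up heatmap dimensions:

```python
import numpy as np

# Hypothetical range-azimuth heatmap: rows = range bins, cols = azimuth
# bins, each cell holding one complex sample (real + imaginary part).
rng = np.random.default_rng(0)
heatmap = rng.standard_normal((64, 48)) + 1j * rng.standard_normal((64, 48))

# Collapse each complex cell to a single real value: its magnitude.
magnitude = np.abs(heatmap)

# Often shown on a dB scale so that weak returns stay visible
# (the small epsilon avoids log of zero).
magnitude_db = 20 * np.log10(magnitude + 1e-12)
```

In MATLAB, `imagesc(abs(H))` followed by `colorbar` gives the same kind of color-mapped plot of such a matrix `H`; in Python, `matplotlib.pyplot.imshow(magnitude)` does the equivalent.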

Best,
Li

Thank you very much for your great support. This is the only resource I've found about MATLAB. I'll follow your suggestions, and if I need help in the future I will come back here.
Thanks a lot,
Best regards.

Hi,
I've got a doubt.
How does the device scan while running?
Does it have a beam that, during a frame, scans from 0° to 180°, with the returning echoes above the CFAR threshold becoming detected objects?
Thanks for your support,
Best Regards

Can the IWR1443 scan from 0 to 180 degrees? Actually, I used the AWR1642 in our experiments, and that device cannot perform such a scan. The device can only sweep the frequency of the transmitted mmWave signals within a fixed sensing field of view to realize the FMCW-based measurement.
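For intuition: in FMCW radar, range comes from the beat frequency between the transmitted and received chirps, not from steering a beam; angle information comes from phase differences across the multiple receive antennas. A Python sketch of the standard range relation (the slope and beat-frequency values are made up for illustration):

```python
C = 3.0e8  # speed of light, m/s

def fmcw_range(beat_freq_hz, chirp_slope_hz_per_s):
    # A target at range R delays the received chirp, producing a beat
    # frequency f_b = 2 * R * S / c for chirp slope S. Solving for R:
    return C * beat_freq_hz / (2.0 * chirp_slope_hz_per_s)

# Illustration: a 70 MHz/us slope (7e13 Hz/s) with a 2.333 MHz beat
# frequency corresponds to roughly 5 m.
print(fmcw_range(2.333e6, 70e12))
```

Detection then happens in this frequency domain: the CFAR threshold mentioned above is applied to the range (and Doppler) FFT bins, not to a swept beam.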

Hi,
I'm not sure about the scanning, so never mind what I said before.
I'm still blocked on the problem discussed yesterday, and I think I will try again on the E2E forum. We are working with the IWR1443 because we need a 3D radar, but I think the same problem would also occur on the AWR1642.
Thank you for your support,
Best regards