Problem decoding ADS-B messages with a 16-bit SDR
akaranovski opened this issue · 22 comments
Hello, thanks for the module.
I'm wondering whether decoding can run into problems if the SDR is 16-bit rather than 8- or 12-bit?
If so, in which part of the Framer or Demodulator can this be tuned?
Thanks
You mean your SDR has a 16-bit ADC? Are you getting no messages, or junk messages? There's a constant/GUI variable tuner called Detection Threshold in the example adsb_rx.py. If you're seeing garbage, you'll need to raise it; if you're seeing nothing, you'll need to lower it. The threshold displays in the GUI.
Hopefully that's helpful.
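For intuition, the detection threshold is essentially a power comparison against the noise floor. Here's a minimal sketch of the idea (my own illustration, not gr-adsb's actual internals; `detect_bursts` and its parameters are hypothetical names):

```python
import numpy as np

def detect_bursts(samples, threshold_db=10.0):
    """Flag samples whose power exceeds the noise floor by threshold_db.

    A hypothetical sketch: estimate the noise floor with the median (robust
    against the bursts themselves) and compare each sample's power to it.
    """
    power = np.abs(samples) ** 2
    noise_floor = np.median(power)
    return power > noise_floor * 10 ** (threshold_db / 10)

# Quiet noise with one strong pulse in the middle
rng = np.random.default_rng(0)
sig = 0.01 * rng.standard_normal(2000)
sig[500:520] += 1.0
mask = detect_bursts(sig, threshold_db=10.0)
print(mask[500:520].all(), mask[:400].mean())
```

Raising `threshold_db` suppresses noise-triggered (garbage) detections; lowering it catches weaker bursts, which is why the two failure modes call for opposite adjustments.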
Thanks for the answer. Yes, it has a 16-bit ADC.
The issue is that when decoding DF17 I see messages failing with
CRC: Failed PI^CRC=some value.
Meanwhile, when I switch to the 12-bit ADC SDR, it decodes everything correctly with the same RF chain (antenna, cable, and detection threshold).
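For reference, the PI^CRC check here is the Mode S 24-bit parity: in a DF17 frame, the last 24 of the 112 bits are the CRC of the first 88, computed with generator polynomial 0xFFF409. A minimal sketch of that check (assuming the standard Mode S convention; this is my own illustration, not gr-adsb's exact code):

```python
def modes_crc24(bits):
    """Mode S CRC-24 over a list of bits (MSB first), generator 0xFFF409."""
    crc = 0
    for bit in bits:
        # feed the bit into the top of the 24-bit register, shift, reduce
        crc ^= bit << 23
        crc <<= 1
        if crc & 0x1000000:
            crc ^= 0x1FFF409  # 25-bit generator polynomial
        crc &= 0xFFFFFF
    return crc

def check_df17(hexmsg):
    """True if the frame's parity matches, i.e. PI ^ CRC == 0."""
    bits = [int(b) for b in bin(int(hexmsg, 16))[2:].zfill(len(hexmsg) * 4)]
    return modes_crc24(bits) == 0

# Well-known valid DF17 frame from the ADS-B literature
print(check_df17("8D4840D6202CC371C32CE0576098"))
```

Any corrupted bit anywhere in the 112-bit frame flips the remainder, so dropped samples or demodulation errors show up exactly as the `CRC: Failed` messages above.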
Very strange. I just merged a pull request that was messing with the CRC. I'm wondering if it's related. Would you mind including a screenshot of your 12-bit working and 16-bit not working cases? That will help me debug.
In the failed case, the SNR is 3-4 dB and in the successful case it's 13 dB. I'm wondering if turning up the gain on your 16-bit SDR would help. Do you have the GUI enabled? That picture speaks a thousand words...
It appears you are running with a 4 Msps sample rate, correct?
I noticed some "D"s in the terminal output. This indicates samples are being dropped. That will equate to dropped bits and surely make the CRC fail. I would suggest reducing the sample rate to 2 Msps and try again.
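For a sense of what the lower rate costs: ADS-B is 1 Mbit/s pulse-position modulation, with each bit split into two 0.5 µs chips, so even 2 Msps still gives one sample per chip. A quick sanity check of the arithmetic:

```python
# ADS-B timing: 1 us per bit, two 0.5 us PPM chips per bit
CHIP_DURATION = 0.5e-6  # seconds

for fs in (2e6, 4e6):
    chips = fs * CHIP_DURATION
    bits = fs * 1e-6
    print(f"{fs / 1e6:.0f} Msps -> {chips:g} samples/chip, {bits:g} samples/bit")
```

Halving the sample rate halves the stream the host has to keep up with, at the cost of timing resolution, which is why the dropped-sample "D"s often disappear at 2 Msps.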
Sorry for the hassle, but can I see a screenshot of your full flowgraph?
Sure
What do you think is causing the SNR difference between the two front ends? Since the failing cases have lower SNR than the succeeding ones, I'm tempted to chalk it up to signal quality.
Could you zoom in the y-axis of the time plot that detects bursts? I'm curious if they're constant amplitude or time-varying.
I think the SOP/EOB burst tags may have been from a previous debug version. From looking at the signal, I wouldn't expect the CRC to pass. What does your front end look like? Antenna is pretty important here. And are there any differences between the 12-bit setup that works?
I see your point, but it seems the problem has another cause.
Let me try to explain: on the 16-bit ADC, the lower sample rates appear to be produced by decimating down from the maximum available rate.
For example, the maximum rate is 160 Msps, and to get 2 Msps it decimates the signal, which brings a loss of signal strength.
Do you have any idea how to avoid signal loss due to decimation?
What kind of SDR are you using? Is the ADC running at 160 Msps and then in FPGA downsampling to 4 Msps? If so, the digital filters in there should be good. Do you have a hardware filter out front? If not, out-of-band signal may be saturating your receiver's front-end and stealing your dynamic range.
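To illustrate why properly filtered decimation should not cost in-band signal strength, here's a small sketch (my own illustration using a windowed-sinc low-pass, not the USRP's actual FPGA filters): a tone inside the output passband keeps essentially all of its power through an 8x decimation.

```python
import numpy as np

fs_in, decim = 16e6, 8                 # 16 Msps down to 2 Msps
n = 1 << 14
t = np.arange(n) / fs_in
tone = np.exp(2j * np.pi * 0.2e6 * t)  # in-band tone at 200 kHz

# Windowed-sinc low-pass, cutoff at the output Nyquist (1 MHz)
taps = np.sinc(np.arange(-64, 65) / decim) / decim
taps *= np.hamming(taps.size)
taps /= taps.sum()                     # unity passband (DC) gain

filtered = np.convolve(tone, taps, mode="same")[::decim]
power_in = np.mean(np.abs(tone) ** 2)
power_out = np.mean(np.abs(filtered[32:-32]) ** 2)  # trim filter edge effects
print(f"in-band power before: {power_in:.3f}, after decimation: {power_out:.3f}")
```

The filter only removes energy outside the output bandwidth. If decimating appears to "lose" 30 dB, the likelier culprits are out-of-band energy dominating the wideband measurement, or front-end saturation eating dynamic range, rather than the decimation itself.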
This is a custom version of a USRP SDR and it runs at 325 Msps.
But the question I have is: do I lose signal when it decimates down to 2 Msps, or should the FPGA filters preserve the information?
Because when I set 160 Msps, the signal strength immediately rises by 30 dB.
Now I'm lost as to why a simple 12-bit Pluto can catch ADS-B while a well-designed USRP with a higher sample rate cannot.
It seems to lose sync with the start of burst (SOB).
What do you think?
Regarding the antenna: I use the same antenna as with the 12-bit SDR. It's self-made and checked for VSWR at the exact frequency.
It works with the Pluto easily.
@akaranovski did you have success?
Hi Matt,
I'll close the issue, as it was solved by a firmware upgrade of the device.
Thanks