mcci-catena/arduino-lmic

LMIC v4.1 doesn't work with network servers that disable all 500kHz channels

terrillmoore opened this issue · 2 comments

With ChirpStack, it's easy to set up a channel plan that disables all 500 kHz channels by putting the following in chirpstack-network-server.toml:

  [network_server.network_settings]
  enabled_uplink_channels=[0, 1, 2, 3, 4, 5, 6, 7]

In this case, ChirpStack sends a MAC downlink with two LinkADRReq commands.

The first disables all 125 kHz channels and all 500 kHz channels.
The second enables channels 0-7.
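
For reference, here is a byte-level sketch of what that downlink pair might look like on the air. The DataRate/TXPower and NbTrans values are assumptions for illustration, not taken from a capture:

// Hedged illustration of the LinkADRReq pair described above.
// DataRate_TXPower (0x30 = DR3 / max power) and NbTrans (1) are assumed values.
static const unsigned char linkAdrReqPair[] = {
    // LinkADRReq #1: ChMaskCntl=7 -> all 125 kHz channels off, ChMask selects
    // the 500 kHz channels; ChMask=0x0000 disables all of those as well.
    0x03,       // CID: LinkADRReq
    0x30,       // DataRate_TXPower (assumed)
    0x00, 0x00, // ChMask, little-endian
    0x71,       // Redundancy: ChMaskCntl=7, NbTrans=1
    // LinkADRReq #2: ChMaskCntl=0, ChMask=0x00FF -> enable channels 0-7.
    0x03,       // CID: LinkADRReq
    0x30,       // DataRate_TXPower (assumed)
    0xFF, 0x00, // ChMask, little-endian
    0x01,       // Redundancy: ChMaskCntl=0, NbTrans=1
};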

The LMIC rejects the first command as invalid, due to this code:

} else if (chpage == MCMD_LinkADRReq_ChMaskCntl_USLIKE_125ON ||
           chpage == MCMD_LinkADRReq_ChMaskCntl_USLIKE_125OFF) {
    u1_t const en125 = chpage == MCMD_LinkADRReq_ChMaskCntl_USLIKE_125ON;
    // if disabling all 125kHz chans, must have at least one 500kHz chan
    // don't allow reserved bits to be set in chmap.
    if ((! en125 && chmap == 0) || (chmap & 0xFF00) != 0)
        return 0;
} else {

This prevents the device from working properly with the network: it rejects the channel mask, and therefore also doesn't honor the requested data rate.

The problem is at line 124: we cannot check here whether all 125 kHz channels are off, because that state is legal as long as subsequent LinkADRReq entries turn some channels back on. We already check this in LMICuslike_mapChannels() and should not check it in canMapChannels().
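A minimal sketch of the relaxation I have in mind (the helper name and shape here are illustrative, not the final patch): keep rejecting reserved bits immediately, and leave the "no channels left enabled" decision to the later mapping pass that sees the whole LinkADRReq block.

#include <stdint.h>

// Sketch only: for the USLIKE_125ON / USLIKE_125OFF pages, the "can map"
// pass only needs to reject reserved bits in the channel mask; whether any
// channel is ultimately left enabled is decided after the whole block is applied.
static int canMap125Page(uint16_t chmap) {
    return (chmap & 0xFF00) == 0;   // bits 8..15 are reserved for these pages
}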

There appears to be a similar (but unreported) problem with the setting of channel masks directly here:

} else if (chpage == MCMD_LinkADRReq_ChMaskCntl_USLIKE_BANK) {
    if (chmap == 0 || (chmap & 0xFF00) != 0) {
        // no bits set, or reserved bits set: fail.
        return 0;
    }
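The same relaxation would apply here: reject only the reserved bits, and defer the "did the block leave any channel enabled?" decision to the mapping pass. A hedged sketch of that final check (names are illustrative, not LMIC's):

#include <stdint.h>

// Sketch only: once every LinkADRReq in the downlink has been applied to a
// working copy of the channel map, reject the whole block if nothing is left.
static int anyChannelEnabled(const uint16_t *channelMap, int nWords) {
    for (int i = 0; i < nWords; ++i) {
        if (channelMap[i] != 0)
            return 1;   // at least one channel is still enabled
    }
    return 0;           // the block disabled every channel; reject it
}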

Testing with ChirpStack configured as described above was positive; the device now properly accepts the LinkADRReq pair and behaves as expected. I'll submit a PR, but need to run regression tests before the final merge to head and release.