ModbusSerialTransport.getInterFrameDelay returns wrong values if the baudrate is 19200 or lower
The getInterFrameDelay method in ModbusSerialTransport returns millisecond values instead of microseconds when the baudrate is 19200 or lower. The following method:
```java
/**
 * In microseconds
 *
 * @return Delay between frames
 */
int getInterFrameDelay() {
    if (commPort.getBaudRate() > 19200) {
        return 1750;
    }
    else {
        return Math.max(getCharInterval(Modbus.INTER_MESSAGE_GAP), Modbus.MINIMUM_TRANSMIT_DELAY);
    }
}
```
calls the getCharInterval method when the baudrate is 19200 or lower. However, getCharInterval, as shown below, returns a millisecond value:
```java
/**
 * Calculates an interval based on a set number of characters.
 * Used for message timings.
 *
 * @param chars Number of characters
 * @return char interval in milliseconds
 */
int getCharInterval(double chars) {
    return (int) (getCharIntervalMicro(chars) / 1000);
}
```
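To make the truncation concrete, here is a quick back-of-the-envelope check. My assumptions: 11 bits per character (1 start + 8 data + 1 parity + 1 stop) and the Modbus-standard 3.5-character gap; the library's actual Modbus.INTER_MESSAGE_GAP value may differ slightly.

```java
// Back-of-the-envelope check of the truncation at 19200 baud.
public class InterFrameDelayCheck {
    public static void main(String[] args) {
        double baudRate = 19200;
        double bitsPerChar = 11;  // 1 start + 8 data + 1 parity + 1 stop (assumed)
        double gapChars = 3.5;    // Modbus RTU inter-message gap (assumed)

        double micros = gapChars * bitsPerChar / baudRate * 1_000_000;
        System.out.printf("microseconds: %.0f%n", micros);            // ~2005
        System.out.println("milliseconds: " + (int) (micros / 1000)); // 2
    }
}
```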
So at 19200 baud I get 2 instead of roughly 2000. I suppose that replacing the call to getCharInterval with getCharIntervalMicro will fix the issue.
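A minimal sketch of that fix, assuming getCharIntervalMicro returns the interval in microseconds as a wider type (the cast inside getCharInterval suggests so); whether Modbus.MINIMUM_TRANSMIT_DELAY is also expressed in microseconds would need to be checked:

```java
int getInterFrameDelay() {
    if (commPort.getBaudRate() > 19200) {
        return 1750;
    }
    else {
        // Use the microsecond variant directly instead of the value
        // truncated to milliseconds. NOTE: this assumes that
        // Modbus.MINIMUM_TRANSMIT_DELAY is also in microseconds.
        return (int) Math.max(getCharIntervalMicro(Modbus.INTER_MESSAGE_GAP),
                Modbus.MINIMUM_TRANSMIT_DELAY);
    }
}
```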
Moreover, getCharIntervalMicro computes the interval from the packet length, which also includes the number of stop bits. However, the SerialPort.getNumStopBits method returns one of these constants:
```java
// Number of Stop Bits
static final public int ONE_STOP_BIT = 1;
static final public int ONE_POINT_FIVE_STOP_BITS = 2;
static final public int TWO_STOP_BITS = 3;
```
These are identifiers rather than bit counts, so is it correct that two stop bits end up being counted as 3 bits in the packet length?
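If that is the case, one possible fix would be to map the constant to the real on-wire bit count before using it in the packet-length calculation. A hedged sketch (the helper name is hypothetical; I am assuming the jSerialComm SerialPort constants quoted above):

```java
import com.fazecast.jSerialComm.SerialPort;

final class StopBitUtil {

    // Hypothetical helper: translate the getNumStopBits() constant into
    // the number of stop bits actually transmitted on the wire.
    static double stopBitsOnWire(SerialPort port) {
        switch (port.getNumStopBits()) {
            case SerialPort.ONE_STOP_BIT:              // constant value 1
                return 1.0;
            case SerialPort.ONE_POINT_FIVE_STOP_BITS:  // constant value 2
                return 1.5;
            case SerialPort.TWO_STOP_BITS:             // constant value 3
                return 2.0;
            default:
                return 1.0; // conservative fallback
        }
    }
}
```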
Expected Behavior
The getInterFrameDelay method has to return microsecond values, as its Javadoc states, so that the correct inter-frame delay is applied.
Actual Behavior
With a 19200 baudrate the inter-frame delay is effectively rounded to 0: the method returns 2, which is then treated as 2 µs instead of the expected ~2000 µs.
Specifications
- Version: 2.5.5 (the bug also seems to be present in the development branch)
- Platform:
- Subsystem: