claudeheintz/LXESP32DMX

Cannot receive input from a DMX console

mridoni opened this issue · 7 comments

Hello, I'm developing a DMX "input-only" device. For testing (sending commands to the device) I'm using two devices who are supposed to control mine: a uDMX USB/DMX interface and a small 12-channel DMX console (Ibiza LC12DMX, this one: https://www.amazon.it/Ibiza-16-2320-LC12DMX-Dmx-Controller/dp/B076F72283).

I can successfully receive DMX input from the USB interface, but there's no way I can receive anything from the console. I connected a DMX LED controller (a commercial one) both to the console and to the USB interface, and it works flawlessly with both.

I replaced the hal-uart.c file (I also tried with the standard one), and the schematic I'm using is the one in circuit.pdf (I will add some optoisolators and some configuration switches down the road but for now I kept it simple in order to develop the software part of the device).

I tried to add a 120 Ohm termination resistor for the DMX bus but it made no difference.

Do you have any idea about what I could look into? The hardware part is quite basic, and on the software side I do not have enough experience about possible woes (read: differences of implementation in commercial devices) of the DMX protocol.

Thanks

It takes a logic analyzer or oscilloscope to view an incoming DMX signal and see whether there is a timing issue. One easy thing to try first, however, is to reverse the polarity of the incoming A/B signal lines. DMX has three incoming lines: ground, data-a, and data-b. Try reversing your connections for data-a and data-b; it might be as simple as that.

A little late, but those cheap DMX controllers tend to send only the number of channels they support - in your case 6.
Changing this

#define DMX_MIN_SLOTS 24

to a lower value might solve your problem. At least it did for me with a similar console.

Edit:
And this hardcoded value

if ( _current_slot > 24 ) {
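To keep the two thresholds from drifting apart, the hardcoded comparison could reuse the define. A hypothetical sketch (the helper name is illustrative, not the library's actual code):

```cpp
#define DMX_MIN_SLOTS 12  // lowered from 24 to match a small console

// Hypothetical helper mirroring the hardcoded comparison quoted above,
// but reusing the define so both thresholds change together.
static bool frameComplete(int current_slot) {
    return current_slot > DMX_MIN_SLOTS;
}
```

With the define lowered to 12, a frame of 13 slots (start code plus 12 channels) is enough to satisfy the check.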

Was #define DMX_MIN_SLOTS 24 just an arbitrary choice the author made?

I'm also building a DMX device (receive only) that will be 10 channels. Should I be setting some of these to lower numbers to help optimize things? (As well as that hardcoded _current_slot > 24 mentioned above?)

#define DMX_MIN_SLOTS 24
#define RDM_MAX_FRAME 257
#define DMX_MAX_SLOTS 512
#define DMX_MAX_FRAME 513

The DMX standard defines the minimum break-to-break time as 1204µs. The minimum break is 88µs and the minimum mark-after-break is 8µs. It takes 44µs to transmit a slot (4µs per bit; a start bit, 8 data bits, and two stop bits make 11 bits). (1204-(88+8))/44 ~= 25 slots, which is the start code plus 24 data slots. It is possible to transmit fewer than 24 addresses and still be legal DMX by increasing the break time; the maximum break is actually a full second, so a single address plus the start code is still a legal DMX signal. Using typical values, however, the minimum works out to be about 24.
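The arithmetic above can be checked with a small standalone sketch (timing figures taken from the explanation; this is plain C++, not library code):

```cpp
// Timing figures from the text: minimum break-to-break 1204us,
// minimum break 88us, minimum mark-after-break 8us, 4us per bit.
constexpr int kMinBreakToBreakUs = 1204;
constexpr int kMinBreakUs = 88;
constexpr int kMinMabUs = 8;
constexpr int kUsPerSlot = 11 * 4;  // start + 8 data + 2 stop bits, 4us each

// Slots that fit in the shortest legal frame: start code + data slots.
constexpr int minFrameSlots() {
    return (kMinBreakToBreakUs - (kMinBreakUs + kMinMabUs)) / kUsPerSlot;
}

static_assert(minFrameSlots() == 25, "start code + 24 data slots");
```

The integer division gives 25 slots, i.e. the start code plus 24 data slots, which matches the library's default DMX_MIN_SLOTS of 24.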

I would say go ahead and change the DMX_MIN_SLOTS to suit your needs. If it seems necessary to enough people, this could be placed in a class variable and getter/setter functions added.
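If it were moved into a class variable, it might look something like this (purely a hypothetical sketch; DMXInput, setMinSlots, and minSlots are illustrative names, not the library's actual API):

```cpp
#include <cstdint>

// Hypothetical sketch: DMX_MIN_SLOTS as a runtime setting instead of a
// compile-time constant. Names are illustrative, not the library's API.
class DMXInput {
public:
    // Ignore out-of-range requests; legal DMX frames carry 1-512 data slots.
    void setMinSlots(uint16_t n) {
        if (n >= 1 && n <= 512) {
            _min_slots = n;
        }
    }
    uint16_t minSlots() const { return _min_slots; }

private:
    uint16_t _min_slots = 24;  // default matches the library's DMX_MIN_SLOTS
};
```

A sketch could then call something like setMinSlots(10) once in setup() rather than editing the header.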

Knuth famously said premature optimization is the root of all evil. DMX has functioned well for years with many, if not most, devices sending a full frame of 512 bytes. It has been reliable on hardware that is much, much slower than today's microcontrollers. I would absolutely not worry about optimizing the refresh rate.

@claudeheintz I appreciate the detailed info. And that is true, I should not be concerning myself with optimization yet before even getting everything working. Thank you for the library.

Sorry for being late, but I was in the middle of a new revision of my board: I just ran a test and I can confirm that the suggested solution fixed my problem.

This may be related to, or a duplicate of, the issue I opened yesterday. If so I'll close my issue.
I can't get any DMX slot over 13 to read. The input test code doesn't return anything for slots 14 and above. I have a logic analyzer on the input pin (in my case GPIO 16) and can confirm that all 512 slots are being sent and the analyzer confirms the values in the slots are correct.

If anyone has an idea where I went wrong, let me know.

Edit: See issue #13 for more details, I discovered why this is happening.