jamesfining/scte

Small bug in read_splice_event for SCTE104 packets?

Closed this issue · 2 comments

Hi,
First of all, thanks for providing the module, it saved me a lot of time trying to read the live VANC packets of our SDI analyzer.

I got some KeyError exceptions while trying to decode SCTE104 packets.

In read_splice_event.py you read the value as: `request["segmentation_type_id"] = {"decimal": bitarray_data.read("uint:8")}`

This value is then matched to the enums:
`request["segmentation_type_id"]["name"] = scte104_enums.get_segmentation_type(request["segmentation_type_id"]["decimal"])`

But actually, they are not decimals; they are still hex values. For example, a Program Start with hex value 0x10 gets translated to "10". You then try to translate that to "16" in scte104_enums.py, which isn't provided. The code advances but then crashes with a KeyError because "segment_num" isn't available in the dictionary.

When we talk about those codes in the broadcast world, we use the hex values; I've never seen them translated to decimal.

I think part of the problem is that Python casts the number to an integer. In the function get_segmentation_type(type_id) (scte104_enums.py) you never convert the "integer" (which is actually hex) to decimal, which is why a 0x10 never matched a "16".
I got it working by changing the values in scte104_enums.py to the SCTE35 list. (e.g. "Program Start" is 10 instead of 16).
I slightly changed it to:
`request["segmentation_type_id"] = {"hex": bitarray_data.read("uint:8")}`
`request["segmentation_type_id"]["name"] = scte104_enums.get_segmentation_type(request["segmentation_type_id"]["hex"])`
so the message makes more sense to me.
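To sketch what I mean: bitstring's `read("uint:8")` returns a plain Python int, so a 0x10 byte comes back as decimal 16, and a lookup table keyed by 10 will miss it. A minimal standard-library sketch (the enum excerpt below is hypothetical, mirroring the SCTE-35 values):

```python
# Sketch of the decimal/hex confusion: an unsigned 8-bit read of the byte
# 0x10 yields the plain Python int 16, just like bitstring's read("uint:8").
raw = bytes.fromhex("10")          # one byte, hex 0x10
type_id = raw[0]                   # equivalent int: 16

# Hypothetical excerpt of a segmentation type table keyed by decimal value.
SEGMENTATION_TYPES = {16: "Program Start", 17: "Program End"}

print(type_id)                           # 16
print(SEGMENTATION_TYPES.get(type_id))   # Program Start
print(SEGMENTATION_TYPES.get(10))        # None -- a table keyed by 10 misses
```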

This is an example SCTE104 packet from production: ffff0050000077000000020838071202010400021f40010b00360000000000001e0f2461653965333639332d303433372d346632322d623761662d3166353937636330633061631e0000000000000000
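A quick way to sanity-check the dump with nothing but the standard library; note that decimal 30 is 0x1e while hex 0x30 is decimal 48 (reading the leading 0x0050 bytes as the message size is my assumption):

```python
# Inspect the raw SCTE-104 dump from production. Adjacent string literals
# concatenate, so the long hex string can be split for readability.
packet = bytes.fromhex(
    "ffff0050000077000000020838071202010400021f40010b0036000000000000"
    "1e0f2461653965333639332d303433372d346632322d623761662d3166353937"
    "636330633061631e0000000000000000"
)
print(len(packet))   # 80 bytes, matching 0x0050 (assuming that is the size)
print(0x1e, 0x30)    # 30 48 -- decimal 30 is 0x1e, hex 0x30 is decimal 48
```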

I'm open to discussion if I'm interpreting your code incorrectly or if I'm interpreting the standard incorrectly.
Kind regards

To complement my question, I dug a little deeper into the bitstring documentation. You seem to cast the segmentation_type_id correctly. It seems my automation system is sending "1e" instead of "0x30". I'll need to check with them.

I'm sorry, it seems your code is totally correct. The driver on the automation side expected me to fill in the decimal value instead of the hex value to correctly mark the segmentation_type_id. So I should have entered 16, which your code decodes, instead of 10.
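For completeness, the fix on my side in one line: typing the decimal value 16 into the driver puts the intended 0x10 on the wire, while my earlier 10 produced 0x0a (a hypothetical sanity check, not code from the module):

```python
# The driver field expects decimal: "Program Start" (0x10) must be entered
# as 16. Entering 10 instead puts the byte 0x0a on the wire.
correct = bytes([16])
wrong = bytes([10])
print(correct.hex(), wrong.hex())   # 10 0a
```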
My apologies for raising an incorrect issue.