PandABlocks/PandABlocks-client

Make HDF datasets always be float64 unless they are PCAP.BITS

Closed this issue · 0 comments

At the moment, in SCALED mode the datasets are always the native type, double (float64). In RAW mode this doesn't work, as the values are transported as int32 and then multiplied by scale and offset. The current code only turns them into double some of the time:

if raw and (field.capture == "Mean" or field.scale != 1 or field.offset != 0):

However, this makes the data difficult to handle downstream, as changing an offset might change the datatype of the field.

Agreed: we should always write double, except for the PCAP.BITS fields. Fortunately we can detect those by the absence of scale, offset and units in the header, so we can set them to None here:

scale=float(field.get("scale", 1)),
offset=float(field.get("offset", 0)),
units=str(field.get("units", "")),
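A minimal sketch of that parsing change, assuming `field` is the dict parsed from the header (the function name is hypothetical; the key point is that missing keys become None rather than defaulting):

```python
# Sketch: leave scale/offset/units as None when the header omits them,
# which marks the field as PCAP.BITS.
def parse_scaling(field: dict):
    scale = float(field["scale"]) if "scale" in field else None
    offset = float(field["offset"]) if "offset" in field else None
    units = str(field["units"]) if "units" in field else None
    return scale, offset, units
```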

which needs a typing change to add Optional here:

scale: float = 1.0
offset: float = 0.0
units: str = ""
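With Optional added, the field description might look like this (a sketch assuming a dataclass-style definition; the class name and any other fields are illustrative):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FieldCapture:  # hypothetical name for illustration
    # None means the header carried no scale/offset/units,
    # i.e. the field is PCAP.BITS and should keep its raw dtype
    scale: Optional[float] = None
    offset: Optional[float] = None
    units: Optional[str] = None
```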

then we can check that they are not None here:

if raw and (field.capture == "Mean" or field.scale != 1 or field.offset != 0):

I don't know what the datatypes in the unit tests are, so those might need changing too...
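A hedged sketch of the assertion the updated unit tests might need (names are illustrative, not taken from the actual test suite):

```python
import numpy as np

def check_dataset_dtypes(datasets: dict) -> None:
    # Illustrative check: every dataset should be float64 except
    # PCAP.BITS ones, which keep their native integer dtype.
    for name, array in datasets.items():
        if name.startswith("PCAP.BITS"):
            assert array.dtype != np.float64
        else:
            assert array.dtype == np.float64
```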