ilastik/lazyflow

OpPredictRandomForest fails if input image is not contiguous

burgerdev opened this issue · 1 comment

In classifierOperators.py:271, the assignment

res.shape = (prod, shape[-1])

fails for non-contiguous res. This is intended behaviour, as stated in the numpy docs. I think copying the data would be better than failing outright, so could we use

res = res.reshape((prod, shape[-1]))

here?
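
For illustration, here is a minimal numpy-only sketch (independent of lazyflow) of the difference: assigning to .shape on a non-contiguous view raises, while reshape falls back to copying the data.

    import numpy as np

    a = np.arange(12).reshape(3, 4)
    b = a.T                          # transposed view -> non-contiguous

    try:
        b.shape = (12,)              # in-place reshape requires a compatible memory layout
    except AttributeError as e:
        print("shape assignment failed:", e)

    c = b.reshape(12)                # reshape silently copies when a view is impossible
    assert not np.may_share_memory(c, a)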

Test Case:

def testRF(self):
    import numpy as np
    from lazyflow.graph import Graph
    from lazyflow.operators.classifierOperators import OpTrainRandomForest, OpPredictRandomForest

    # OpAutocontextClassification is assumed to be importable in the test
    # environment; it only serves as the parent operator here.
    parent = OpAutocontextClassification(graph=Graph())

    volShape = (100, 100, 20, 1)
    labelsShape = volShape
    featsShape = volShape[:3] + (5,)

    img = np.random.randint(0, 256, size=volShape).astype(np.uint8)
    labels = np.random.randint(1, 3, size=labelsShape).astype(np.uint8)
    features = np.zeros(featsShape)
    for f in range(featsShape[3]):
        features[..., f] = labels.squeeze() * (f+1)/featsShape[3]

    opTrain = OpTrainRandomForest(parent=parent)
    opTrain.fixClassifier.setValue(False)

    opPredict = OpPredictRandomForest(parent=parent)
    opPredict.Classifier.connect(opTrain.Classifier)
    opPredict.LabelsCount.setValue(2)

    #########
    # WORKS #
    #########

    opTrain.Images.resize(1)
    opTrain.Images[0].setValue(features)
    opPredict.Image.setValue(features)
    opTrain.Labels.resize(1)
    opTrain.Labels[0].setValue(labels)
    out = opPredict.PMaps[...].wait()

    ################
    # DOESN'T WORK #
    ################

    # make features non-contiguous
    newfeats = np.swapaxes(features, 0, 2)
    newlabels = np.swapaxes(labels, 0, 2)

    assert np.may_share_memory(newfeats, features)

    opTrain.Images.resize(1)
    opTrain.Images[0].setValue(newfeats)
    opPredict.Image.setValue(newfeats)
    opTrain.Labels.resize(1)
    opTrain.Labels[0].setValue(newlabels)
    out = opPredict.PMaps[...].wait()

Fixed in #110.