biigle/maia

Investigate error

Closed this issue · 0 comments

mzur commented

Investigate this error:

```
Error while executing python command '/var/www/vendor/biigle/maia/src/config/../resources/scripts/novelty-detection/DetectionRunner.py /var/www/storage/maia_jobs/maia-598-novelty-detection/input.json':
2022-07-15 14:19:04.143230: I tensorflow/stream_executor/platform/default/dso_loader.cc:53] Successfully opened dynamic library libcudart.so.11.0
/var/www/vendor/biigle/maia/src/resources/scripts/novelty-detection/ImageCollection.py:56: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray.
  pca_features = np.array(list(pca_features))
TypeError: only size-1 arrays can be converted to Python scalars

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/var/www/vendor/biigle/maia/src/config/../resources/scripts/novelty-detection/DetectionRunner.py", line 130, in <module>
    runner.run()
  File "/var/www/vendor/biigle/maia/src/config/../resources/scripts/novelty-detection/DetectionRunner.py", line 67, in run
    clusters = images.make_clusters(number=self.clusters)
  File "/var/www/vendor/biigle/maia/src/resources/scripts/novelty-detection/ImageCollection.py", line 57, in make_clusters
    pca_features = PCA(n_components=2, copy=False).fit_transform(pca_features)
  File "/usr/local/lib/python3.8/dist-packages/sklearn/decomposition/_pca.py", line 369, in fit_transform
    U, S, V = self._fit(X)
  File "/usr/local/lib/python3.8/dist-packages/sklearn/decomposition/_pca.py", line 390, in _fit
    X = check_array(X, dtype=[np.float64, np.float32], ensure_2d=True,
  File "/usr/local/lib/python3.8/dist-packages/sklearn/utils/validation.py", line 531, in check_array
    array = np.asarray(array, order=order, dtype=dtype)
ValueError: setting an array element with a sequence.
```

It occurred in volume 6803.
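
For context, the `VisibleDeprecationWarning` and the downstream `ValueError` likely share one cause: `pca_features` seems to be a ragged sequence (per-image feature vectors of differing lengths), which NumPy cannot pack into the regular 2-D float array that `PCA.fit_transform` requires via `check_array`. A minimal sketch of the failure and one possible workaround — the `to_uniform_matrix` helper is hypothetical, not part of MAIA, and padding/truncating may not be the right fix for the underlying data issue:

```python
import numpy as np

# Reproduce the failure: a ragged list of vectors cannot be coerced
# into a regular 2-D float array. The exact message varies by NumPy
# version, but a ValueError is raised either way.
ragged = [np.ones(3), np.ones(5)]
try:
    np.asarray(ragged, dtype=np.float64)
except ValueError as e:
    print("ValueError:", e)


def to_uniform_matrix(features, length=None):
    """Hypothetical helper: pad (with zeros) or truncate each feature
    vector to a common length so the result is a regular
    (n_samples, length) float array suitable for PCA."""
    features = [np.asarray(f, dtype=np.float64).ravel() for f in features]
    if length is None:
        length = max(f.size for f in features)
    out = np.zeros((len(features), length))
    for i, f in enumerate(features):
        n = min(f.size, length)
        out[i, :n] = f[:n]
    return out
```

Whether padding is acceptable depends on why the vectors differ in length in the first place; if the feature extractor should always emit fixed-size vectors, the ragged input itself is the bug to chase.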