griegler/octnet

Error in Example 00_create_data (after successfully running example 01)

fcontijoch opened this issue · 6 comments

Hello,

I'm getting an error when I try to run example/00. I was able to successfully run example/01_classification_modelnet, so I believe it's related to some of the dense portions of the code (which is what I'm interested in).

The error I get is:
octnet/example/00_create_data$ python2 create_data.py

Traceback (most recent call last):
File "create_data.py", line 33, in <module>
oc_from_dense1 = pyoctnet.Octree.create_from_dense(dense, val_range, n_threads=n_threads)
File "pyoctnet.pyx", line 654, in pyoctnet.Octree.create_from_dense (pyoctnet.cpp:8805)
def create_from_dense(cls, float[:,:,:,::1] dense, bool fit=False, int fit_multiply=1, bool pack=False, int n_threads=1):
ValueError: Buffer has wrong number of dimensions (expected 4, got 3)

Modifying line 27 to add a 4th dimension with:
dense = np.zeros((vx_res,vx_res,vx_res,1), dtype=np.float32)

Leads to the following error:

Traceback (most recent call last):
File "create_data.py", line 33, in <module>
oc_from_dense1 = pyoctnet.Octree.create_from_dense(dense, val_range, n_threads=n_threads)
File "pyoctnet.pyx", line 654, in pyoctnet.Octree.create_from_dense (pyoctnet.cpp:8807)
def create_from_dense(cls, float[:,:,:,::1] dense, bool fit=False, int fit_multiply=1, bool pack=False, int n_threads=1):
ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()

Thanks for reporting. The function 'create_from_dense' changed a while ago and I did not update the example code until now. The four dimensions are n_features x depth x height x width.
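
For anyone else hitting this, a minimal sketch of the updated layout (the voxel resolution and occupied region here are just placeholders, and the pyoctnet call is left commented since it requires the compiled extension):

```python
import numpy as np

vx_res = 64  # placeholder resolution

# New layout: n_features x depth x height x width.
# One feature channel here, e.g. plain occupancy.
dense = np.zeros((1, vx_res, vx_res, vx_res), dtype=np.float32)
dense[0, 20:40, 20:40, 20:40] = 1.0  # mark a block of voxels as occupied

# The actual conversion (sketch, requires pyoctnet):
# oc = pyoctnet.Octree.create_from_dense(dense, n_threads=1)
print(dense.shape, dense.flags['C_CONTIGUOUS'])
```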

Thanks.

With the new changes, it looks like the val_range is no longer used in Line 33. Is that no longer supported?

Following up again, running the code now gives a new error:

octnet/example/00_create_data$ python2 create_data.py
Traceback (most recent call last):
File "create_data.py", line 33, in <module>
oc_from_dense1 = pyoctnet.Octree.create_from_dense(dense[np.newaxis], n_threads=n_threads)
File "pyoctnet.pyx", line 654, in pyoctnet.Octree.create_from_dense (pyoctnet.cpp:8805)
def create_from_dense(cls, float[:,:,:,::1] dense, bool fit=False, int fit_multiply=1, bool pack=False, int n_threads=1):
File "stringsource", line 616, in View.MemoryView.memoryview_cwrapper (pyoctnet.cpp:23312)
File "stringsource", line 323, in View.MemoryView.memoryview.cinit (pyoctnet.cpp:19681)
ValueError: ndarray is not C-contiguous

val_range is obsolete in the new implementation. Voxels are pooled if their values are equal. If you need more fine-grained control, you can use the second version, where you pass the structure and the values in separate arrays.
Ok, I tested the sample only with Python3. If you add a .copy() to your array that you pass to the function it should work.
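
To illustrate the contiguity issue: the memoryview signature `float[:,:,:,::1]` requires a C-contiguous buffer, and strided views (e.g. after a transpose) are not. A `.copy()` (or `np.ascontiguousarray`) materialises a fresh C-ordered buffer; the pyoctnet call is commented as a sketch:

```python
import numpy as np

dense = np.zeros((64, 64, 64), dtype=np.float32)

# A transposed view keeps the same data but reversed strides, so it is
# no longer C-contiguous -- this is what triggers the
# "ndarray is not C-contiguous" ValueError.
view = dense.transpose(2, 1, 0)
print(view.flags['C_CONTIGUOUS'])   # False

# .copy() defaults to C order and fixes the layout:
fixed = view[np.newaxis].copy()
print(fixed.flags['C_CONTIGUOUS'])  # True

# oc = pyoctnet.Octree.create_from_dense(fixed, n_threads=1)  # sketch
```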

Thanks. The copy() command worked.

I'm trying to better understand what you mean for the second version. From what I can tell from the documentation (lines 661-669 of pyoctnet.pyx), I pass along two arrays: the occupancy tensor and feature array.

The occupancy tensor has dimensions DxHxW, with non-zero values at occupied voxels. Does the value the occupied voxels have matter?

I'm also a bit confused about the feature vector. It has an additional dimension f which I'm unclear how to define.

Briefly, I am trying to adapt the pipeline to process 3D images (which are X x Y x Z arrays with integer intensity values at all voxel positions). I believe the first implementation (exact match in values) may be too strict a criterion and lead to unnecessarily large octrees, which is why I'm looking into the second approach.

If the 3D array has 256 discrete intensity values, would I need to build a 256xDxHxW feature vector?

Thanks

Does the value the occupied voxels have matter?

No.

It has an additional dimension f which I'm unclear how to define.

Think of it as a 3D voxel grid with an additional dimension for the features. If you have just intensities, then this dimension will be 1; if you have color, then it should be 3. But maybe you also have normals or whatever.
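
So for an intensity volume, a sketch of the two arrays would look like this (the volume is random placeholder data, and the call name `create_from_dense2` is my reading of the structure/values variant around lines 661-669 of pyoctnet.pyx -- check the signature there before relying on it):

```python
import numpy as np

# Placeholder 3D image with discrete intensities 0..255.
D = H = W = 32
rng = np.random.default_rng(0)
intensities = rng.integers(0, 256, size=(D, H, W)).astype(np.float32)

# Structure array, DxHxW: any non-zero value marks an occupied voxel
# (the actual value does not matter, per the discussion above).
occupancy = (intensities > 0).astype(np.float32)

# Feature array, f x D x H x W with f = 1: one channel holding the raw
# intensity. No 256-channel one-hot encoding is needed.
features = intensities[np.newaxis].copy()  # C-contiguous, shape (1, D, H, W)

# Sketch of the structure/values variant (requires pyoctnet):
# oc = pyoctnet.Octree.create_from_dense2(occupancy, features, n_threads=1)
print(occupancy.shape, features.shape)
```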