Unable to find the validation dataset .txt file (same for the test set); kindly share a link
saira05 opened this issue · 9 comments
File "/home/saira/Downloads/SGPN-master/provider.py", line 110, in getDataFiles
return [line.rstrip() for line in open(list_filename)]
FileNotFoundError: [Errno 2] No such file or directory: '/media/hdd2/data/pointnet/stanfordindoor/valid_hdf5_file_list.txt'
Validation data doesn't have to be excluded from the training data. In our implementation, the validation set is a random subset of the training set.
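A minimal sketch of generating such a list, assuming a training file list already exists next to the path the traceback points to (the train list name and the 10% fraction are illustrative, not taken from the repo):

```python
import random

# Hypothetical paths; adjust to your own data directory.
# The valid list name matches the one the traceback expects.
train_list = "/media/hdd2/data/pointnet/stanfordindoor/train_hdf5_file_list.txt"
valid_list = "/media/hdd2/data/pointnet/stanfordindoor/valid_hdf5_file_list.txt"

with open(train_list) as f:
    train_files = [line.rstrip() for line in f]

random.seed(0)  # make the split reproducible
# Pick ~10% of the training files as the validation subset (arbitrary fraction).
valid_files = random.sample(train_files, max(1, len(train_files) // 10))

with open(valid_list, "w") as f:
    f.write("\n".join(valid_files) + "\n")
```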
And what about test data?
In our experiments, we use cross-validation. For example, if we use Area 6 as the test data, Areas 1-5 will be the training data.
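A rough sketch of that split, assuming the area name appears in each HDF5 file name (e.g. "Area_6_office_20.h5"); the helper name is hypothetical:

```python
# Hypothetical helper: split a list of S3DIS HDF5 files into train/test
# by held-out area, following the cross-validation scheme described above.
def split_by_area(all_files, test_area="Area_6"):
    test = [f for f in all_files if test_area in f]
    train = [f for f in all_files if test_area not in f]
    return train, test
```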
Hello, I tried to run the code with h5 files generated by PointNet (gen_indoor3d_h5.py), but I am getting the following error. Can you please tell me how you prepared "Area_5_office_20.h5", and why your code is not accepting my h5 files?
name: Tesla K80, pci bus id: 0000:0b:00.0, compute capability: 3.7)
Model loaded in file: checkpoint/stanford_ins_seg/trained_models/epoch_200.ckpt
Traceback (most recent call last):
File "train.py", line 276, in
train()
File "train.py", line 176, in train
cur_data, cur_group, _, cur_seg = provider.loadDataFile_with_groupseglabel_stanfordindoor(cur_train_filename)
File "/home/sarshad/SGPN-master/provider.py", line 207, in loadDataFile_with_groupseglabel_stanfordindoor
group = f['pid'][:].astype(np.int32)#NxG
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "/home/sarshad/.local/lib/python3.5/site-packages/h5py/_hl/group.py", line 167, in getitem
oid = h5o.open(self.id, self._e(name), lapl=self._lapl)
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "h5py/h5o.pyx", line 190, in h5py.h5o.open
KeyError: "Unable to open object (object 'pid' doesn't exist)"
Hi @saira05, I wonder how you obtain the instance label for each point in the S3DIS dataset? I remember there are only semantic labels for each point.
Thanks a lot!
Yes @shuluoshu, you are right; that's why I am getting this error. @laughtervv, can you tell me how you obtained the instance labels, or how you prepared the dataset for your project? In the repo there is only one h5 file. Kindly help me with this; I want to use your project in my work.
Thanks
Please refer to #3
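For context, the raw S3DIS release does include instance-level annotations even though the PointNet h5 files drop them: each room folder contains an Annotations/ directory with one point file per object instance (e.g. chair_1.txt), so per-point instance ids can be recovered by concatenating those files. A rough sketch under that assumed directory layout:

```python
import glob
import os
import numpy as np

# The 13 S3DIS class names as used by PointNet's data preparation.
CLASSES = ['ceiling', 'floor', 'wall', 'beam', 'column', 'window',
           'door', 'table', 'chair', 'sofa', 'bookcase', 'board', 'clutter']
CLASS2ID = {c: i for i, c in enumerate(CLASSES)}

def load_room_with_instances(room_dir):
    """Return (points, semantic_labels, instance_labels) for one room.

    Sketch only: assumes each Annotations/*.txt holds "x y z r g b" rows
    for a single object instance, named like "chair_1.txt".
    """
    points, sem, ins = [], [], []
    ann_files = sorted(glob.glob(os.path.join(room_dir, "Annotations", "*.txt")))
    for ins_id, path in enumerate(ann_files):
        pts = np.loadtxt(path)                      # (M, 6)
        cls = os.path.basename(path).split("_")[0]  # "chair_1.txt" -> "chair"
        points.append(pts)
        sem.append(np.full(len(pts), CLASS2ID.get(cls, CLASS2ID['clutter']), np.int32))
        ins.append(np.full(len(pts), ins_id, np.int32))
    return np.concatenate(points), np.concatenate(sem), np.concatenate(ins)
```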
Hi @laughtervv
How many samples should I pick for the validation dataset?
Hi @lhiceu,
How many samples did you feed to valid.py to generate mingroupsize.txt and pergroup_thres.txt?