RuntimeError: start (199238348) + length (403668) exceeds dimension size (199325676).
GUYYYUG opened this issue · 1 comment
Hi,
Sorry to bother you again. I ran into a problem when using my CuO dataset to train the DimeNet network (06_dimenet.ipynb). When I built the CuO dataset, I included the 'Lattice' in props. The following error then occurs in dataset.generate_angle_list():
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
/root/autodl-tmp/lct/NeuralForceField/tutorials/train_dimenet.ipynb Cell 11 in <cell line: 1>()
----> [1](vscode-notebook-cell://ssh-remote%2Bregion-41.autodl.com/root/autodl-tmp/lct/NeuralForceField/tutorials/train_dimenet.ipynb#ch0000010vscode-remote?line=0) angles = dataset.generate_angle_list()
[2](vscode-notebook-cell://ssh-remote%2Bregion-41.autodl.com/root/autodl-tmp/lct/NeuralForceField/tutorials/train_dimenet.ipynb#ch0000010vscode-remote?line=1) angles[0]
File ~/autodl-tmp/lct/NeuralForceField/tutorials/../nff/data/dataset.py:230, in Dataset.generate_angle_list(self)
227 def generate_angle_list(self):
228 self.make_all_directed()
--> 230 angles, nbrs = get_angle_list(self.props['nbr_list'])
231 self.props['nbr_list'] = nbrs
232 self.props['angle_list'] = angles
File ~/autodl-tmp/lct/NeuralForceField/tutorials/../nff/data/graphs.py:376, in get_angle_list(nbr_lists)
374 angle_tens = torch.cat(angles)
375 mask = angle_tens[:, 0] != angle_tens[:, 2]
--> 376 angles = list(torch.split(angle_tens[mask],
377 num))
379 return angles, new_nbrs
File ~/miniconda3/envs/nff/lib/python3.9/site-packages/torch/functional.py:159, in split(tensor, split_size_or_sections, dim)
153 return handle_torch_function(
154 split, (tensor,), tensor, split_size_or_sections, dim=dim)
155 # Overwriting reason:
156 # This dispatches to two ATen functions depending on the type of
...
572 return super(Tensor, self).split_with_sizes(split_size, dim)
573 else:
--> 574 return super(Tensor, self).split_with_sizes(split_size, dim)
RuntimeError: start (199238348) + length (403668) exceeds dimension size (199325676).
I didn't modify anything in 06_dimenet.ipynb except replacing its dataset with mine. My guess is that adding the 'Lattice' to props when building my dataset causes this error, since that is the only difference from the original dataset used in 06_dimenet.ipynb.
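For context, this is roughly how I inspect the dataset after running getdata.py (a minimal sketch only, assuming the Dataset.from_file / props interface from nff.data; CuO.pth.tar and the key names come from my own getdata.py, so they may differ on your side):

```python
# Sketch of the check I ran on my dataset -- assumes nff.data.Dataset.from_file
# and a props dict keyed per geometry, as in the tutorials.
from nff.data import Dataset

dataset = Dataset.from_file('CuO.pth.tar')  # produced by my getdata.py

# Which keys were stored? The only extra one relative to 06_dimenet.ipynb
# should be the lattice I added.
print(sorted(dataset.props.keys()))

# Size check on the neighbor lists that get_angle_list() later concatenates.
nbr_lists = dataset.props['nbr_list']
print('total neighbor pairs:', sum(len(n) for n in nbr_lists))
print('first nbr_list shape:', nbr_lists[0].shape)
```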
I've uploaded my CuO dataset below; it includes CuO.npz and getdata.py.

CuO_dataset.zip

Could you please have a look at it? (Just run getdata.py to get CuO.pth.tar, then replace the original dataset in 06_dimenet.ipynb with my CuO dataset.) I'm quite confused by this error.
Hope to hear from you soon, thanks a lot!
Unfortunately I never implemented PBC for DimeNet. I'll add an error message if somebody tries to use it with PBC. I'd suggest trying out PaiNN, which has PBC implemented and is much faster, or checking out the original DimeNet source code.
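Something along these lines is what I have in mind for the error message (a sketch only, not yet in the repo; the exact key name and where the check lands in nff/data/dataset.py may end up different):

```python
# Hypothetical guard at the top of Dataset.generate_angle_list() in
# nff/data/dataset.py. get_angle_list is the existing helper imported
# from nff.data.graphs.
def generate_angle_list(self):
    # DimeNet in this repo has no PBC support, so refuse periodic datasets
    # up front instead of failing deep inside get_angle_list().
    if any(key.lower() == 'lattice' for key in self.props):
        raise NotImplementedError(
            "Periodic boundary conditions are not supported for DimeNet. "
            "Remove the lattice from props, or use a model with PBC "
            "support such as PaiNN."
        )

    self.make_all_directed()

    angles, nbrs = get_angle_list(self.props['nbr_list'])
    self.props['nbr_list'] = nbrs
    self.props['angle_list'] = angles

    return angles
```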