hhiim/Lacan

Error when using the model after training

Charley-xiao opened this issue · 1 comment

Hi, after training, I get the following error when using the model:

  0%|          | 0/1000 [00:00<?, ?it/s]
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Cell In[11], line 5
      1 # Run this cell to become a master psychoanalyst!
      2 # It does produce new sentences, but it's badly overfit...
      3 # Nothing more to say; it just needs more data!
      4 import using
----> 5 using.eval()

File [d:\Lacan\using.py:34](file:///D:/Lacan/using.py:34), in eval()
     32 for i in tqdm(range(count)):
     33     x = torch.Tensor(data).to(device)
---> 34     y = Model(x)[0][-1]
     35     y = y.to("cpu")
     36     p = y.detach().numpy().reshape((-1,))

File [c:\Users\Charley\miniconda3\envs\pytorch\lib\site-packages\torch\nn\modules\module.py:1501](file:///C:/Users/Charley/miniconda3/envs/pytorch/lib/site-packages/torch/nn/modules/module.py:1501), in Module._call_impl(self, *args, **kwargs)
   1496 # If we don't have any hooks, we want to skip the rest of the logic in
   1497 # this function, and just call forward.
   1498 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
   1499         or _global_backward_pre_hooks or _global_backward_hooks
   1500         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1501     return forward_call(*args, **kwargs)
   1502 # Do not call functions when jit is used
   1503 full_backward_hooks, non_full_backward_hooks = [], []

File [d:\Lacan\net.py:23](file:///D:/Lacan/net.py:23), in myModle.forward(self, x)
     21 h0 = torch.zeros(4, x.shape[0], self.hidden_size, device=device)
     22 c0 = torch.zeros(4, x.shape[0], self.hidden_size, device=device)
---> 23 out, (_, _) = self.LSTM(x, (h0, c0))
     24 out = self.Linear(out)
     25 return out

File [c:\Users\Charley\miniconda3\envs\pytorch\lib\site-packages\torch\nn\modules\module.py:1501](file:///C:/Users/Charley/miniconda3/envs/pytorch/lib/site-packages/torch/nn/modules/module.py:1501), in Module._call_impl(self, *args, **kwargs)
   1496 # If we don't have any hooks, we want to skip the rest of the logic in
   1497 # this function, and just call forward.
   1498 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
   1499         or _global_backward_pre_hooks or _global_backward_hooks
   1500         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1501     return forward_call(*args, **kwargs)
   1502 # Do not call functions when jit is used
   1503 full_backward_hooks, non_full_backward_hooks = [], []

File [c:\Users\Charley\miniconda3\envs\pytorch\lib\site-packages\torch\nn\modules\rnn.py:812](file:///C:/Users/Charley/miniconda3/envs/pytorch/lib/site-packages/torch/nn/modules/rnn.py:812), in LSTM.forward(self, input, hx)
    810 self.check_forward_args(input, hx, batch_sizes)
    811 if batch_sizes is None:
--> 812     result = _VF.lstm(input, hx, self._flat_weights, self.bias, self.num_layers,
    813                       self.dropout, self.training, self.bidirectional, self.batch_first)
    814 else:
    815     result = _VF.lstm(input, batch_sizes, hx, self._flat_weights, self.bias,
    816                       self.num_layers, self.dropout, self.training, self.bidirectional)

RuntimeError: Expected sequence length to be larger than 0 in RNN

What could be causing this? Thanks!

Found the cause: the seed sentence used to start generation must contain at least one character that appears in the training corpus.
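
For anyone hitting the same error: characters that never appear in the training corpus have no vocabulary index, so they get dropped when the seed is encoded, leaving a length-0 sequence for the LSTM. Below is a minimal sketch of a guard against this, assuming a hypothetical `word2idx` char-to-index mapping standing in for whatever table using.py builds; it is not the repo's actual code:

```python
def encode_seed(seed: str, word2idx: dict) -> list:
    # Characters missing from the training vocabulary have no index,
    # so they are silently dropped here.
    ids = [word2idx[ch] for ch in seed if ch in word2idx]
    if not ids:
        # An empty list would become a tensor with sequence length 0,
        # which is exactly what raises the RuntimeError above.
        raise ValueError("Seed shares no characters with the training "
                         "corpus; the model would get an empty sequence.")
    return ids
```

Equivalently, one could check that the sequence dimension of `x` is nonzero (e.g. `x.shape[1] > 0` if the LSTM is batch-first) right before calling `Model(x)` in using.py.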