
DGLError when running with dgl 1.1.1: cannot assign node feature on cuda:0 to a graph on cpu


```
(TeMP) D:\PyChrom\PythonProject\TeMP>python -u main.py -c configs/grid/icews15/config_bisargcn.json --rec-only-last-layer --use-time-embedding --post-ensemble
Test tube created git tag: tt_BiSARGCN-icews05-15-complex-5-0.1-time-embed-only-last-layer-no-dropout-not-learnable-score-ensemble-without-impute_v202307311757
gpu available: True, used: True
VISIBLE GPUS: 0

   Name                          Type         Params
0  ent_encoder                   SARGCN       1 M
1  ent_encoder.layer_1           RGCNLayer    594 K
2  ent_encoder.layer_1.dropout   Dropout      0
3  ent_encoder.layer_2           SARGCNLayer  644 K
4  ent_encoder.layer_2.dropout   Dropout      0
5  ent_encoder.layer_2.q_linear  Linear       16 K
6  ent_encoder.layer_2.v_linear  Linear       16 K
7  ent_encoder.layer_2.k_linear  Linear       16 K
```
The run then crashes during evaluation:

```
  0%|          | 0/1 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "D:\PyChrom\PythonProject\TeMP\models\BiSelfAttentionRGCN.py", line 76, in evaluate
    hist_embeddings_forward, attn_mask_forward = self.pre_forward(g_forward_batched_list, t_forward_batched_list, forward=True)
  File "D:\PyChrom\PythonProject\TeMP\models\BiSelfAttentionRGCN.py", line 38, in pre_forward
    first_per_graph_ent_embeds, second_per_graph_ent_embeds = self.get_per_graph_ent_embeds(
  File "D:\PyChrom\PythonProject\TeMP\models\SelfAttentionRGCN.py", line 55, in get_per_graph_ent_embeds
    batched_graph = self.get_batch_graph_embeds(g_batched_list_t, full, rate)
  File "D:\PyChrom\PythonProject\TeMP\models\DynamicRGCN.py", line 94, in get_batch_graph_embeds
    batched_graph.ndata['h'] = self.ent_embeds[batched_graph.ndata['id']].view(-1, self.embed_size).to(torch.device('cuda:0'))
  File "D:\SoftLocation\Anaconda12\lib\site-packages\dgl\view.py", line 99, in __setitem__
    self._graph._set_n_repr(self._ntid, self._nodes, {key: val})
  File "D:\SoftLocation\Anaconda12\lib\site-packages\dgl\heterograph.py", line 4347, in _set_n_repr
    raise DGLError(
dgl._ffi.base.DGLError: Cannot assign node feature "h" on device cuda:0 to a graph on device cpu. Call DGLGraph.to() to copy the graph to the same device.
```
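For context, the same DGLError can be reproduced outside TeMP with a few lines of DGL (a minimal sketch; the toy graph and the feature size are arbitrary):

```python
import torch
import dgl

# dgl.graph() creates the graph on the CPU by default.
g = dgl.graph(([0, 1], [1, 2]))
feat = torch.randn(g.num_nodes(), 8, device='cuda:0')

try:
    # Graph on CPU, feature on cuda:0 -> recent DGL versions refuse the assignment.
    g.ndata['h'] = feat
except dgl.DGLError as err:
    print(err)

# What the error message recommends: copy the graph to the same device first.
g = g.to(torch.device('cuda:0'))
g.ndata['h'] = feat  # now succeeds
```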

The above error occurs whenever I run with dgl version 1.1.1. Could you please help me solve it when you have time? I apologize for taking up your valuable time.
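From the error message, one possible workaround might be to copy the batched graph onto the GPU before assigning the feature in models/DynamicRGCN.py. Below is only a sketch of how the failing line could change, not a verified fix: the hard-coded cuda:0 device just mirrors the original line, and I do not know whether later code in get_batch_graph_embeds assumes a CPU graph.

```python
# models/DynamicRGCN.py, get_batch_graph_embeds, around the line in the traceback.
device = torch.device('cuda:0')

# Look up the node embeddings as before and move them to the target device.
node_embeds = self.ent_embeds[batched_graph.ndata['id']].view(-1, self.embed_size).to(device)

# Copy the graph to the same device before assigning the feature,
# as the DGLError message suggests (DGLGraph.to() returns the copied graph).
batched_graph = batched_graph.to(device)
batched_graph.ndata['h'] = node_embeds
```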