about attention core
Hi pcpLiu, thank you very much for this contribution.
I have a couple of questions:
- I ran `python main.py "../Models/benchmark_weekly/model_bd2013.pytorch" "DRA01:01" "DRB101:01" "AAYSDQATPLLLSPR"` and the only output is an IC50 value. Is this value used to decide whether the peptide can be presented, i.e. not presented if it is less than 0.5 and presented if it is greater than 0.5?
- How can I get the attention core?
Thank you very much for your help!
- Yeah, I remember that's the conventional threshold.
- The binding core weight is omitted here (https://github.com/pcpLiu/DeepSeqPanII/blob/master/code_and_dataset/main.py#L51); see the sketch after this list.
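For reference, a minimal sketch of both points. The tensor argument names are placeholders for whatever main.py actually builds, and `0.5` simply restates the conventional threshold mentioned above:

```python
# At the linked line only the IC50 prediction is kept; unpacking the
# second return value exposes the attention/core weights as well.
# (peptide_tensor, hla_a_tensor, hla_b_tensor are placeholder names,
# not the real variables in main.py.)
pred_ic50, core = model(peptide_tensor, hla_a_tensor, hla_b_tensor)

presented = pred_ic50.item() > 0.5  # conventional threshold from above
print(pred_ic50.item(), presented, core.shape)
```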
Thank you very much for your quick reply!
I changed it to `pred_ic50, core = model(...)` and ran it again:
the binding core weight is a 1×25 tensor, as shown below,
but the input sequence "AAYSDQATPLLLSPR" is only 15 residues long.
Given that, how can I get the attention core shown below?
I converted the core output tensor to a list,
but in the end the list's length is 17, which does not match the length of "AAYSDQATPLLLSPR" (15).
Is there something I didn't consider?
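One quick way to see where the extra entries come from is to compare lengths directly. A sketch, assuming `core` is the tensor unpacked above; whether the padding sits before or after the peptide positions depends on the preprocessing code:

```python
peptide = "AAYSDQATPLLLSPR"          # 15 residues
weights = core.squeeze(0).tolist()   # 1x25 tensor -> 25 values
print(len(peptide), len(weights))    # 15 vs 25: the extras are padding
```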
> The binding core weight is omitted here (https://github.com/pcpLiu/DeepSeqPanII/blob/master/code_and_dataset/main.py#L51)
I think this is due to padding, so I filtered list1 down to a new list,
and then I can use the index of 0.46840000000000004 to find the "attention core". Is that right?
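A sketch of that filtering step; the assumption that padding entries are exactly zero, and the names `weights`/`trimmed`, are mine rather than from the repo:

```python
# Drop (assumed zero-valued) padding entries and align to the peptide.
trimmed = [w for w in weights if w != 0.0][:len(peptide)]

# Index of the largest single weight, e.g. the 0.4684... value above.
max_idx = max(range(len(trimmed)), key=lambda i: trimmed[i])
print(max_idx, trimmed[max_idx])
```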
@zhangweizhenGitHub Right. When sliding the 9-length core window, in our paper we select the window with the maximum sum of weights.
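In code, that selection rule could look like the following sketch; `trimmed` stands in for the peptide-aligned weight list from earlier:

```python
def find_core(peptide, weights, core_len=9):
    """Return the core_len-mer window whose weights sum highest, plus its offset."""
    starts = range(len(peptide) - core_len + 1)
    best = max(starts, key=lambda i: sum(weights[i:i + core_len]))
    return peptide[best:best + core_len], best

# core_seq, start = find_core("AAYSDQATPLLLSPR", trimmed)
```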
Yeah, thank you for your help!