AttributeError: 'EasyDict' object has no attribute 'PCA'
xmlyqing00 opened this issue · 10 comments
Hi! When I run the code, it says AttributeError: 'EasyDict' object has no attribute 'PCA'.
Traceback (most recent call last):
File "eval.py", line 138, in <module>
model = Net()
File "/Ship03/Sources/FeatureMatching/QC-DGM/QCDGM/model.py", line 48, in __init__
self.add_module('cross_graph_{}'.format(i), nn.Linear(cfg.PCA.GNN_FEAT * 2 + 4, cfg.PCA.GNN_FEAT + 2))
AttributeError: 'EasyDict' object has no attribute 'PCA'
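If it helps, here is a minimal sketch that reproduces the same error (hypothetical config contents for illustration, not this repo's actual YAML):

from easydict import EasyDict

# A config whose 'PCA' section is missing, as in the broken file.
cfg = EasyDict({'TRAIN': {'START_EPOCH': 0}})

print(cfg.TRAIN.START_EPOCH)  # -> 0
print(cfg.PCA.GNN_FEAT)       # -> AttributeError: 'EasyDict' object has no attribute 'PCA'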
Hi! Thank you for reaching out. I just found that I submitted a wrong file, which may have caused your problem. It should be fine now!
Hi, thanks for fixing this problem. I also found some other issues:
- For the parameter --epoch k: if I want to use your pretrained model, should k be 20 for the PASCAL VOC dataset?
- In parse_args.py line 29, cfg_from_list(['TRAIN.START_EPOCH', args.epoch, 'EVAL.EPOCH', args.epoch, 'VISUAL.EPOCH', args.epoch]) is called, but VISUAL.EPOCH cannot be found in the config file, so it raises an error.
- Yes, and actually you can change the epoch number at line 51 of "./experiments/QCDGM_voc.yaml" to 20 (see the snippet after this list), and I recommend running eval.py that way. For now, I have updated the file "./experiments/QCDGM_voc.yaml", so you can download it again and run: "python eval.py --cfg ./experiments/QCDGM_voc.yaml".
- It seems weird... because I have never met this problem before. How about trying eval.py as suggested in point 1 first and seeing what happens?
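For the first point, I mean something like this in the config (a sketch of just the relevant section, using the EVAL.EPOCH key from parse_args.py):

EVAL:
  EPOCH: 20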
Thanks. Regarding the second problem: if you don't specify --epoch, it works fine, but if you do specify --epoch, the error is raised.
Indeed, you are right... Thank you very much for pointing this out! I will try my best to solve this bug :). One possible fix is sketched below.
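A sketch only, not committed code, assuming the args, cfg, and cfg_from_list objects already used in parse_args.py: forward --epoch only to config sections that actually exist.

# Sketch of a fix for parse_args.py: skip override keys whose section
# is missing from the loaded config, so --epoch no longer crashes.
if args.epoch is not None:
    overrides = []
    for key in ('TRAIN.START_EPOCH', 'EVAL.EPOCH', 'VISUAL.EPOCH'):
        if key.split('.')[0] in cfg:  # e.g. 'VISUAL' may be absent
            overrides += [key, args.epoch]
    if overrides:
        cfg_from_list(overrides)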
Thank you for your quick response; I can run the code now. I still have some questions about the code. Please correct me if I am wrong.
- You use a gradient descent method to solve Eqn 7, but I can't follow the relationship between Eqn 7 and Eqn 8. What does y do here?
- In the pseudo-code of Algorithm 1, there are two for-loops, over iter and k. My understanding is that you do some calculations on \bar{X}, and after m2 iterations you use Sinkhorn normalization to project \bar{X} to X. In your code, I found
Xnew = X + lam*(S - X)
X = Xnew
but lines 133 to 144 are hard to follow. I guess they are related to Eqn 8 and Eqn 9? Gradient calculations and the Hungarian algorithm? These lines look similar to the qc_opt function. What's the difference between them? It would be great if you could add some comments to the code LOL.
- The Sinkhorn layer seems like a kind of normalization; could you say more about it? I know the Sinkhorn algorithm for optimal transport problems with the entropy trick (see the sketch after this list for what I have in mind), but I think the Sinkhorn layer here is different from what I know.
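For reference, the entropy-regularized Sinkhorn I have in mind looks roughly like this (my own sketch, with eps as the entropy temperature; this is not code from this repo):

import torch

def sinkhorn_ot(C, eps=0.1, n_iters=50):
    # Entropy-regularized optimal transport: alternately normalize the
    # rows and columns of K = exp(-C / eps) until K is (approximately)
    # doubly stochastic.
    K = torch.exp(-C / eps)
    for _ in range(n_iters):
        K = K / K.sum(dim=1, keepdim=True)  # rows sum to 1
        K = K / K.sum(dim=0, keepdim=True)  # columns sum to 1
    return K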
Thanks again for your time.
- Hi! Actually, Eq(8) and Eq(9) together can be considered as S_new = Sinkhorn(\nabla g(X_k)), which is implemented in our code: we move the iterate X a small step along the gradient of the objective function.
- You can rewrite Xnew = X + lam*(S - X) and X = Xnew together as X = X - lam*(X - S), which is Eq(10) in our paper (see the sketch after this list).
- Yes, you are right: it can be considered a normalization. You can actually try the Sinkhorn algorithm with the entropy trick, and I believe the result may be better.
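To make the relation between the equations and the code concrete, here is a simplified sketch of the whole update. Note that grad_g here is just a placeholder for the gradient of the objective in Eq(7); the actual qc_opt computes it from the matching objective.

import torch

def sinkhorn(M, n_iters=20):
    # Row/column normalization toward the doubly-stochastic set, as in
    # the sketch above; clamping keeps the entries positive first.
    M = torch.clamp(M, min=1e-10)
    for _ in range(n_iters):
        M = M / M.sum(dim=1, keepdim=True)
        M = M / M.sum(dim=0, keepdim=True)
    return M

def qc_update(X, grad_g, lam=0.1, n_steps=10):
    # Eq(8)/Eq(9): S = Sinkhorn(grad g(X)); Eq(10): X <- X - lam*(X - S),
    # i.e. the same step as Xnew = X + lam*(S - X) in the code.
    for _ in range(n_steps):
        S = sinkhorn(grad_g(X))
        X = X - lam * (X - S)
    return X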
I read some Frank-Wolfe articles and now have a better feel for the calculations. Thanks for your explanation.
I used your 0020.pt pretrained model for the PASCAL VOC dataset. I found the results are slightly different from the numbers in the paper. Should I use these numbers or the numbers from the original paper?
aeroplane = 0.4956
bicycle = 0.6694
bird = 0.6190
boat = 0.5675
bottle = 0.8258
bus = 0.7890
car = 0.7194
cat = 0.7155
chair = 0.4294
cow = 0.6801
diningtable = 0.7749
dog = 0.6526
horse = 0.7152
motorbike = 0.6610
person = 0.4874
pottedplant = 0.9315
sheep = 0.6965
sofa = 0.6590
train = 0.8805
tvmonitor = 0.9203
average = 0.6945
Process finished with exit code 0
I think it is ok, so feel free to use either the results in the paper or the results you obtained yourself.
Great. Thanks for your time!