How does the loss function for grade task work (CNN-only)?
omniaalwazzan opened this issue · 3 comments
Hi, Richard. Your code has inspired me greatly:)
I'm not sure how you trained the vgg_19_bn to classify the grade. As far as I can see in your customised CNN, the output neuron is one (with a value ranging roughly from -3 to 3); correct me if I'm wrong. However, in the CSV file, the grade has three labels: [0, 1, 2].
So, as far as I understand, the CNN model's output shape will be [batch_size, C], where C = 1 in this code.
For instance, with batch_size = 2:

```python
model = vgg()
output = model(image)
# output.shape == torch.Size([2, 1])
# output values: [[0.09075058], [0.10227629]]
```
However, this command confuses me (note `--label_dim 3`):

```shell
python train_cv.py --exp_name grad_15 --task grad --mode path --model_name path --niter 0 --niter_decay 50 --batch_size 8 --lr 0.0005 --reg_type none --lambda_reg 0 --act LSM --label_dim 3 --gpu_ids 0
```
If the label dim = 3, how does the loss function work?
```python
loss_nll = F.nll_loss(pred, grade)
```
In this case, pred and grade have different shapes.
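(For context, the shape mismatch here is actually what `F.nll_loss` expects: predictions of shape `[N, C]` as log-probabilities and integer targets of shape `[N]`. A minimal sketch with made-up values:)

```python
import torch
import torch.nn.functional as F

# Hypothetical batch of 2 with C = 3 grade classes.
pred = torch.log_softmax(torch.randn(2, 3), dim=1)  # [N, C] log-probabilities
grade = torch.tensor([0, 2])                        # [N] integer class labels

loss = F.nll_loss(pred, grade)  # scalar: mean negative log-likelihood
```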
Is there anything I'm missing or don't understand?
Thanks in advance,
Omnia
Output neuron is 1 for the Cox Loss, but 3 for the GBMLGG grading task. You can see here where the last layer gets defined, which will vary based on the task.
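(Editor's note: a minimal sketch of the pattern described above. The feature width of 32 is made up for illustration; the point is that `--label_dim` sets the output width and `--act LSM` pairs a `LogSoftmax` with `F.nll_loss` downstream, while the Cox loss uses a single output neuron instead.)

```python
import torch
import torch.nn as nn

label_dim = 3  # 3 for the GBMLGG grading task; 1 for the Cox survival loss

# Hypothetical classifier head: penultimate features -> label_dim log-probabilities
head = nn.Sequential(nn.Linear(32, label_dim), nn.LogSoftmax(dim=1))

features = torch.randn(2, 32)  # stand-in for the CNN's penultimate features
log_probs = head(features)     # shape [2, 3]; exp of each row sums to 1
```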
Oh my bad 😥
I'd read network.py from start to finish and didn't notice this 🤦🏻‍♀️
Thanks a lot for the prompt response.
No worries! Happy to help :)