When NUM_OF_CLASSES=2, the loss doesn't work?
lily10086 opened this issue · 9 comments
```python
loss = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(
    logits=logits,
    labels=tf.squeeze(annotation, squeeze_dims=[3]),
    name="entropy"))
```
When the number of classes is 2 and I try to train, the loss is not normal. How can I adapt it?
Hi,
What do you mean by "not normal"? Can you elaborate, or could you show me the loss graph?
I think tf.nn.sparse_softmax_cross_entropy_with_logits can't be used when num_classes is 2.
```python
loss = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(
    logits=logits,
    labels=tf.squeeze(annotation, squeeze_dims=[3]),
    name="entropy"))
```
According to the TensorFlow documentation, tf.nn.sparse_softmax_cross_entropy_with_logits should work fine in the 2-class case. Can you double-check the shapes of your arguments? And what is the error message?
https://www.tensorflow.org/versions/r0.12/api_docs/python/nn/classification
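For reference, sparse softmax cross-entropy is mathematically well defined for any `num_classes >= 2`, so nothing special should happen at two classes. Here is a minimal NumPy sketch of the computation (an illustration of the math, not the TensorFlow internals), showing the 2-class case works exactly like the multi-class one:

```python
import numpy as np

def sparse_softmax_xent(logits, labels):
    """Per-example cross-entropy from integer class labels.

    logits: float array of shape (N, num_classes)
    labels: int array of shape (N,) with values in [0, num_classes)
    """
    # Numerically stabilized log-softmax.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Pick out the log-probability of the true class for each example.
    return -log_probs[np.arange(len(labels)), labels]

# Two classes: logits of shape (N, 2), labels in {0, 1}.
logits = np.array([[2.0, -1.0], [0.5, 0.5]])
labels = np.array([0, 1])
loss = sparse_softmax_xent(logits, labels).mean()
```

The key constraint is the one the docs state: the labels must be integers strictly inside `[0, num_classes)`, and the logits must have a trailing dimension of exactly `num_classes`.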
I had no problem when training with multiple classes, but a problem appeared when training with only two classes, so I suspect this loss is not suitable for the 2-class case...
By "problem", do you mean an error is raised? What is the error message?
The other indicators, such as the metrics from the original paper, are computed; you need to switch to test mode to run them, and the results are saved to the metrics.txt file (inception_FCN.py, lines 227-272).
OK, you're welcome!