The loss value is negative
Opened this issue · 1 comment
tianlili1 commented
Excuse me, author. I ran into a problem when using the mutual information loss: its value is negative at the beginning of training. Is this normal?
DuaneNielsen commented
Hi, sorry for the late reply. Because the loss function itself is a neural net, it's possible for the loss to start out negative.
From memory, the loss is a log probability, so a negative loss just means the probability is less than one, which would be expected.
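To illustrate the point above: if the loss is a log probability, then any probability below 1.0 maps to a negative number, so negative values early in training are expected rather than a bug. This is a minimal sketch of that relationship using only the standard library; the probability values are arbitrary examples, not taken from the actual model.

```python
import math

# A log-probability "loss" is negative whenever the probability < 1.0.
# These probabilities are illustrative placeholders.
for p in [0.9, 0.5, 0.1]:
    log_p = math.log(p)
    print(f"p = {p:>4}  ->  log(p) = {log_p:.4f}")

# Only a probability of exactly 1.0 gives log(p) == 0; anything
# smaller is negative, so the sign alone carries no alarm.
```

So a loss that starts negative and becomes less negative (moving toward zero) can still indicate normal training progress.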