Dear * Community, I have the following loss function in keras:
return K.mean((y_true+K.epsilon()) * K.square(y_pred - y_true), axis=-1)
When I try to train my network with it (y normalized to 0-1), the loss appears to reach a negative value, which I just can't understand. I calculated the same thing with NumPy, and everything worked fine and as intended.
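For reference, a minimal sketch of that kind of NumPy check (shapes and names are made up): with y_true in [0, 1] and a nonnegative epsilon, every term of the loss is a product of nonnegative factors, so the mean cannot go negative.

import numpy as np

eps = 1e-7                                # nonnegative fuzz factor
y_true = np.random.rand(32, 10)           # simulated targets in [0, 1]
y_pred = np.random.rand(32, 10)           # simulated predictions
# NumPy version of the loss from the question
loss = np.mean((y_true + eps) * np.square(y_pred - y_true), axis=-1)
assert (loss >= 0).all()                  # holds for any y_true >= 0, eps >= 0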
I would be really delighted if someone knew the cause of these weird negative values, so thank you for your help.
1 Solution
#1
If y_true is really normalized to 0-1, the only possible cause that I see is K.epsilon(). As this page suggests, epsilon can be changed by the user, and this can cause a problem.
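To illustrate: the Keras backend fuzz factor is a mutable global, read with K.epsilon() and changed with K.set_epsilon(). A sketch of the failure mode (the negative value here is hypothetical, just to show how the loss could be pulled below zero):

from keras import backend as K

print(K.epsilon())     # default fuzz factor, 1e-07
K.set_epsilon(-0.1)    # hypothetical: if any code in the program set a negative epsilon...
print(K.epsilon())     # ...then (y_true + K.epsilon()) goes negative for small y_true,
                       # and so does the loss term it multiplies.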
Try hardcoding the epsilon value or just throwing it away.
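A minimal sketch of that suggestion, assuming the loss from the question is wrapped in a custom loss function (the name weighted_mse is made up):

from keras import backend as K

def weighted_mse(y_true, y_pred):
    eps = 1e-7  # hardcoded instead of K.epsilon(), so a changed global cannot leak in
    return K.mean((y_true + eps) * K.square(y_pred - y_true), axis=-1)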