Softmax Loss
- Layer type: SoftmaxWithLoss
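SoftmaxWithLoss computes the multinomial logistic loss of the softmax of its input scores. A minimal prototxt sketch (the bottom blob names `pred` and `label` are assumptions, not fixed by the layer):

```protobuf
# hypothetical blob names "pred" and "label"
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "pred"
  bottom: "label"
  top: "loss"
}
```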
Sum-of-Squares / Euclidean Loss
- Layer type: EuclideanLoss
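EuclideanLoss computes the sum of squared differences of its two inputs, 1/(2N) * Σ ||x1 - x2||². A sketch (blob names are assumptions; both bottoms must have the same shape):

```protobuf
# hypothetical blob names; both bottoms must have the same shape
layer {
  name: "loss"
  type: "EuclideanLoss"
  bottom: "pred"
  bottom: "target"
  top: "loss"
}
```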
Hinge / Margin Loss
- Layer type: HingeLoss
- CPU implementation: ./src/caffe/layers/hinge_loss_layer.cpp
- CUDA GPU implementation: none
- Parameter norm [default L1]: the norm type; currently L1 and L2 are supported
- Input: n * c * h * w predicted scores
- Input: n * 1 * 1 * 1 ground-truth labels
- Output: 1 * 1 * 1 * 1 computed loss
```protobuf
# L1 Norm
layer {
  name: "loss"
  type: "HingeLoss"
  bottom: "pred"
  bottom: "label"
}
```

```protobuf
# L2 Norm
layer {
  name: "loss"
  type: "HingeLoss"
  bottom: "pred"
  bottom: "label"
  top: "loss"
  hinge_loss_param {
    norm: L2
  }
}
```
The hinge loss layer computes a one-vs-all hinge or squared hinge loss.
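Conceptually, for N samples and C classes the loss is (1/N) Σₙ Σₖ max(0, 1 - δ·tₙₖ)ᵖ, where δ is +1 for the true class and -1 otherwise, and p is 1 (L1 norm) or 2 (L2 norm). A plain-Python sketch of this formula (not Caffe's actual implementation):

```python
def hinge_loss(pred, labels, p=1):
    """One-vs-all (squared) hinge loss.

    pred: list of per-sample score lists (N x C)
    labels: list of N true class indices
    p: 1 for L1 hinge, 2 for squared (L2) hinge
    """
    total = 0.0
    for scores, y in zip(pred, labels):
        for k, t in enumerate(scores):
            delta = 1.0 if k == y else -1.0  # +1 for the true class, -1 otherwise
            total += max(0.0, 1.0 - delta * t) ** p
    return total / len(pred)
```

With a confident correct prediction (large positive score for the true class, negative for the rest) every margin term is zero and the loss vanishes.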
Sigmoid Cross-Entropy Loss
- Layer type: SigmoidCrossEntropyLoss
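This layer computes the cross-entropy -[t·log σ(x) + (1-t)·log(1-σ(x))] of the sigmoid of the logits. A plain-Python sketch in the standard numerically stable form (Caffe fuses the sigmoid and the loss for the same stability reason; this is an illustration, not its actual code):

```python
import math

def sigmoid_cross_entropy(logits, targets):
    """Mean of -[t*log(sigmoid(x)) + (1-t)*log(1-sigmoid(x))],
    rewritten as max(x, 0) - x*t + log(1 + exp(-|x|)) so that
    exp() never overflows for large |x|."""
    total = 0.0
    for x, t in zip(logits, targets):
        total += max(x, 0.0) - x * t + math.log1p(math.exp(-abs(x)))
    return total / len(logits)
```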
Infogain Loss
- Layer type: InfogainLoss
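InfogainLoss generalizes the multinomial logistic loss with an information-gain matrix H (with H = identity it reduces to the plain multinomial logistic loss). A sketch, with hypothetical blob names and a hypothetical path to the H matrix file:

```protobuf
layer {
  name: "loss"
  type: "InfogainLoss"
  bottom: "prob"
  bottom: "label"
  top: "loss"
  infogain_loss_param {
    source: "infogain_matrix.binaryproto"  # hypothetical path to the H matrix
  }
}
```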
Accuracy and Top-k
- Layer type: Accuracy
Accuracy scores the network's output against the target labels. It is not actually a loss layer, so it has no backward step.
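Top-k accuracy counts a sample as correct if its true label is among the k highest-scoring classes (k = 1 gives ordinary accuracy). A plain-Python sketch of the metric (not Caffe's implementation):

```python
def topk_accuracy(pred, labels, k=1):
    """Fraction of samples whose true label is among the k highest scores.

    pred: list of per-sample score lists (N x C)
    labels: list of N true class indices
    """
    hits = 0
    for scores, y in zip(pred, labels):
        # indices of the k highest-scoring classes
        topk = sorted(range(len(scores)), key=lambda i: -scores[i])[:k]
        hits += y in topk
    return hits / len(pred)
```

In prototxt the Accuracy layer selects k through `accuracy_param { top_k: 5 }`.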