Activation function

Time: 2021-04-16 19:02:33

https://en.wikipedia.org/wiki/Rectifier_(neural_networks)

In artificial neural networks, the rectifier is an activation function defined as follows:

rectified linear unit (ReLU): $f(x) = x^+ = \max(0, x)$
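
As a minimal sketch (the function name relu is our own, not from any library), ReLU can be written in NumPy like this:

```python
import numpy as np

def relu(x):
    # Rectified linear unit: f(x) = max(0, x), applied elementwise
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))  # negative inputs clamp to 0; positive inputs pass through
```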

Its smooth approximation is:

softplus function: $f(x) = \ln(1 + e^x)$

Its derivative is:

$f'(x) = \frac{e^x}{1 + e^x} = \frac{1}{1 + e^{-x}}$

That is, the logistic function.
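
A small sketch of softplus and its derivative (the names softplus and sigmoid are our own; np.logaddexp is used so the computation stays stable for large inputs):

```python
import numpy as np

def softplus(x):
    # f(x) = ln(1 + e^x), computed as log(e^0 + e^x) for numerical stability
    return np.logaddexp(0.0, x)

def sigmoid(x):
    # Derivative of softplus: the logistic function 1 / (1 + e^{-x})
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-3.0, 3.0, 7)
print(softplus(x))
print(sigmoid(x))  # matches the slope of softplus at each x
```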

Noisy ReLUs: these have been applied with good results in restricted Boltzmann machines.

$f(x) = \max(0, x + Y)$, where $Y \sim \mathcal{N}(0, \sigma(x))$
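
A minimal sketch of a noisy ReLU; for simplicity the noise scale sigma is a fixed hyperparameter here (our assumption), while the general form lets it depend on x:

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_relu(x, sigma=0.1):
    # f(x) = max(0, x + Y), Y ~ N(0, sigma); Gaussian noise added before rectifying
    noise = rng.normal(0.0, sigma, size=np.shape(x))
    return np.maximum(0.0, x + noise)

x = np.array([-1.0, 0.0, 1.0])
print(noisy_relu(x))  # stochastic: output varies from call to call
```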

Leaky ReLUs: Leaky ReLUs allow a small, positive gradient when the unit is not active.

$f(x) = \begin{cases} x & x > 0 \\ 0.01x & \text{otherwise} \end{cases}$
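
A minimal NumPy sketch of the leaky ReLU (the function name and the slope keyword are our own):

```python
import numpy as np

def leaky_relu(x, slope=0.01):
    # f(x) = x if x > 0, else slope * x: a small gradient survives when inactive
    return np.where(x > 0, x, slope * x)

x = np.array([-2.0, -0.5, 1.0])
print(leaky_relu(x))  # negative inputs are scaled by 0.01 instead of zeroed
```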

Parametric ReLUs: making the 0.01 coefficient a learnable parameter $a$ yields the parametric form of ReLUs.

$f(x) = \begin{cases} x & x > 0 \\ ax & \text{otherwise} \end{cases}$

If $a \le 1$, this is equivalent to

$f(x) = \max(x, ax)$
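
A small sketch of the parametric ReLU that also checks the $\max(x, ax)$ equivalence for $a \le 1$ (the name prelu is our own; in practice $a$ would be a learned parameter):

```python
import numpy as np

def prelu(x, a):
    # Parametric ReLU: f(x) = x if x > 0, else a * x
    return np.where(x > 0, x, a * x)

x = np.array([-2.0, 0.5, 3.0])
a = 0.25
print(prelu(x, a))
# For a <= 1 the piecewise form equals max(x, a*x):
print(np.maximum(x, a * x))
```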

ELUs: Exponential linear units try to make the mean activations closer to zero, which speeds up learning. ELUs have been shown to obtain higher classification accuracy than ReLUs.

$f(x) = \begin{cases} x & x > 0 \\ a(e^x - 1) & \text{otherwise} \end{cases}$
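
Finally, a minimal ELU sketch (the name elu and the default $a = 1$ are our assumptions):

```python
import numpy as np

def elu(x, a=1.0):
    # f(x) = x if x > 0, else a * (e^x - 1); the negative branch saturates
    # at -a, pulling mean activations toward zero
    return np.where(x > 0, x, a * (np.exp(x) - 1.0))

x = np.array([-3.0, -1.0, 0.0, 2.0])
print(elu(x))  # smooth negative values instead of a hard zero
```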