Sometimes the default standard activations like ReLU, tanh, and softmax, and the advanced activations like LeakyReLU, aren't enough. The one you need might not be in keras-contrib either.
How do you create your own activation function?
1 Answer
#1
Credits to this Github issue comment by Ritchie Ng.
from keras import backend as K
from keras.layers import Activation
from keras.utils.generic_utils import get_custom_objects

def custom_activation(x):
    # scaled sigmoid: outputs lie in the open interval (-1, 4)
    return (K.sigmoid(x) * 5) - 1

# register the function under a name so it can also be used as Activation('custom_activation')
get_custom_objects().update({'custom_activation': Activation(custom_activation)})

model.add(Activation(custom_activation))
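As a quick sanity check, the same formula can be evaluated in plain Python with no Keras installed; `math.exp` stands in for `K.sigmoid` here:

```python
import math

def custom_activation(x):
    # same formula as the Keras version: scaled sigmoid, output in (-1, 4)
    sigmoid = 1.0 / (1.0 + math.exp(-x))
    return (sigmoid * 5) - 1

print(custom_activation(0.0))  # 1.5, since sigmoid(0) = 0.5
```

Since the sigmoid is bounded in (0, 1), scaling by 5 and shifting by -1 gives an activation bounded in (-1, 4); adjust the constants if you need a different output range.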
Please keep in mind that you have to import this function when you save and restore the model. See the note in keras-contrib.