keras-layer-normalization-rnn: Layer-normalized LSTM and GRU implementations for Keras

Time: 2024-05-22 09:25:36
【File Attributes】:

File name: keras-layer-normalization-rnn: Layer-normalized LSTM and GRU implementations for Keras

File size: 63KB

File format: ZIP

Update time: 2024-05-22 09:25:36

Language: Python

Contents: As described there, this package extends the standard Keras LSTM and GRU layers with layer normalization. Usage example: these layers can be used just as easily as the ordinary ones:

    from LayerNormalizationRNN import LSTM, GRU

    inputs = Input(shape=(maxlen,))
    x = Embedding(max_features, 128)(inputs)
    x = LSTM(64,
             layer_to_normalize=("input", "output", "recurrent"),
             normalize_seperately=True)(x)
    # x = GRU(64, layer_to_normalize=("input_gate", "input_recurrent", "recurrent_gate", "recurrent_rec

(The last line is truncated in the source listing; `normalize_seperately` is the parameter name as spelled by the package.)
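For context, the layer-normalization step these layers apply inside the recurrent cell can be sketched as follows. This is a minimal standalone version for illustration only, not the package's actual code (which lives in `LayerNormalizationRNN.py`); the function name and signature here are hypothetical.

```python
import numpy as np

def layer_norm(x, gain, bias, eps=1e-5):
    """Normalize each sample across its feature dimension,
    then apply a learned per-feature gain and bias."""
    mean = x.mean(axis=-1, keepdims=True)
    std = x.std(axis=-1, keepdims=True)
    return gain * (x - mean) / (std + eps) + bias

# Example: a batch of 2 vectors with 4 features each
x = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 2.0, 2.0, 2.0]])
gain = np.ones(4)   # learned scale, initialized to 1
bias = np.zeros(4)  # learned shift, initialized to 0
y = layer_norm(x, gain, bias)
print(y.mean(axis=-1))  # ≈ 0 for every sample
```

Inside an LSTM or GRU cell, such a normalization would be applied to the input, recurrent, and/or output transformations, which is what the `layer_to_normalize` tuple in the usage example above selects.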


【File Preview】:
keras-layer-normalization-rnn-master
----LayerNormalizationRNN.py(63KB)
----images()
--------LSTM_explanation_with_arrows.jpg(26KB)
--------GRU_explanation_with_arrows.jpg(34KB)
----README.md(2KB)
----LayerNormalization.ipynb(5KB)
