kangchi/Competition_CAIL — attention.py (1.14 KB)
Committed by renjunxiang on 2018-07-13 14:21 · "all code"
Note: this repository declares no open-source license (LICENSE) file; check the project description and its upstream code dependencies before use.
from keras.layers import *
from keras.models import *
from keras.utils import plot_model


def attention(input=None, depth=None):
    # Score each timestep with a Dense(1, tanh) layer, softmax over the
    # sequence axis, then broadcast the weights back over the feature
    # dimension and reweight the input element-wise.
    attention = Dense(1, activation='tanh')(input)
    attention = Flatten()(attention)
    attention = Activation('softmax')(attention)
    attention = RepeatVector(depth)(attention)
    attention = Permute([2, 1], name='attention_vec')(attention)
    attention_mul = Multiply(name='attention_mul')([input, attention])
    return attention_mul


if __name__ == '__main__':
    # Sequence of 400 token ids -> 512-d embeddings -> Conv1D features,
    # attention reweighting, global max pooling, and a 202-way sigmoid head.
    data_input = Input(shape=[400])
    word_vec = Embedding(input_dim=40000 + 1,
                         input_length=400,
                         output_dim=512,
                         mask_zero=False,
                         name='Embedding')(data_input)
    x = word_vec
    x = Conv1D(filters=512, kernel_size=[3], strides=1, padding='same', activation='relu')(x)
    x = attention(input=x, depth=512)
    x = GlobalMaxPool1D()(x)
    x = BatchNormalization()(x)
    x = Dense(500, activation="relu")(x)
    x = Dense(202, activation="sigmoid")(x)
    model = Model(inputs=data_input, outputs=x)
    plot_model(model, './attention.png', show_shapes=True)
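To make the `attention` layer's arithmetic concrete, here is a minimal NumPy sketch of the same computation on a single example: score each timestep, softmax over the time axis, and reweight the input. The function name `soft_attention` and the random stand-in weights `w`, `b` (playing the role of the trained `Dense(1, activation='tanh')` layer) are illustrative assumptions, not part of the repository.

```python
import numpy as np


def soft_attention(x):
    """NumPy sketch of the Keras attention block above.

    x: array of shape (timesteps, depth).
    w, b are random stand-ins for the trained Dense(1, tanh) weights.
    """
    rng = np.random.default_rng(0)
    w = rng.normal(size=(x.shape[1], 1))
    b = np.zeros(1)
    scores = np.tanh(x @ w + b).ravel()              # Dense(1, tanh): (timesteps,)
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over the time axis
    # Broadcasting weights over depth plays the role of RepeatVector + Permute.
    return x * weights[:, None]                      # Multiply([input, attention])


x = np.ones((400, 512))
out = soft_attention(x)
print(out.shape)  # (400, 512)
```

Because the softmax weights sum to 1, each feature column of the output sums to a convex combination of that feature across timesteps; the `GlobalMaxPool1D` in the model then collapses the reweighted sequence into a single 512-d vector.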
Clone: https://gitee.com/kangchi/Competition_CAIL.git or git@gitee.com:kangchi/Competition_CAIL.git (branch: master)