
swayam01/tf2.0_abstractive_summarisation

TensorFlow 2 implementation of seq2seq with attention for context generation


Introduction

This is a TensorFlow 2 implementation of Bahdanau (additive) attention for a seq2seq abstractive summarisation model.
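For reference, this is what additive (Bahdanau) attention computes: an energy e_i = v^T tanh(W1 h_i + W2 s_t) for each encoder output h_i given a decoder state s_t, a softmax over those energies, and a weighted sum of the encoder outputs as the context vector. The standalone sketch below illustrates that computation; the class and variable names are ours for illustration, and this is not this repository's AttentionLayer.

import tensorflow as tf

class BahdanauAttention(tf.keras.layers.Layer):
    # Additive attention: e_i = v^T tanh(W1 h_i + W2 s_t)
    def __init__(self, units):
        super().__init__()
        self.W1 = tf.keras.layers.Dense(units)  # projects encoder outputs h_i
        self.W2 = tf.keras.layers.Dense(units)  # projects decoder state s_t
        self.v = tf.keras.layers.Dense(1)       # reduces each energy to a scalar

    def call(self, enc_outputs, dec_state):
        # enc_outputs: (batch, src_len, enc_units); dec_state: (batch, dec_units)
        dec_state = tf.expand_dims(dec_state, 1)                # (batch, 1, dec_units)
        energy = self.v(tf.nn.tanh(self.W1(enc_outputs) + self.W2(dec_state)))
        weights = tf.nn.softmax(energy, axis=1)                 # (batch, src_len, 1)
        context = tf.reduce_sum(weights * enc_outputs, axis=1)  # (batch, enc_units)
        return context, weights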

How to use

Just like you would use any other tensorflow.python.keras.layers object.

from attention_keras.layers.attention import AttentionLayer

# Attend over the encoder output sequence using the decoder output sequence
attn_layer = AttentionLayer(name='attention_layer')
attn_out, attn_states = attn_layer([encoder_outputs, decoder_outputs])

Here,

  • encoder_outputs - the sequence of encoder outputs returned by the RNN/LSTM/GRU (i.e. with return_sequences=True)
  • decoder_outputs - the same, for the decoder
  • attn_out - the output context vector sequence for the decoder; this is to be concatenated with the decoder outputs, as shown in the sketch after this list (refer to model/nmt.py for more details)
  • attn_states - the attention energy values, in case you would like to generate a heat map of the attention weights (refer to train.ipynb for usage)
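To show where those tensors come from and where attn_out goes, here is a minimal sketch of the layer inside a functional encoder-decoder model. The vocabulary sizes, layer widths, and variable names are illustrative assumptions; model/nmt.py remains the authoritative reference for the actual model.

import tensorflow as tf
from attention_keras.layers.attention import AttentionLayer

# Illustrative hyperparameters, not taken from this repository.
src_vocab, tgt_vocab, units = 5000, 5000, 128

# Encoder: embed the source sequence and return the full output sequence
# (return_sequences=True) plus the final state used to seed the decoder.
enc_in = tf.keras.Input(shape=(None,), name='encoder_input')
enc_emb = tf.keras.layers.Embedding(src_vocab, units)(enc_in)
encoder_outputs, enc_state = tf.keras.layers.GRU(
    units, return_sequences=True, return_state=True)(enc_emb)

# Decoder: initialised with the final encoder state.
dec_in = tf.keras.Input(shape=(None,), name='decoder_input')
dec_emb = tf.keras.layers.Embedding(tgt_vocab, units)(dec_in)
decoder_outputs = tf.keras.layers.GRU(
    units, return_sequences=True)(dec_emb, initial_state=enc_state)

# Attention over the encoder outputs, then concatenate the context
# vector sequence with the decoder outputs before the final projection.
attn_layer = AttentionLayer(name='attention_layer')
attn_out, attn_states = attn_layer([encoder_outputs, decoder_outputs])
dec_concat = tf.keras.layers.Concatenate(axis=-1)([decoder_outputs, attn_out])
outputs = tf.keras.layers.Dense(tgt_vocab, activation='softmax')(dec_concat)

model = tf.keras.Model(inputs=[enc_in, dec_in], outputs=outputs)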
