October 27, 2021

How to get sentence representation using LSTM and Attention in Aspect Sentiment Analysis?

To obtain the final sentence representation using an LSTM and aspect sentiment attention, we can proceed as follows:
1. Encode the word sequence with an LSTM, which produces T hidden outputs.
2. Compute an attention score for each word using aspect sentiment attention. This attention method adds the aspect embedding into a self-attention-style scoring function.
3. There are two ways to obtain the final sentence vector. One is to sum all word hidden outputs weighted by their attention scores. The other is to combine this attention-weighted sum with the last word's hidden output through a tanh() projection. The latter is the method provided in this tutorial (a minimal sketch follows this list).
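As a concrete illustration of these three steps, here is a minimal PyTorch sketch. The class and parameter names (AspectAttentionEncoder, embed_dim, aspect_dim, hidden_dim) are illustrative assumptions rather than names from the tutorial PDF, and the scoring function follows a common aspect-attention formulation, not necessarily the tutorial's exact equations.

```python
# Minimal sketch, assuming an ATAE-LSTM-style formulation; names are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AspectAttentionEncoder(nn.Module):
    def __init__(self, embed_dim, aspect_dim, hidden_dim):
        super().__init__()
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Projections used to score each hidden state against the aspect embedding.
        self.w_h = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.w_a = nn.Linear(aspect_dim, hidden_dim, bias=False)
        self.score = nn.Linear(2 * hidden_dim, 1, bias=False)
        # Projections that combine the attended vector with the last hidden output.
        self.w_p = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.w_x = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, word_embeds, aspect_embed):
        # word_embeds: (batch, T, embed_dim); aspect_embed: (batch, aspect_dim)
        # Step 1: encode the word sequence, giving T hidden outputs.
        H, _ = self.lstm(word_embeds)                          # (batch, T, hidden_dim)
        T = H.size(1)

        # Step 2: score each hidden state together with the aspect embedding.
        aspect = self.w_a(aspect_embed).unsqueeze(1).expand(-1, T, -1)
        M = torch.tanh(torch.cat([self.w_h(H), aspect], dim=-1))
        alpha = F.softmax(self.score(M).squeeze(-1), dim=-1)   # (batch, T)

        # Step 3: attention-weighted sum, then tanh() projection with the last hidden output.
        r = torch.bmm(alpha.unsqueeze(1), H).squeeze(1)        # (batch, hidden_dim)
        h_star = torch.tanh(self.w_p(r) + self.w_x(H[:, -1, :]))
        return h_star, alpha

# Usage with random tensors standing in for word and aspect embeddings:
enc = AspectAttentionEncoder(embed_dim=300, aspect_dim=300, hidden_dim=128)
words = torch.randn(2, 10, 300)      # batch of 2 sentences, 10 words each
aspect = torch.randn(2, 300)
sent_vec, attn = enc(words, aspect)  # shapes: (2, 128) and (2, 10)
```

The returned attention weights can also be inspected to see which words the model associates with the given aspect.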

Here is the full tutorial!

File: PDF

Language: English

DOWNLOAD