Channel: Data Science, Analytics and Big Data discussions - Latest topics

Text summarization - Encoder Decoder Using Attention Model


Ideally, in an encoder-decoder with attention, the attention mechanism weighs the importance of each input word for the current output word. The decoder therefore produces the output one time-step at a time, with the output of the first time-step serving as the input to the second, and so on. Programmatically, should this be done in a loop over the output time-steps?
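At inference time this is indeed usually written as an explicit loop, since each step's input depends on the previous step's prediction. Below is a minimal illustrative sketch in PyTorch (class and function names are my own, not from any particular library): a GRU decoder with simple dot-product attention, driven by a greedy decoding loop that feeds each predicted token back in as the next input. Assumed token ids `SOS` and `EOS` are hypothetical.

```python
# Hypothetical sketch: attention decoder run step by step in a loop (PyTorch).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttnDecoder(nn.Module):
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        # GRU input = [embedded token ; attention context], hence 2 * hidden_size
        self.gru = nn.GRU(hidden_size * 2, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def step(self, token, hidden, enc_outputs):
        # token: (batch, 1) ids; hidden: (1, batch, H); enc_outputs: (batch, T, H)
        emb = self.embed(token)  # (batch, 1, H)
        # Dot-product attention: score every encoder state against decoder state
        scores = torch.bmm(enc_outputs,
                           hidden.transpose(0, 1).transpose(1, 2))  # (batch, T, 1)
        weights = F.softmax(scores, dim=1)
        context = torch.bmm(weights.transpose(1, 2), enc_outputs)   # (batch, 1, H)
        output, hidden = self.gru(torch.cat([emb, context], dim=-1), hidden)
        return self.out(output.squeeze(1)), hidden  # logits: (batch, vocab)

SOS, EOS = 1, 2  # assumed special-token ids

def greedy_decode(decoder, enc_outputs, hidden, max_len=20):
    batch = enc_outputs.size(0)
    token = torch.full((batch, 1), SOS, dtype=torch.long)
    result = []
    for _ in range(max_len):  # one iteration per output time-step
        logits, hidden = decoder.step(token, hidden, enc_outputs)
        token = logits.argmax(dim=-1, keepdim=True)  # output feeds next step
        result.append(token)
        if (token == EOS).all():
            break
    return torch.cat(result, dim=1)  # (batch, <= max_len)
```

During training, by contrast, teacher forcing lets you feed the whole ground-truth target sequence in at once, so the per-step loop is only strictly required at inference.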

1 post - 1 participant


