I have a paragraph on a certain topic.
For example:
“The legendary world of Pokémon first reached Australian and New
Zealand shores in 1998 with Pokémon Red Version and Pokémon Blue
Version for Game Boy, becoming an instant sensation that transcended
fans of all ages and backgrounds.
Since the 1996 debut of Pokémon Red and Pokémon Green for Game Boy in
Japan, this iconic series has sold more than 279 million video games
globally, with avid Aussie and Kiwi Pokémon fans continuing to grow
their Pokémon collections.”
My task is to feed it into a BiLSTM model and obtain a topic representation via a multi-layer perceptron (MLP) attention mechanism, using the concatenated final state from each direction of the BiLSTM as the key:

h_τ = BiLSTM_τ(X_τ)
h_τ^(attn) = MLPSelfAttn(h_τ)
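For reference, here is a minimal PyTorch sketch of the architecture described above, under my own assumptions about the details the equations leave open (the class name `BiLSTMTopicEncoder`, the MLP depth, and the choice of additive-style scoring are all illustrative, not from any specific paper):

```python
import torch
import torch.nn as nn

class BiLSTMTopicEncoder(nn.Module):
    """Sketch: a BiLSTM encoder whose concatenated final forward/backward
    hidden states serve as the attention key; an MLP scores each timestep
    output against that key, and the weighted sum is the topic vector."""

    def __init__(self, emb_dim: int, hidden_dim: int):
        super().__init__()
        self.bilstm = nn.LSTM(emb_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # MLP attention: input is [timestep output (2H); key (2H)] -> scalar score
        self.attn_mlp = nn.Sequential(
            nn.Linear(4 * hidden_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, emb_dim) -- pre-embedded tokens
        outputs, (h_n, _) = self.bilstm(x)            # outputs: (B, T, 2H)
        # Key: concatenated final state from each direction
        key = torch.cat([h_n[0], h_n[1]], dim=-1)     # (B, 2H)
        key_exp = key.unsqueeze(1).expand(-1, outputs.size(1), -1)
        scores = self.attn_mlp(torch.cat([outputs, key_exp], dim=-1))  # (B, T, 1)
        weights = torch.softmax(scores, dim=1)        # attention over timesteps
        return (weights * outputs).sum(dim=1)         # (B, 2H) topic representation

# Toy usage with random embeddings (batch of 2, 12 tokens, 50-dim vectors)
enc = BiLSTMTopicEncoder(emb_dim=50, hidden_dim=64)
topic = enc(torch.randn(2, 12, 50))
print(topic.shape)  # torch.Size([2, 128])
```

This only produces the representation h_τ^(attn); it says nothing yet about the training objective, which is exactly the open question below.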
I only have input values (X), no labels, so how can I train this model without y values?