@premsheth wrote:
Hi Friends,
I am trying to understand the mechanism behind Word2Vec word embeddings. I have gone through the following links:
https://www.analyticsvidhya.com/blog/2017/06/word-embeddings-count-word2veec/
https://machinelearningmastery.com/develop-word-embeddings-python-gensim/
But I still don't have a clear idea of what Word2Vec is or how it works.
So far I understand the following steps:
- It takes sentences and splits them into words.
- It builds a vocabulary of all those words.
- But when I do `model['word in vocabulary']`, it returns a numerical vector.
Questions:
1) What is a vector representation?
2) What does each numerical value represent? I am confused about this.
If anyone has a tutorial or a link that would help me understand this better, it would be very helpful.
Thank you so much in advance
Posts: 1
Participants: 1