Channel: Data Science, Analytics and Big Data discussions - Latest topics

In word/character embeddings, what do the vector values of a single word represent?


@spiel wrote:

In word/character embeddings, what is represented by the vector values of a single word? A word such as “How” is represented by a vector of size 3072, so what do these 3072 values denote about the word “how”?

from flair.data import Sentence
from flair.embeddings import BertEmbeddings

bert_embedding = BertEmbeddings()
sentence2 = Sentence("Hello good moring ")
bert_embedding.embed(sentence2)
for token in sentence2:
    print(token.get_embedding)  # note: missing (), so this prints the bound method, not the tensor
    print(token.embedding)
    print(token.embedding.size())

Output:

<bound method Token.get_embedding of Token: 1 Hello>
tensor([ 0.1705,  0.5013,  0.8154,  ..., -0.2748, -0.6512, -0.0866])
torch.Size([3072])
<bound method Token.get_embedding of Token: 2 good>
tensor([ 0.0079,  0.4349,  0.9639,  ..., -1.2404, -0.4342, -1.9663])
torch.Size([3072])
<bound method Token.get_embedding of Token: 3 moring>
tensor([ 0.0535, -0.1277,  0.2877,  ..., -0.3166,  0.0504, -0.2165])
torch.Size([3072])
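As a hedged sketch of where the 3072 likely comes from: flair's `BertEmbeddings` by default concatenates the hidden states of the last four transformer layers of bert-base, each of hidden size 768, giving 4 × 768 = 3072 values per token. The snippet below is an illustration of that concatenation with stand-in tensors, not the actual flair internals:

```python
import torch

# Assumption: bert-base hidden size is 768 and four layers are concatenated.
hidden_size = 768
num_layers = 4

# Stand-ins for the per-token hidden states of the last four BERT layers.
layer_outputs = [torch.randn(hidden_size) for _ in range(num_layers)]

# Concatenating them yields one 3072-dimensional vector per token.
token_embedding = torch.cat(layer_outputs)
print(token_embedding.size())  # torch.Size([3072])
```

Each of the 3072 values is thus one coordinate of the token's position in BERT's learned contextual representation space; individual coordinates have no standalone human-readable meaning.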

Posts: 1

Participants: 1

