Word Embeddings
Discussions center on word embeddings such as word2vec: how they encode semantic similarity, how word vectors are compared, and alternatives such as GloVe or GPT-based embeddings.
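A minimal sketch of the core idea behind these threads, assuming nothing beyond NumPy: each word maps to a vector, and semantic similarity is measured as the cosine of the angle between two vectors. The three-dimensional vectors below are made-up toy values; real word2vec or GloVe embeddings typically have 100-300 dimensions.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between a and b, in [-1, 1]; higher means more similar."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

king  = np.array([0.8, 0.3, 0.1])   # hypothetical embedding for "king"
queen = np.array([0.7, 0.4, 0.1])   # hypothetical embedding for "queen"
apple = np.array([0.1, 0.2, 0.9])   # hypothetical embedding for "apple"

print(cosine_similarity(king, queen))  # high: semantically related words
print(cosine_similarity(king, apple))  # lower: unrelated words
```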
Sample Comments
Would this be useful for word vectors/embeddings?
Isn't the point of word2vec that embeddings are semantically meaningful vectors?
Why not use GPT-4 embeddings instead of word2vec? Won't that be more effective?
Nice, does this use word2vec or some such word embedding model?
I guess this is done by vectorizing words and measuring distance between them?
Embeddings are really good at that; you don't need to use similar words at all.
I thought I read it uses word2vec?
In the language of machine learning, these are called "embeddings."
Can someone give me the quickie on what a vector embedding is? The article assumes I know it, and everything else in the article seems trivial.
I thought this was going to be about word embeddings!
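Several comments above ask how "vectorizing words and measuring distance" works in practice. A minimal sketch, assuming gensim is installed (`pip install gensim`) and that the pretrained "glove-wiki-gigaword-50" model is available through gensim's downloader; any word2vec or GloVe model loaded as KeyedVectors behaves the same way:

```python
import gensim.downloader as api

# Small pretrained GloVe model (~66 MB download on first run).
model = api.load("glove-wiki-gigaword-50")

# Similarity is the cosine between the two learned word vectors.
print(model.similarity("king", "queen"))

# Nearest neighbours in embedding space; as one comment notes,
# the words need not share any surface form to score as similar.
print(model.most_similar("breakfast", topn=3))

# The classic word2vec analogy: king - man + woman is close to queen.
print(model.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
```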
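On the "why not GPT embeddings?" question: OpenAI exposes dedicated embedding models (e.g. "text-embedding-3-small") rather than GPT-4 itself, which is a chat model. A hedged sketch, assuming the `openai` package (v1+) is installed and the `OPENAI_API_KEY` environment variable is set:

```python
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

resp = client.embeddings.create(
    model="text-embedding-3-small",
    input=["king", "queen"],
)
king, queen = (np.array(d.embedding) for d in resp.data)

# Same cosine comparison as with word2vec, just with vectors
# produced by a much larger model.
print(np.dot(king, queen) / (np.linalg.norm(king) * np.linalg.norm(queen)))
```

The trade-off raised in the thread still applies: API-based embeddings may capture more nuance, but word2vec/GloVe run locally, for free, and are often good enough for word-level similarity.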