Word Embeddings

Discussions center on word embeddings such as word2vec, covering semantic similarity, vector comparisons, and alternatives such as GPT or GloVe embeddings.
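The recurring question in the comments below is how two words are compared once they are mapped to vectors. A minimal sketch, assuming the gensim library and one of its published pre-trained GloVe vector sets (glove-wiki-gigaword-50, fetched on first use), shows cosine similarity and nearest-neighbour lookup:

```python
# Minimal sketch: comparing words via pre-trained embeddings.
# Assumes gensim is installed; "glove-wiki-gigaword-50" is one of the
# pre-trained vector sets available through gensim's downloader.
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")  # KeyedVectors, 50-dimensional GloVe

# Cosine similarity between two word vectors (higher = more semantically similar).
print(vectors.similarity("woman", "queen"))

# Nearest neighbours of a word in embedding space.
print(vectors.most_similar("vector", topn=5))

# The classic analogy test: king - man + woman is close to queen.
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
```

GPT-style embeddings mentioned in later comments are used the same way: words (or whole sentences) are mapped to vectors by a model call instead of a lookup table, and the comparison is still typically cosine similarity.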

➡️ Stable (0.7x) · AI & Machine Learning
3,710 Comments · 20 Years Active · 5 Top Authors · Topic ID #2024

Activity Over Time

Year    Comments
2007    2
2008    6
2009    5
2010    25
2011    7
2012    29
2013    62
2014    54
2015    155
2016    245
2017    276
2018    290
2019    240
2020    96
2021    155
2022    200
2023    637
2024    650
2025    537
2026    39

Keywords

e.g, NN, blog.sgn, ZINC, stanford.edu, wikipedia.org, SMILES, GloVe, github.com, arxiv.org, embeddings, vector, vectors, similarity, embedding, words, word, woman, normalized, semantic

Sample Comments

bagrow · Apr 29, 2018

Would this be useful for word vectors/embeddings?

thanatropism · Oct 31, 2018

Isn't the point of word2vec that embeddings are semantically meaningful vectors?

shubham13596 · Jul 28, 2024

Why not use GPT-4 embeddings instead of word2vec? Won't that be more effective?

bobosha · Oct 18, 2020

Nice, does this use word2vec or some such word embedding model?

kache_ · May 2, 2022

I guess this is done by vectorizing words and measuring distance between them?

nunodonato · Mar 27, 2023

Embeddings are really good at that; you don't need to use similar words at all.

mrfusion · May 16, 2022

I thought I read it uses word2vec?

recursive · Oct 22, 2024

In the language of "embeddings" of machine learning.

jfengel · May 5, 2023

Can someone give me the quickie on what a vector embedding is? The article assumes I know it, and everything else in the article seems trivial.

JabavuAdams · Jun 17, 2021

I thought this was going to be about word embeddings!