
Questions tagged [word-embeddings]

Word embedding is the collective name for a set of language-modeling and feature-learning techniques in NLP in which words are mapped to vectors of real numbers in a space whose dimension is low relative to the vocabulary size.
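A toy sketch of the idea in the tag description: each word becomes a dense real-valued vector, and geometric closeness stands in for semantic similarity. The 3-d vectors below are made up for illustration, not learned from data.

```python
import numpy as np

# Hypothetical toy embeddings (real ones are learned and much higher-dimensional).
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.7, 0.7, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 means same direction, 0.0 means orthogonal."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Related words end up closer than unrelated ones.
print(cosine(embeddings["king"], embeddings["queen"]))  # high
print(cosine(embeddings["king"], embeddings["apple"]))  # much lower
```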

0 votes
0 answers
13 views

I am currently studying the GloVe paper on word embeddings. link In Section 3, "The GloVe Model", the model is derived from several desiderata, one of which confuses me. It concerns Equation 3, which ...
Nourless • 213
1 vote
0 answers
48 views

How can I visualise a hierarchical ontology of items in embedding space, combining text embeddings with the graphical structure? (Something similar to the example below) I have a hierarchical ...
baked goods
7 votes
1 answer
91 views

In the work I am doing right now, I have multiple pieces of text (say 5, for purposes of illustration) which are somewhat close in meaning. My objective is to combine these 5 ...
Sage of Seven Paths
1 vote
0 answers
40 views

I'm trying to find a way to get word vector embeddings that preserve linear structures/analogies in word relationships. The best-known example of this is the vector for "queen" being ...
ufghd34 • 11
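The "queen" analogy mentioned in the question above is usually written as vec("king") − vec("man") + vec("woman") ≈ vec("queen"). A minimal sketch with hypothetical 2-d vectors constructed so the analogy holds exactly; real embeddings only approximate this:

```python
import numpy as np

# Made-up vectors: "royalty" is an invented direction shared by king and queen.
man, woman = np.array([1.0, 0.0]), np.array([0.0, 1.0])
royalty = np.array([0.5, 0.5])
king, queen = man + royalty, woman + royalty

# The classic offset trick: removing "man" and adding "woman" lands on "queen".
analogy = king - man + woman
print(np.allclose(analogy, queen))  # True by construction here
```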
0 votes
2 answers
183 views

I have two lists of sentences ...
Leon • 1
0 votes
1 answer
100 views

I created a simple neural network to train the word embeddings. I have 6 tokens only: ["apple", "banana", "lime", "red", "yellow", "green"]. ...
LevelRin
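For a 6-token vocabulary like the one in the question above, the embedding layer of such a network is just a lookup table: multiplying a one-hot input by the weight matrix selects a row. A sketch under that assumption, with untrained random values:

```python
import numpy as np

# Hypothetical setup mirroring the question: 6 tokens, one trainable row each.
tokens = ["apple", "banana", "lime", "red", "yellow", "green"]
dim = 3
rng = np.random.default_rng(0)
E = rng.normal(size=(len(tokens), dim))   # embedding matrix, one row per token
idx = {t: i for i, t in enumerate(tokens)}

def embed(word):
    """An embedding layer is a row lookup: one_hot(word) @ E."""
    one_hot = np.zeros(len(tokens))
    one_hot[idx[word]] = 1.0
    return one_hot @ E

# The matrix product recovers exactly the token's row of E.
print(np.allclose(embed("lime"), E[idx["lime"]]))
```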
1 vote
0 answers
252 views

I'm trying to build a recommendation on a dataset of product purchases. The dataset consists of roughly 4 Amazon products that a particular user has bought (in sequence). I want to use the first 3 ...
Hari • 11
3 votes
1 answer
128 views

I am trying to develop the intuition of word2vec training. Looking into the word2vec source code, I see (for example, in skip-gram): ...
Damir Tenishev
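For intuition about skip-gram training like the question above asks: each center word is paired with the words inside a ± window around it, and those (center, context) pairs are the training examples. This is a minimal sketch of the pair-generation idea, not the word2vec source code itself:

```python
def skipgram_pairs(sentence, window=1):
    """Yield (center, context) pairs for every word within the window."""
    pairs = []
    for i, center in enumerate(sentence):
        lo, hi = max(0, i - window), min(len(sentence), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, sentence[j]))
    return pairs

print(skipgram_pairs(["the", "quick", "brown", "fox"]))
# → [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#    ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
```

The real implementation also subsamples frequent words and randomizes the window size per position, which this sketch omits.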
0 votes
1 answer
63 views

I have, for example, the following lists of words that I want to cluster. The lists can have different lengths, and the vocabulary is $W = \{a,b,c\}$. The criterion for clustering 2 lists into the same ...
Diep Luong
4 votes
1 answer
1k views

I was looking at the Stanford CS224N NLP with Deep Learning lecture, and in the first two videos, we are introduced to word2vec models. The high-level idea mentioned was that we have a 'big corpus' of ...
theGreedyLearner
1 vote
0 answers
1k views

I am trying to use infgrad/stella-base-en-v2 on Hugging Face to generate embeddings using LangChain. The model exists on the Hugging Face Hub. The model is listed on the MTEB leaderboard. The model has ...
figs_and_nuts
1 vote
0 answers
38 views

I built a word2vec network with 2 linear layers in PyTorch. For every input word I train the model to predict the words before and after it, for example: "i was visiting my grandma's house", for ...
Тима
0 votes
0 answers
57 views

As far as I know, Word2Vec belongs to the non-contextual embedding techniques; it captures only semantic relationships between words. We can implement Word2Vec with either the CBoW or the skip-gram model, but I ...
Tovlk • 43
1 vote
1 answer
1k views

I want to use OpenCLIP for generating embeddings for each slide in an array of pptx presentations. To improve the quality of the results, I want to vectorize both slide text content and preview images....
Olek Gornostal
1 vote
1 answer
547 views

I have a BERT model which I want to use for sentiment analysis/classification, e.g. I have some tweets that need to get a POSITIVE, NEGATIVE, or NEUTRAL label. I can't understand how contextual ...
average_discrete_math_enjoyer
