PSU Frequency Based Embedding Artificial Intelligence Exercise

Word embeddings are vectors of numbers that capture the contexts in which a word occurs. Two common types are exemplified by (1) a frequency-based representation, in which a word is represented by a vector of counts of the other terms that occur near it, and (2) a word2vec skip-gram embedding. Provide at least two disadvantages of the former in comparison to the latter.
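To make the frequency-based option concrete, here is a minimal sketch (the toy corpus, window size, and function names are illustrative assumptions, not part of the exercise). It builds a co-occurrence count vector for one word and shows two properties that point toward the disadvantages being asked about: the vector's dimension equals the vocabulary size, and most of its entries are zero.

```python
from collections import Counter

# Toy corpus (assumption for illustration). With a realistic corpus the
# vocabulary has tens of thousands of words, so each count vector is very
# high-dimensional and mostly zeros, unlike a dense low-dimensional
# skip-gram embedding.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
]
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
index = {w: i for i, w in enumerate(vocab)}

def count_vector(word, window=2):
    """Frequency-based embedding: counts of terms within `window` of `word`."""
    counts = Counter()
    for sent in tokens:
        for i, w in enumerate(sent):
            if w == word:
                lo = max(0, i - window)
                hi = min(len(sent), i + window + 1)
                for j in range(lo, hi):
                    if j != i:
                        counts[sent[j]] += 1
    vec = [0] * len(vocab)
    for w, c in counts.items():
        vec[index[w]] = c
    return vec

v = count_vector("sat")
print(len(v))                        # dimension = vocabulary size (7 here)
print(sum(1 for x in v if x == 0))   # number of zero entries: the vector is sparse
```

Running this on the toy corpus yields a 7-dimensional vector with 3 zero entries; scaled to a real vocabulary, the same construction gives enormous, sparse vectors whose raw counts also overweight frequent function words such as "the".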