exploring_word_vectors.ipynb

Time: 2023-06-22 10:01:07

[File attributes]

File name: exploring_word_vectors.ipynb

File size: 44KB

File format: IPYNB

Updated: 2023-06-22 10:01:07

NLP

Word vectors are often used as a fundamental component of downstream NLP tasks, e.g. question answering, text generation, and translation, so it is important to build some intuition about their strengths and weaknesses. Here, you will explore two types of word vectors: those derived from co-occurrence matrices, and those derived via GloVe.

Assignment Notes: Please make sure to save the notebook as you go along. Submission instructions are located at the bottom of the notebook.

Note on Terminology: The terms "word vectors" and "word embeddings" are often used interchangeably. The term "embedding" refers to the fact that we are encoding aspects of a word's meaning in a lower-dimensional space. As Wikipedia states, "conceptually it involves a mathematical embedding from a space with one dimension per word to a continuous vector space with a much lower dimension".
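To illustrate the first type of word vector mentioned above, here is a minimal sketch of building a co-occurrence matrix from a toy tokenized corpus. The function name, the window size, and the example corpus are illustrative assumptions, not the notebook's actual starter code; the real assignment may additionally reduce the matrix's dimensionality (e.g. with SVD):

```python
import numpy as np

def build_cooccurrence(corpus, window=1):
    """Build a symmetric word co-occurrence matrix from tokenized sentences.

    Counts how often each pair of distinct positions holds words within
    `window` tokens of each other; each row of the matrix can then serve
    as a raw (high-dimensional) word vector.
    """
    # Fixed, sorted vocabulary so row/column order is deterministic
    vocab = sorted({w for sent in corpus for w in sent})
    index = {w: i for i, w in enumerate(vocab)}
    M = np.zeros((len(vocab), len(vocab)))
    for sent in corpus:
        for i, w in enumerate(sent):
            lo, hi = max(0, i - window), min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if j != i:  # skip the center word itself
                    M[index[w], index[sent[j]]] += 1
    return M, index

# Hypothetical toy corpus (already tokenized)
corpus = [["all", "that", "glitters", "is", "not", "gold"],
          ["all", "is", "well", "that", "ends", "well"]]
M, index = build_cooccurrence(corpus, window=1)
```

Because every co-occurrence is counted from both words' perspectives, the resulting matrix is symmetric.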
