Word Embedding

by Zhifei Xu -
Number of replies: 0

In natural language processing (NLP), a word embedding is a representation of a word for text analysis, typically a real-valued vector that encodes the word's meaning so that words close together in the vector space are expected to be similar in meaning. Word embedding is thus the process of converting a word into a form a computer can work with before any further processing is performed.
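The "closer in vector space means similar in meaning" idea can be sketched with a toy example. The vectors below are hand-picked for illustration only (real embeddings are learned from large corpora and typically have hundreds of dimensions); similarity is measured with cosine similarity, a standard choice for comparing embedding vectors.

```python
import numpy as np

# Toy, hand-picked 3-dimensional vectors (purely illustrative;
# real embeddings are learned, not assigned by hand).
embeddings = {
    "king":  np.array([0.90, 0.80, 0.10]),
    "queen": np.array([0.85, 0.75, 0.20]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(a, b):
    # Vectors pointing in similar directions score near 1.0.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

sim_king_queen = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_king_apple = cosine_similarity(embeddings["king"], embeddings["apple"])

# Semantically related words sit closer together in the vector space.
print(sim_king_queen > sim_king_apple)  # → True
```

In a trained embedding model (e.g. word2vec or GloVe), these vector positions emerge from word co-occurrence statistics rather than being chosen by hand.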

https://machinelearningmastery.com/what-are-word-embeddings/

https://zhuanlan.zhihu.com/p/27830489