The next important step in our brief history of (large) #languagemodels is the use of word embeddings, i.e., mapping words onto dense vector spaces while preserving their semantics in terms of vector distances, which enables analogies via vector arithmetic (e.g., king - man + woman ≈ queen).
In 2013, Word2Vec was introduced by Mikolov et al.
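A minimal sketch of this analogy arithmetic, assuming gensim and its pretrained Google News Word2Vec vectors (both illustrative choices, not part of the original post):

import gensim.downloader as api

# Load pretrained 300-dimensional Word2Vec vectors (large download on first use).
vectors = api.load("word2vec-google-news-300")

# Classic analogy: king - man + woman, answered as the nearest neighbor
# of the resulting vector in the embedding space.
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
# expected nearest neighbor: 'queen'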
Slides: https://drive.google.com/file/d/1atNvMYNkeKDwXP3olHXzloa09S5pzjXb/view?usp=drive_link
@fizise #llm #ai #artificialintelligence #wordembeddings #machinelearning #lecture