Word Embeddings


1. What are word representations?

  • One-hot encodings
  • Featurized representation: word embeddings
    • Word embedding is one of the most popular word representations.
    • It captures the context of a word in a document, semantic and syntactic similarity, relations with other words, etc.
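The contrast between the two representations can be sketched in NumPy. The toy vocabulary and the embedding values below are invented for illustration; real embeddings would be learned (e.g. by word2vec or GloVe).

```python
import numpy as np

# Toy vocabulary; the index order is arbitrary (illustrative only).
vocab = {"king": 0, "queen": 1, "apple": 2}
V = len(vocab)

# One-hot encoding: a sparse V-dimensional vector with a single 1.
# It carries no information about how words relate to each other.
def one_hot(word):
    v = np.zeros(V)
    v[vocab[word]] = 1.0
    return v

# Featurized representation: each word is a dense, low-dimensional vector.
# These values are made up to show the idea; in practice they are learned.
E = np.array([
    [0.9, 0.1],   # king
    [0.8, 0.2],   # queen
    [0.1, 0.9],   # apple
])

def embed(word):
    return E[vocab[word]]

# One-hot vectors of distinct words are always orthogonal...
assert one_hot("king") @ one_hot("queen") == 0.0
# ...while embeddings of related words can be close:
print(embed("king") @ embed("queen"))  # larger than embed("king") @ embed("apple")
```

Because every pair of one-hot vectors has zero inner product, a model cannot generalize from "king" to "queen"; dense embeddings make that generalization possible.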

2. Summary of word embeddings

  Transfer learning:   Yes
  Training set:        Can be small
  Works well for:      entity recognition, text summarization, co-reference resolution, parsing
  Less useful for:     language modeling, machine translation
  Visualization:       t-SNE

3. How do word embedding algorithms work?

  • How to use word embeddings to solve a word analogy problem

  • How to load pre-trained word vectors and measure similarity using cosine similarity

  • How to modify word embeddings to reduce their gender bias

  • Try the exercise in the Coursera notebook

  • Take a look at my notebook
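The similarity and analogy steps above can be sketched as follows. The tiny `word_to_vec` map uses made-up vectors; in practice the vectors would be loaded from a pre-trained file such as GloVe.

```python
import numpy as np

def cosine_similarity(u, v):
    # cos(theta) = (u . v) / (||u|| * ||v||)
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Toy embeddings (invented for illustration; real ones come from
# a pre-trained file such as GloVe or word2vec).
word_to_vec = {
    "man":   np.array([ 1.0,  0.0, 0.2]),
    "woman": np.array([-1.0,  0.0, 0.2]),
    "king":  np.array([ 1.0,  1.0, 0.1]),
    "queen": np.array([-1.0,  1.0, 0.1]),
    "apple": np.array([ 0.0, -1.0, 0.9]),
}

def complete_analogy(a, b, c, word_to_vec):
    """Solve 'a is to b as c is to ?' by maximizing
    cosine_similarity(e_b - e_a, e_d - e_c) over candidate words d."""
    e_a, e_b, e_c = (word_to_vec[w] for w in (a, b, c))
    best_word, best_sim = None, -np.inf
    for w, e_w in word_to_vec.items():
        if w in (a, b, c):
            continue  # the answer should be a new word
        sim = cosine_similarity(e_b - e_a, e_w - e_c)
        if sim > best_sim:
            best_word, best_sim = w, sim
    return best_word

print(complete_analogy("man", "woman", "king", word_to_vec))  # "queen"
```

The same `cosine_similarity` function underlies the debiasing exercise: the gender direction is estimated as a difference of embeddings (e.g. `woman - man`), and biased components are projected out along it.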

4. How to create an embedding layer and build a Keras LSTM model for sentiment classification?
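An embedding layer is just a trainable lookup table: row i of its weight matrix is the vector for word index i. Below is a minimal NumPy sketch of that forward pass, with the corresponding Keras model shown in comments (it requires TensorFlow to run, and all hyperparameters and the sentence indices are illustrative assumptions, not values from the course).

```python
import numpy as np

# Illustrative sizes (assumptions, not prescribed values).
vocab_size, embedding_dim, max_len = 400, 50, 10

rng = np.random.default_rng(0)
# The embedding matrix; in a real model this is learned or
# initialized from pre-trained vectors.
E = rng.normal(size=(vocab_size, embedding_dim))

# A "sentence" as a zero-padded sequence of word indices (made up).
sentence_indices = np.array([12, 7, 305, 0, 0, 0, 0, 0, 0, 0])

# The embedding layer's forward pass is plain row indexing:
embedded = E[sentence_indices]            # shape (max_len, embedding_dim)
assert embedded.shape == (max_len, embedding_dim)

# A Keras sentiment classifier built on such a layer could look like
# (requires TensorFlow; layer sizes are illustrative):
#
#   from tensorflow.keras.models import Sequential
#   from tensorflow.keras.layers import Embedding, LSTM, Dropout, Dense
#
#   model = Sequential([
#       Embedding(vocab_size, embedding_dim, input_length=max_len),
#       LSTM(128, return_sequences=True),
#       Dropout(0.5),
#       LSTM(128),
#       Dropout(0.5),
#       Dense(5, activation="softmax"),   # e.g. 5 sentiment classes
#   ])
#   model.compile(loss="categorical_crossentropy", optimizer="adam",
#                 metrics=["accuracy"])
```

The key design point is that the Embedding layer maps each integer word index to its dense vector, so the LSTM consumes a `(max_len, embedding_dim)` sequence rather than sparse one-hot inputs.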



Reference