Sequence Models: Word Embeddings | More about Data Science
1. What are word representations?
- One-hot encodings
- Featurized representation: word embeddings
- Word embeddings are one of the most popular word representations.
- They capture the context of a word in a document, semantic and syntactic similarity, relations with other words, and more (contrasted with one-hot vectors in the sketch below).
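
A minimal sketch contrasting the two representations; the toy vocabulary, embedding size, and random values are made up for illustration:

```python
import numpy as np

# Toy vocabulary; the indices and embedding values are placeholders.
vocab = {"king": 0, "queen": 1, "apple": 2, "orange": 3}
vocab_size = len(vocab)

# One-hot encoding: a sparse vector of length |V| with a single 1.
# It carries no notion of similarity -- the dot product of any two
# different one-hot vectors is 0.
def one_hot(word):
    v = np.zeros(vocab_size)
    v[vocab[word]] = 1.0
    return v

# Featurized representation: each word maps to a dense, low-dimensional
# vector, so related words (apple/orange) can end up close together.
embedding_dim = 3
embedding_matrix = np.random.randn(vocab_size, embedding_dim)  # learned in practice

def embed(word):
    return embedding_matrix[vocab[word]]

print(one_hot("apple"))  # e.g. [0. 0. 1. 0.]
print(embed("apple"))    # a dense 3-dimensional vector
```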
2. Summary of word embeddings
Property | Word Embeddings
---|---
Transfer learning | Yes
Training set | Can be small when pre-trained embeddings are used
Entity recognition, text summarization, co-reference resolution, parsing | Good
Language modeling, machine translation | Less useful (these tasks usually have large task-specific datasets)
Visualization | t-SNE (see the sketch below)
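
A minimal sketch of the t-SNE visualization mentioned in the table, using scikit-learn; the word list and random 50-d vectors are placeholders standing in for real pre-trained embeddings:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

# Placeholder words and vectors; in practice these come from a
# pre-trained embedding table.
words = ["king", "queen", "man", "woman", "apple", "orange"]
vectors = np.random.randn(len(words), 50)

# t-SNE projects the high-dimensional embeddings down to 2-D so that
# words that are close in embedding space stay close in the plot.
tsne = TSNE(n_components=2, perplexity=2, random_state=0)
points = tsne.fit_transform(vectors)

plt.scatter(points[:, 0], points[:, 1])
for word, (x, y) in zip(words, points):
    plt.annotate(word, (x, y))
plt.show()
```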
3. How do word embedding algorithms work?
- How to use word embeddings to solve a word analogy problem (see the first sketch after this list)
- How to load pre-trained word vectors and measure similarity using cosine similarity
- How to modify word embeddings to reduce their gender bias (see the second sketch after this list)
- Try the exercise on the Coursera notebook
- Take a look at my notebook
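
A minimal sketch of cosine similarity and the word-analogy search; `load_glove`, the `glove.6B.50d.txt` file name, and the `word_to_vec` dictionary are assumptions standing in for whichever pre-trained vectors you actually use:

```python
import numpy as np

def cosine_similarity(u, v):
    # cos(theta) = (u . v) / (||u|| * ||v||)
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Assumed loader for a GloVe-style text file: one word per line,
# followed by its vector components.
def load_glove(path):
    word_to_vec = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            word_to_vec[parts[0]] = np.array(parts[1:], dtype=np.float64)
    return word_to_vec

def analogy(a, b, c, word_to_vec):
    """Solve 'a is to b as c is to ?' by maximizing
    cosine_similarity(e_b - e_a, e_w - e_c) over the vocabulary."""
    e_a, e_b, e_c = word_to_vec[a], word_to_vec[b], word_to_vec[c]
    best_word, best_sim = None, -2.0
    for w, e_w in word_to_vec.items():
        if w in (a, b, c):
            continue
        sim = cosine_similarity(e_b - e_a, e_w - e_c)
        if sim > best_sim:
            best_word, best_sim = w, sim
    return best_word

# word_to_vec = load_glove("glove.6B.50d.txt")      # path is an assumption
# print(analogy("man", "woman", "king", word_to_vec))  # ideally "queen"
```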
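And a minimal sketch of the neutralization step for reducing gender bias, following the idea from Bolukbasi et al. (2016); it reuses the assumed `word_to_vec` table from the previous sketch:

```python
import numpy as np

# Neutralization: project a gender-neutral word's vector onto the bias
# direction g and subtract that component, so the word sits at distance
# zero along g. (A separate equalization step handles gendered pairs.)
def neutralize(word, g, word_to_vec):
    e = word_to_vec[word]
    e_bias = (np.dot(e, g) / np.dot(g, g)) * g  # component of e along g
    return e - e_bias

# g = word_to_vec["woman"] - word_to_vec["man"]   # bias (gender) direction
# e_receptionist = neutralize("receptionist", g, word_to_vec)
```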
4. How to create an embedding layer and build a Keras LSTM model for sentiment classification?
- Try the Emojify exercise on the Coursera notebook (input a sentence and find the most appropriate emoji)
- Take a look at my notebook
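
A minimal Keras sketch of such a model; the vocabulary size, embedding dimension, sequence length, and class count are assumptions, and the random embedding matrix stands in for real pre-trained vectors:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Embedding, LSTM, Dropout, Dense
from tensorflow.keras.initializers import Constant

# Assumed shapes: 10k-word vocabulary, 50-d GloVe-style vectors,
# sentences padded to length 10, 5 emoji classes.
vocab_size, embedding_dim, max_len, num_classes = 10_000, 50, 10, 5

# In practice this matrix is filled row-by-row from pre-trained vectors;
# random values keep the sketch self-contained.
embedding_matrix = np.random.randn(vocab_size, embedding_dim)

model = Sequential([
    Input(shape=(max_len,)),                    # padded word-index sequences
    Embedding(vocab_size, embedding_dim,
              embeddings_initializer=Constant(embedding_matrix),
              trainable=False),                 # freeze pre-trained vectors
    LSTM(128, return_sequences=True),           # pass the full sequence onward
    Dropout(0.5),
    LSTM(128),                                  # keep only the final state
    Dropout(0.5),
    Dense(num_classes, activation="softmax"),   # one probability per emoji
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Freezing the embedding layer (`trainable=False`) keeps the pre-trained vectors intact, which matters when the labeled training set is small.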