Matthew Peters on Twitter: "Our paper "Deep contextualized word representations" is now on Arxiv. ELMo representations from pre-trained language models set new SOTA for 6 diverse NLP tasks, SQuAD, SNLI, SRL, coref,
ELMO & GLOVE word representations Part 1 - YouTube
I Tried Running the Contextualized Word Representation Model "ELMo" | DevelopersIO
ELMo Embedding | by abhisht tiwari | Medium
Contextualized Word Embeddings
Clear Guide on How to Use Elmo embeddings · Issue #1737 · allenai/allennlp · GitHub