We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy). Our word vectors are learned functions of the internal states of a deep bidirectional language model (biLM), which is pre-trained on a large text corpus. We show that these representations can be easily added to existing models and significantly improve the state of the art across six challenging NLP problems, including question answering, textual entailment and sentiment analysis. We also present an analysis showing that exposing the deep internals of the pre-trained network is crucial, allowing downstream models to mix different types of semi-supervision signals.
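As a rough illustration of how exposing the biLM's internal layers lets a downstream model mix different signals, the sketch below combines per-layer hidden states with softmax-normalized scalar weights and a task-specific scale, in the spirit of the paper's layer-weighting scheme. The shapes, variable names, and NumPy implementation are illustrative assumptions, not the authors' code.

    import numpy as np

    def mix_bilm_layers(layer_states, layer_scores, gamma):
        """Combine biLM layer activations into one contextual vector per token.

        layer_states: array of shape (num_layers, seq_len, dim) -- hidden states
                      from a pre-trained biLM (hypothetical shapes for illustration).
        layer_scores: unnormalized scalar score per layer, learned by the
                      downstream task.
        gamma:        task-specific scale applied to the mixed representation.
        """
        # Softmax-normalize the per-layer scores so they sum to 1.
        weights = np.exp(layer_scores - np.max(layer_scores))
        weights /= weights.sum()
        # Weighted sum over layers, then scale: one vector per token.
        mixed = np.tensordot(weights, layer_states, axes=1)  # (seq_len, dim)
        return gamma * mixed

    # Toy usage: 3 biLM layers, 5 tokens, 8-dimensional states.
    states = np.random.randn(3, 5, 8)
    vectors = mix_bilm_layers(states, layer_scores=np.zeros(3), gamma=1.0)
    print(vectors.shape)  # (5, 8)

With equal layer scores this reduces to a simple average of the layers; in practice the scores and scale are learned jointly with the downstream task, which is what lets each task pick the mixture of biLM layers it finds most useful.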
Anthology ID: N18-1202
Volume: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
Month: June
Year: 2018
Address: New Orleans, Louisiana
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 2227–2237
URL: https://aclanthology.org/N18-1202
DOI: 10.18653/v1/N18-1202
Cite (ACL):
Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, and Luke Zettlemoyer. 2018. Deep Contextualized Word Representations. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), pages 2227–2237, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Deep Contextualized Word Representations (Peters et al., NAACL 2018)
PDF: https://aclanthology.org/N18-1202.pdf
Note: N18-1202.Notes.pdf
Video: http://vimeo.com/277672840
Code: additional community code
Data: ACL ARC, CoNLL++, CoNLL-2003, OntoNotes 5.0, Penn Treebank, Reddit Corpus, SNLI, SQuAD, SST, Word Sense Disambiguation: a Unified Evaluation Framework and Empirical Comparison