Classic NLP Papers

Q: Which NLP-related papers have you read? Pick one and give a brief introduction.


| Model | Full title | Published |
| --- | --- | --- |
| Word2vec | Efficient Estimation of Word Representations in Vector Space | 2013.01 |
| Seq2seq | Sequence to Sequence Learning with Neural Networks | 2014.09 |
| Attention | Attention Is All You Need | 2017.06 |
| Transformer (vanilla) | Character-Level Language Modeling with Deeper Self-Attention | 2018.08 |
| ELMo | Deep contextualized word representations | 2018.02 |
| Transformer (universal) | Universal Transformers | 2018.07 |
| GPT | Improving Language Understanding by Generative Pre-Training | 2018.06 |
| BERT | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | 2018.10 |
| GPT-2 | Language Models are Unsupervised Multitask Learners | 2019.02 |
| Transformer-XL | Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context | 2019.01 |
| XLNet | XLNet: Generalized Autoregressive Pretraining for Language Understanding | 2019.06 |

Many of the models above are implemented in the huggingface/transformers library:

https://github.com/huggingface/transformers
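As a quick illustration, here is a minimal sketch of loading a pretrained BERT checkpoint with that library and encoding one sentence; the checkpoint name "bert-base-uncased" and the sample sentence are placeholder choices, not something prescribed by the papers above.

```python
# Minimal sketch: load a pretrained BERT and encode one sentence.
# Assumes `pip install transformers torch`; "bert-base-uncased" is just
# an example checkpoint name.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Attention is all you need.", return_tensors="pt")
outputs = model(**inputs)

# last_hidden_state holds one contextual vector per token,
# shape (1, seq_len, 768) for bert-base.
print(outputs.last_hidden_state.shape)
```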


Download links:


Word2vec 

https://arxiv.org/pdf/1301.3781.pdf

https://github.com/danielfrg/word2vec
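If you want to try word2vec hands-on, one common route is the gensim implementation (a different library from the wrapper linked above); the toy corpus and hyperparameters below are made up purely for illustration.

```python
# Toy word2vec (skip-gram) training run with gensim -- the corpus and
# hyperparameters here are illustrative only.
from gensim.models import Word2Vec

sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "log"],
]
# vector_size / window / min_count follow gensim 4.x naming; sg=1 = skip-gram
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

# Nearest neighbours of "cat" in the learned embedding space
print(model.wv.most_similar("cat"))
```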


Seq2seq 

https://arxiv.org/pdf/1409.3215.pdf


Attention

https://arxiv.org/pdf/1706.03762.pdf


Transformer (vanilla)

https://arxiv.org/pdf/1808.04444.pdf

https://github.com/tensorflow/tensor2tensor


ELMo

https://arxiv.org/pdf/1802.05365.pdf


Transformer (universal)

https://arxiv.org/pdf/1807.03819.pdf


GPT

https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf


BERT

https://arxiv.org/pdf/1810.04805.pdf

https://github.com/google-research/bert


GPT-2

https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf

https://github.com/openai/gpt-2
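For a quick taste of GPT-2 without setting up the official repo, the transformers text-generation pipeline is an easy alternative; the "gpt2" checkpoint name, prompt, and length below are arbitrary illustrative choices.

```python
# Minimal GPT-2 sampling sketch via the transformers pipeline
# (an alternative to the official openai/gpt-2 repo linked above).
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# Prompt and max_length are arbitrary illustrative choices.
print(generator("NLP interview question:", max_length=30, num_return_sequences=1))
```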


Transformer-XL 

https://arxiv.org/pdf/1901.02860.pdf

https://github.com/kimiyoung/transformer-xl


XLNet

https://arxiv.org/pdf/1906.08237.pdf

https://github.com/zihangdai/xlnet


Go to the WeChat public account and reply "NLP论文" to get the full set of papers.




Recommended reading

Self-introduction

Common HR interview questions

Use cases for data normalization

Interview questions on overfitting

Interview questions on multi-class classification

Common interview questions on decision trees

Methods for handling missing values in feature vectors


Tags: pdf, arxiv, language, transformer, gpt, interview