Readings

Natural Language Processing (NLP)

Language Modeling

| Paper Title | Year | Reason |
| --- | --- | --- |
| Hopfield Networks is All You Need | 2020 | Examines the Transformer and BERT architectures from the perspective of Hopfield networks. |
| Pre-training via Paraphrasing | 2020 | An improvement over masked language modeling techniques. |
| Language Models are Few-Shot Learners | 2020 | The famous GPT-3 model. Needs no other reason. |
| Big Bird: Transformers for Longer Sequences | 2020 | Sparse attention mechanism enabling processing of longer sequences. |
| LayoutLM: Pre-training of Text and Layout for Document Image Understanding | 2020 | Incorporates layout (position and image) embeddings into BERT pre-training. |
| Exploring the Limits of Language Modeling | 2016 | |
| Don't Stop Pretraining: Adapt Language Models to Domains and Tasks | 2020 | Investigates the effectiveness of using task-related data in pretraining LMs. |

Controllable Text Generation

| Paper Title | Year | Reason |
| --- | --- | --- |
| Exploring Controllable Text Generation Techniques | 2020 | Great work putting controllable text generation papers into a unified framework. |

Machine Translation (MT)

| Paper Title | Year | Reason |
| --- | --- | --- |
| Convolutional Sequence to Sequence Learning | 2017 | Introduced the convolution operation to Seq2Seq problems. |

Relation Extraction (RE)

| Paper Title | Year | Reason |
| --- | --- | --- |
| Simple BERT Models for Relation Extraction and Semantic Role Labeling | 2019 | Applies BERT to relation extraction. |

Semantic Role Labeling (SRL)

| Paper Title | Year | Reason |
| --- | --- | --- |
| Simple BERT Models for Relation Extraction and Semantic Role Labeling | 2019 | Applies BERT to semantic role labeling. |

Open Information Extraction (OpenIE)

| Paper Title | Year | Reason |
| --- | --- | --- |
| Supervised Open Information Extraction | 2018 | Frames OpenIE as a sequence tagging problem. |

Document Intelligence

| Paper Title | Year | Reason |
| --- | --- | --- |
| LayoutLM: Pre-training of Text and Layout for Document Image Understanding | 2020 | Incorporates layout (position and image) embeddings into BERT pre-training. |

Computer Vision (CV)

Other

| Paper Title | Year | Reason |
| --- | --- | --- |
| Towards Learning Convolutions from Scratch | 2020 | |

Time Series Analysis

| Paper Title | Year | Reason |
| --- | --- | --- |
| Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting | 2019 | |