
Dive into AI


Seungmi Oh

[Paper Review] Language Models are Unsupervised Multitask Learners (GPT-2)

Posted on June 12, 2021

GPT-2 [Read More]
Tags: gpt2

[Paper Review] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

Posted on June 11, 2021

BERT [Read More]
Tags: BERT, pre-trained model, NLP

[Paper Review] Deep contextualized word representations (ELMo)

Posted on June 11, 2021

ELMo [Read More]
Tags: language model, word representation

Transformer and Self-Attention

Posted on June 11, 2021

NLP Basics: transformer, self-attention [Read More]
Tags: transformer, attention

[Paper Review] On Calibration of Modern Neural Networks

Posted on June 1, 2021

[Read More]
Tags: reliable ai, paper review

Seungmi Oh  •  2021
