Resource 1: Self-supervised Learning: Generative or Contrastive
Resource 2: Generative Self-supervised Learning in LLM Pre-Training task
Resource 3: Understanding the fundamental difference between the GPT family and BERT: auto-regressive vs. auto-encoding language models explained
Resource 4: The Transformer model family | Hugging Face
Generative Pre-Training tasks
- Auto-Encoder (AE) Models: BERT (MLM & NSP)
- Auto-Regressive (AR) Models: GPT
- Encoder-Decoder: T5
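The core difference between the AE and AR objectives above can be sketched through their attention patterns. A minimal sketch (assuming a toy sequence length; function names are illustrative, not from any library):

```python
def causal_mask(n):
    # AR models (GPT-style): token i attends only to positions <= i,
    # so pre-training predicts the next token left to right.
    return [[1 if j <= i else 0 for j in range(n)] for i in range(n)]

def bidirectional_mask(n):
    # AE models (BERT-style): every token attends to all positions;
    # masked language modeling (MLM) recovers randomly masked tokens.
    return [[1] * n for _ in range(n)]

print(causal_mask(3))         # lower-triangular: [[1,0,0],[1,1,0],[1,1,1]]
print(bidirectional_mask(3))  # all ones: [[1,1,1],[1,1,1],[1,1,1]]
```

An encoder-decoder model such as T5 combines both: bidirectional attention in the encoder, causal attention in the decoder.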
Downstream tasks
- Auto-Encoder (AE) Models: Text Understanding (Text Classification, Token Classification, Extractive Question Answering)
- Auto-Regressive (AR) Models: Text Generation
- Encoder-Decoder: Sequence-to-Sequence tasks (Text Translation, Text Summarization)
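The task-to-family mapping above can be captured as a small lookup. A minimal sketch (the dictionary keys follow Hugging Face pipeline task names, but the helper itself is hypothetical):

```python
# Which model family typically handles which downstream task,
# per the taxonomy in the notes above.
TASK_TO_FAMILY = {
    "text-classification": "auto-encoder (BERT-style)",
    "token-classification": "auto-encoder (BERT-style)",
    "question-answering": "auto-encoder (BERT-style)",
    "text-generation": "auto-regressive (GPT-style)",
    "translation": "encoder-decoder (T5-style)",
    "summarization": "encoder-decoder (T5-style)",
}

def suggest_family(task: str) -> str:
    # Fall back to a neutral answer for tasks outside the taxonomy.
    return TASK_TO_FAMILY.get(task, "unknown task")

print(suggest_family("translation"))  # encoder-decoder (T5-style)
```

In practice the boundaries blur: large AR models also handle understanding tasks via prompting, so this mapping reflects the classic pre-training-era division rather than a hard rule.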