BERT
- 📙Paper: “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding”
- 🔑Public: ✅
- ⚲ Area: Model
- 📅 Date: 2018-10-11
- 🔎 Paper Section: methods / fine-tuning
- 📝 References: 63