reference
Transformer-based pre-trained language models are categorized into encoder-only models (e.g., BERT) for understanding and classifying text, decoder-only models (e.g., GPT) for generating coherent text, and encoder-decoder models (e.g., T5) for tasks requiring both comprehension and generation.
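The architectural split above comes down largely to the attention mask each family uses: encoder-only models like BERT attend bidirectionally over the whole input, while decoder-only models like GPT use a causal mask so each token sees only its predecessors (encoder-decoder models like T5 combine both). A minimal sketch of the two mask shapes, with `attention_mask` as an illustrative helper name not taken from the source:

```python
import numpy as np

def attention_mask(seq_len: int, causal: bool) -> np.ndarray:
    """Return a seq_len x seq_len mask: 1 where attention is allowed.

    causal=False -> bidirectional attention (encoder-only, e.g. BERT).
    causal=True  -> lower-triangular causal mask (decoder-only, e.g. GPT).
    """
    if causal:
        # Token i may attend only to positions j <= i.
        return np.tril(np.ones((seq_len, seq_len), dtype=int))
    # Every token may attend to every position.
    return np.ones((seq_len, seq_len), dtype=int)

print(attention_mask(3, causal=False))  # all ones: full bidirectional context
print(attention_mask(3, causal=True))   # lower triangle: no peeking ahead
```

In practice an encoder-decoder model applies the bidirectional mask in its encoder and the causal mask in its decoder, which is why it suits tasks requiring both comprehension and generation.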
Authors
Sources
- The Synergy of Symbolic and Connectionist AI in LLM-Empowered ... arxiv.org via serper
Referenced by nodes (4)
- natural language processing concept
- BERT concept
- GPT concept
- T5 concept