LLM Paper Reading Notes
| Topic | Paper | Link |
|---|---|---|
| RoPE (sketch after the table) | RoFormer: Enhanced Transformer with Rotary Position Embedding | RoPE paper notes |
| YaRN | YaRN: Efficient Context Window Extension of Large Language Models | YaRN paper notes |
| T5 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | Google T5 notes |
| ReAct | ReAct: Synergizing Reasoning and Acting in Language Models | ReAct for LLMs |
| Q-Former | BLIP-2: Bootstrapping Language-Image Pre-training with Frozen Image Encoders and Large Language Models | BLIP-2 paper walkthrough |
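
As a quick companion to the RoPE row above, here is a minimal NumPy sketch of the rotary rotation described in RoFormer. It uses the split-half pairing convention (first half of the head dimension rotated against the second half, as in GPT-NeoX/LLaMA) rather than RoFormer's adjacent-pair layout; the function name `rope_rotate` and the `base=10000.0` default are illustrative assumptions, not code taken from any of the listed papers.

```python
import numpy as np

def rope_rotate(x, base=10000.0):
    """Apply a rotary position embedding to x of shape (seq_len, dim).

    Each channel pair is rotated by an angle that grows with the token
    position and shrinks with the channel index, so the dot product of two
    rotated vectors depends only on their relative offset (the RoPE property).
    Note: this uses the split-half pairing convention, an assumption for
    brevity; RoFormer itself pairs adjacent channels (2i, 2i+1).
    """
    seq_len, dim = x.shape
    assert dim % 2 == 0, "RoPE expects an even head dimension"
    half = dim // 2
    # Per-pair frequencies: base^(-2i/d), i = 0 .. d/2 - 1
    inv_freq = base ** (-np.arange(half) * 2.0 / dim)
    # Angle for every (position, channel-pair) combination
    angles = np.outer(np.arange(seq_len), inv_freq)   # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]                 # split into pairs
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)

# Example: rotate a random (seq_len=8, dim=64) block of query vectors.
q = np.random.randn(8, 64)
print(rope_rotate(q).shape)  # (8, 64)
```

YaRN starts from this same angle schedule: roughly speaking, it rescales the per-band frequencies (and adjusts an attention temperature) so that a pretrained model can attend beyond its original context window.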