mcxiaoxiao / annotated-transformer-ChineseLinks
Chinese edition of Harvard's classic introductory Transformer tutorial, annotated-transformer: a PyTorch implementation, with Chinese annotations, of the Transformer paper "Attention is All You Need", translated from harvardnlp/annotated-transformer
☆60 · Updated last year
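Since the tutorial above annotates an implementation of "Attention is All You Need", here is a minimal NumPy sketch of the paper's core operation, scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. This is an illustrative standalone version, not code taken from the repository:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # (n_queries, n_keys) similarities
    # numerically stable softmax over the key dimension
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                          # weighted sum of value vectors

# toy example: 2 queries, 3 key/value pairs, d_k = 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4): one output vector per query
```

The √d_k scaling keeps the dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients.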
Alternatives and similar repositories for annotated-transformer-Chinese
Users interested in annotated-transformer-Chinese are comparing it to the repositories listed below
- A collection of learning paths and best practices for large language models ☆323 · Updated last year
- Learning LLM Implementation and Theory for Practical Deployment ☆195 · Updated last year
- A comparison of DeepSeek GRPO and Qwen GSPO for fine-tuning Qwen2.5-1.5B-Instruct ☆116 · Updated 3 months ago
- Exercise solutions for "Dive into Deep Learning" (《动手学深度学习》); online reading address as follows: ☆556 · Updated last year
- Handwritten interview questions from LLM (the main focus) and search/advertising/recommendation AI algorithm interviews (not LeetCode), e.g. Self-Attention and AUC; these generally test overall ability more than LeetCode does and sit closer to real business problems and fundamentals
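As an example of the handwritten interview questions the last entry describes, here is a sketch of computing AUC directly from its probabilistic definition, P(score of a random positive > score of a random negative), with ties counted as 1/2. The function name and toy data are illustrative, not taken from that repository:

```python
def auc_by_pairs(labels, scores):
    """AUC as the fraction of (positive, negative) pairs ranked correctly.

    labels: iterable of 0/1 ground-truth labels
    scores: iterable of model scores, same length
    Ties contribute 1/2, matching the standard definition.
    """
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one positive and one negative example")
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 0, 1, 0, 1]
scores = [0.9, 0.4, 0.7, 0.5, 0.3]
print(auc_by_pairs(labels, scores))  # 4 of 6 pairs correct -> 0.6666666666666666
```

This O(n_pos × n_neg) version is the one interviewers usually expect first; the follow-up is typically to derive the O(n log n) variant via sorting and ranks.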