zxuu / Self-Attention
A complete implementation of the Transformer: the Encoder, Decoder, and Self-attention are built out in detail and demonstrated on a concrete example, covering the full input, training, and prediction pipeline. Useful for learning and understanding self-attention and the Transformer.
☆117 Updated 8 months ago
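The core operation the repository above teaches is scaled dot-product self-attention, softmax(QKᵀ/√d_k)·V. A minimal, dependency-free sketch of that computation (all function names here are illustrative, not taken from the repo, which uses PyTorch):

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def matmul(a, b):
    # (n, k) x (k, m) -> (n, m)
    return [[sum(a[i][t] * b[t][j] for t in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(m):
    return [list(row) for row in zip(*m)]

def self_attention(Q, K, V):
    d_k = len(K[0])
    scores = matmul(Q, transpose(K))                        # pairwise similarities QK^T
    scaled = [[s / math.sqrt(d_k) for s in row] for row in scores]
    weights = [softmax(row) for row in scaled]              # each row sums to 1
    return matmul(weights, V)                               # weighted sum of values

# toy example: two tokens with 2-dim embeddings, Q = K = V
x = [[1.0, 0.0], [0.0, 1.0]]
out = self_attention(x, x, x)
```

Because each output row is a convex combination of the value rows, each token's output here mixes its own embedding with the other token's, weighted toward itself. Real implementations add learned Q/K/V projections, multiple heads, and masking on top of this kernel.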
Alternatives and similar repositories for Self-Attention
Users interested in Self-Attention are comparing it to the repositories listed below.
- Chinese documentation for Huggingface transformers ☆285 Updated 2 years ago
- everything about llm & aigc ☆108 Updated 2 weeks ago
- TinyRAG ☆393 Updated 6 months ago
- Code for AI projects and data-mining competition entries ☆108 Updated 5 months ago
- Hand-written interview questions from LLM (the focus), search/ads/recommendation, and other AI algorithm interviews (not LeetCode), e.g. Self-Attention and AUC; these generally probe overall ability more than LeetCode does and sit closer to real business problems and fundamentals ☆463 Updated last year
- AI training course materials ☆144 Updated last month
- Notes on reproducing LLM components from scratch ☆241 Updated 8 months ago
- MindSpore online courses: Step into LLM ☆482 Updated 2 weeks ago
- The most concise PyTorch implementation of the Transformer model, with detailed comments ☆227 Updated 2 years ago
- personal chatgpt ☆402 Updated last year
- DPO training for Tongyi Qianwen (Qwen) ☆61 Updated last year
- Learning LLM Implementation and Theory for Practical Landing ☆192 Updated last year
- ☆82 Updated last month
- Big-tech programmer interview questions and interview experiences ☆201 Updated 7 months ago
- modern AI for beginners ☆185 Updated 3 months ago
- Building a MiniLLM from 0 to 1 (pretrain + SFT + DPO, in progress) ☆512 Updated 9 months ago
- ☆358 Updated 8 months ago
- LLM inference and deployment: theory and practice ☆369 Updated 5 months ago
- ☆125 Updated last year
- Unlocking the many uses of the HuggingFace ecosystem ☆97 Updated last year
- A very, very small RAG system ☆331 Updated 8 months ago
- RAG interest group: a fully hand-written RAG application. Langchain's libraries are convenient, but you won't necessarily understand the underlying principles, so this code exposes the basic algorithms as much as possible, with the goal of understanding how RAG works ☆244 Updated last year
- WWW2025 Multimodal Intent Recognition for Dialogue Systems Challenge ☆129 Updated last year
- An overview of the LLM technology stack ☆122 Updated last year
- DeepSpeed Tutorial ☆104 Updated last year
- Walking through ChatGPT's technical roadmap from scratch ☆270 Updated last year
- LLM Tokenizer with BPE algorithm ☆46 Updated last year
- LLM fundamentals study notes and common interview questions ☆183 Updated last year
- ☆119 Updated 10 months ago
- Full-parameter, LoRA, and QLoRA fine-tuning of llama3 ☆212 Updated last year