zxuu / Self-Attention
A complete implementation of the Transformer, building the Encoder, Decoder, and Self-attention in detail. A concrete example walks through the full input, training, and prediction pipeline. Useful for learning and understanding self-attention and the Transformer.
☆76 · Updated 2 weeks ago
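The repository's core topic, scaled dot-product self-attention, can be sketched in a few lines of NumPy. This is an illustrative single-head sketch with hypothetical names, not code taken from the repository:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X          : (seq_len, d_model) input embeddings
    Wq, Wk, Wv : (d_model, d_k) projection matrices
    Returns a (seq_len, d_k) matrix of attended outputs.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # weighted sum of value vectors

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 16))            # 5 tokens, d_model = 16
Wq, Wk, Wv = (rng.normal(size=(16, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 8)
```

In a full Transformer this block is repeated per head, the heads are concatenated and projected, and (in the decoder) the score matrix is masked before the softmax.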
Alternatives and similar repositories for Self-Attention:
Users interested in Self-Attention are comparing it to the repositories listed below.
- Hand-written interview questions (not LeetCode) for LLMs (the main focus) and other AI-algorithm roles such as search, ads, and recommendation, e.g. Self-Attention and AUC. These generally test overall ability more than LeetCode does and sit closer to real business problems and fundamentals ☆238 · Updated 3 months ago
- A very, very small RAG system ☆207 · Updated 4 months ago
- Large language model applications: RAG, NL2SQL, chatbots, pretraining, MoE (mixture-of-experts) models, fine-tuning, reinforcement learning, and Tianchi data competitions ☆60 · Updated 2 months ago
- The Transformer is the model Google used in its 2017 paper Attention Is All You Need; after years of extensive industrial use and validation in the literature, it occupies an important place in deep learning. BERT is a language model derived from the Transformer. Using Chinese-to-English translation as an example, this explains the Tran… ☆250 · Updated last year
- DPO training for Tongyi Qianwen (Qwen) ☆46 · Updated 7 months ago
- WWW2025 Multimodal Intent Recognition for Dialogue Systems Challenge ☆120 · Updated 5 months ago
- Notes on from-scratch reproductions of LLM components ☆185 · Updated last week
- LLM inference and deployment, in theory and practice ☆244 · Updated last month
- A quick start to RAG and private deployment ☆175 · Updated last year
- TinyRAG ☆286 · Updated 5 months ago
- A place to store code for AI projects and data-mining competition entries ☆97 · Updated last month
- Unlocking the many uses of the HuggingFace ecosystem ☆89 · Updated 4 months ago
- RAG paper study notes ☆117 · Updated last month
- Data science tutorial case studies ☆140 · Updated 4 months ago
- LLM fundamentals and standard interview questions ☆108 · Updated last year
- pytorch distribute tutorials ☆123 · Updated this week
- First-place (top 1) solution to the Tianchi algorithm competition "BetterMixture - LLM Data Mixture Challenge" ☆29 · Updated 9 months ago
- The most concise PyTorch implementation of the Transformer model, with detailed comments ☆190 · Updated last year
- An overview of the LLM technology stack ☆94 · Updated 7 months ago
- A Transformer Framework Based Translation Task ☆150 · Updated 2 months ago
- ☆70 · Updated 2 months ago
- Alibaba Tianchi: 2023 Global Intelligent Vehicle AI Challenge, Track 1: LLM retrieval QA, baseline scoring 80+ ☆99 · Updated last year
- Qwen1.5-SFT (Alibaba/Ali): fine-tuning Qwen_Qwen1.5-2B-Chat/Qwen_Qwen1.5-7B-Chat with transformers / LoRA (peft) / inference ☆57 · Updated 11 months ago
- an implementation of transformer, bert, gpt, and diffusion models for learning purposes ☆153 · Updated 6 months ago
- everything about llm & aigc ☆61 · Updated last week
- AI training courseware resources ☆83 · Updated this week
- ☆108 · Updated 9 months ago
- Training an LLM from scratch on a single 24 GB GPU ☆53 · Updated 6 months ago
- ☆87 · Updated last month
- DeepSpeed Tutorial ☆95 · Updated 8 months ago