CastleDream / d2l_learning
Materials for the Dive into Deep Learning (《动手学深度学习》) course
☆46 · Updated 2 years ago
Alternatives and similar repositories for d2l_learning
Users that are interested in d2l_learning are comparing it to the libraries listed below
- Companion videos for the blog: https://space.bilibili.com/383551518?spm_id_from=333.1007.0.0 (watch directly on Bilibili); companion GitHub repo: https://github.com/nickchen121/Pre-trainin… ☆488 · Updated 3 years ago
- ☆246 · Updated 4 years ago
- An introductory hands-on LLM training course for science and engineering students ☆116 · Updated 5 months ago
- A complete implementation of the Transformer, building the Encoder, Decoder, and Self-attention in detail. Demonstrated on a concrete example with the full input, training, and prediction pipeline; useful for understanding self-attention and the Transformer ☆123 · Updated 9 months ago
- Learning LLM implementation and theory for practical deployment ☆195 · Updated last year
- ☆260 · Updated 2 years ago
- Solutions to the exercises in Dive into Deep Learning (《动手学深度学习》); read online at the address below: ☆556 · Updated last year
- ☆380 · Updated 9 months ago
- LLM fundamentals and standard interview questions ☆194 · Updated last year
- https://hnlp.boyuai.com ☆106 · Updated last year
- A Chinese tutorial on Git ☆171 · Updated last week
- Everything about LLMs & AIGC ☆111 · Updated last month
- Hand-written interview questions (not LeetCode) for LLM roles (the main focus) as well as search, advertising, and recommendation AI algorithm roles, e.g. Self-Attention and AUC; these generally test overall ability more than LeetCode does and sit closer to real business problems and fundamentals ☆479 · Updated last year
- This repository walks you through building classic NLP neural networks from scratch using PyTorch linear layers ☆44 · Updated last year
- Interview questions and interview experiences from programmer interviews at major tech companies ☆206 · Updated 8 months ago
- The Transformer is the model introduced in Google's 2017 paper Attention Is All You Need; after years of heavy industrial use and validation in the literature, it holds a central place in deep learning. BERT is a language model derived from the Transformer. Using Chinese-to-English translation as an example, this explains the Tran… ☆287 · Updated last year
- Modern AI for beginners ☆197 · Updated 5 months ago
- ☆124 · Updated 4 years ago
- Companion code for 《机器学习必修课:经典算法与Python实战》 (Machine Learning Essentials: Classic Algorithms and Python in Practice) ☆98 · Updated 3 years ago
- ☆223 · Updated 4 years ago
- A place to store code from AI projects and data-mining competition entries ☆109 · Updated 6 months ago
- Learning deep learning works best when you write code as you learn; only by working through it yourself do you understand how the data is transformed and how the parameters are trained. This collects the Jupyter notebooks from the Bilibili videos so you can follow along and practice; hopefully it helps ☆138 · Updated 3 years ago
- Datawhale NLP interview experience notes ☆246 · Updated 4 years ago
- Chinese documentation for Hugging Face Transformers ☆293 · Updated 2 years ago
- A super-simple CLIP model on the MNIST dataset, for study ☆163 · Updated last year
- ☆126 · Updated 11 months ago
- A comprehensive collection of written tests and interviews for algorithm roles, aspiring to be the 《五年高考,三年模拟》 (Five Years of Gaokao, Three Years of Mock Exams) of the algorithm world! ☆688 · Updated 10 months ago
- Learn Deep Learning with Me (《跟我一起深度学习》), produced by @月来客栈 ☆239 · Updated last month
- LLM-related content, including fundamentals, standard interview questions, interview experiences, and classic papers ☆310 · Updated last year
- Knowledge-point notes and cheat sheets for AI courses at top universities worldwide ☆304 · Updated 4 years ago