mindspore-courses / d2l-mindspore
MindSpore implementation of Dive into Deep Learning (《动手学深度学习》), for MindSpore learners to use alongside Mu Li's course.
☆116 · Updated last year
Alternatives and similar repositories for d2l-mindspore
Users that are interested in d2l-mindspore are comparing it to the libraries listed below
- This repository is used for storing information about MindSpore ☆11 · Updated last year
- MindSpore online courses: Step into LLM ☆468 · Updated 5 months ago
- Dive into Deep Learning (《动手学深度学习》): aimed at Chinese readers, runnable, and open for discussion. The Chinese and English editions are used for teaching at over 400 universities in more than 60 countries. ☆23 · Updated 2 years ago
- ☆18 · Updated 3 months ago
- Helps newcomers quickly get started with and grow accustomed to the official OpenMMLab open-source library documentation, run experiments on their own, and freely choose to read deeper material. ☆63 · Updated 2 years ago
- MindSpore implementation of transformers ☆67 · Updated 2 years ago
- Natural Language Processing Tutorial for MindSpore Users ☆142 · Updated last year
- ☆83 · Updated last month
- Implementation of Denoising Diffusion Probabilistic Model in MindSpore ☆36 · Updated 2 years ago
- A Chinese tutorial for Git ☆155 · Updated last year
- A TinyRAG implementation based on MindSpore ☆16 · Updated 5 months ago
- PyTorch distributed training tutorials ☆136 · Updated last week
- A beginner's introductory tutorial on model compression ☆283 · Updated 6 months ago
- ☆18 · Updated 2 years ago
- Study notes from a tutorial on using the MS COCO dataset (object detection) ☆65 · Updated 5 years ago
- Essential computer science and software development knowledge ☆33 · Updated this week
- https://hcv.boyuai.com ☆70 · Updated 5 months ago
- A simplified blip-pytorch codebase, suitable for understanding the structure of Attention and the Transformer. ☆51 · Updated last year
- Inference code for LLaMA models ☆121 · Updated last year
- A more efficient yolov5 with oneflow backend 🎉🎉🎉 ☆215 · Updated 3 weeks ago
- ☆23 · Updated 2 years ago
- A PyTorch-like dynamic computation graph and neural network framework implemented in NumPy (MLP, CNN, RNN, Transformer) ☆81 · Updated 11 months ago
- ☆49 · Updated 2 years ago
- The official implementation of Natural Language Fine-Tuning ☆50 · Updated 4 months ago
- ☆224 · Updated 2 months ago
- ☆78 · Updated 8 months ago
- Personal reading notes on papers in large language models, natural language processing (NLP), multimodal learning, computer vision (CV), and related areas; a collection of excellent open-source repositories in NLP, CV, and other fields; plus extras such as datasets and evaluation leaderboards ☆48 · Updated last week
- The Transformer is the model Google used in its 2017 paper Attention Is All You Need; after years of extensive industrial use and validation in the literature, it now holds an important place in deep learning. BERT is a language model derived from the Transformer. Using Chinese-to-English translation as an example, I explain Tran… ☆258 · Updated last year
- The most concise PyTorch implementation of the Transformer model, with detailed comments ☆198 · Updated last year
- Training camp for the PaddlePaddle (飞桨) "护航计划" (Escort Program) ☆18 · Updated 2 weeks ago