MegEngine / MegEngine
MegEngine is a fast, scalable, easy-to-use deep learning framework with automatic differentiation.
☆4,807 · Updated last year
Alternatives and similar repositories for MegEngine
Users interested in MegEngine are comparing it to the libraries listed below.
- Jittor is a high-performance deep learning framework based on JIT compiling and meta-operators. ☆3,220 · Updated 5 months ago
- TNN: a uniform deep learning inference framework for mobile, desktop and server, developed by Tencent Youtu Lab and Guangying Lab. TNN is … ☆4,603 · Updated 7 months ago
- Various mainstream deep learning models implemented with MegEngine. ☆307 · Updated 3 years ago
- A high-performance and generic framework for distributed DNN training. ☆3,715 · Updated 2 years ago
- OneFlow is a deep learning framework designed to be user-friendly, scalable and efficient. ☆9,383 · Updated 3 weeks ago
- Homepage for the joint course of Megvii Inc. and Peking University on Deep Learning. ☆448 · Updated 3 years ago
- LightSeq: A High Performance Library for Sequence Processing and Generation. ☆3,300 · Updated 2 years ago
- Bolt is a deep learning library with high performance and heterogeneous flexibility. ☆955 · Updated 8 months ago
- An Automatic Model Compression (AutoMC) framework for developing smaller and faster AI applications. ☆2,910 · Updated 2 years ago
- NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source compone… ☆12,526 · Updated 2 weeks ago
- Pre-trained models for a variety of business scenarios, based on Megvii Research's leading deep learning algorithms. ☆92 · Updated last year
- PaddleSlim is an open-source library for deep model compression and architecture search. ☆1,610 · Updated 2 months ago
- 🛠A lite C++ AI toolkit: 100+ models with MNN, ORT and TRT, including Det, Seg, Stable-Diffusion, Face-Fusion, etc.🎉 ☆4,332 · Updated 2 weeks ago
- MegEngine documentation. ☆44 · Updated 4 years ago
- A primitive library for neural networks. ☆1,369 · Updated last year
- MindSpore is a new open source deep learning training/inference framework that can be used for mobile, edge and cloud scenarios. ☆4,655 · Updated last year
- Simplify your onnx model. ☆4,255 · Updated 4 months ago
- Officially maintained and supported by PaddlePaddle, including CV, NLP, Speech, Rec, TS, big models and so on. ☆6,942 · Updated 11 months ago
- MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms. ☆5,029 · Updated last year
- FeatherCNN is a high-performance inference engine for convolutional neural networks. ☆1,224 · Updated 6 years ago
- A flexible, high-performance serving framework for machine learning models (the PaddlePaddle serving and deployment framework). ☆919 · Updated 3 weeks ago
- An easy-to-use PyTorch to TensorRT converter. ☆4,842 · Updated last year
- AutoML toolchain. ☆852 · Updated 2 years ago
- Deep learning model converter for PaddlePaddle. ☆766 · Updated 2 months ago
- ☆1,509 · Updated 5 years ago
- A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT-2, decoders, etc.) on CPU and GPU. ☆1,535 · Updated 5 months ago
- ONNX-TensorRT: TensorRT backend for ONNX. ☆3,177 · Updated last month
- MNN is a blazing fast, lightweight deep learning framework, battle-tested by business-critical use cases in Alibaba. Full multimodal LLM … ☆13,765 · Updated this week
- micronet, a model compression and deployment lib. Compression: 1. quantization: quantization-aware training (QAT), High-Bit (>2b) (DoReFa/Quantiz… ☆2,268 · Updated 7 months ago
- Tengine is a lite, high-performance, modular inference engine for embedded devices. ☆4,505 · Updated 9 months ago