MegEngine / MegEngine
MegEngine is a fast, scalable, easy-to-use deep learning framework with automatic differentiation.
☆4,787 · Updated 5 months ago
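The automatic-differentiation claim in the tagline is easiest to picture with a short snippet. The following is a minimal sketch only, assuming the `megengine.autodiff.GradManager` API from the MegEngine documentation; tensor values and variable names are illustrative.

```python
import megengine as mge
from megengine.autodiff import GradManager

# Toy tensors; the values are illustrative.
x = mge.Tensor([2.0])
w = mge.Parameter([3.0])

gm = GradManager().attach([w])   # record gradients with respect to w
with gm:
    y = (w * x).sum()            # forward pass
    gm.backward(y)               # populate w.grad

print(w.grad)                    # dy/dw = x, i.e. 2.0 in this sketch
```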
Alternatives and similar repositories for MegEngine:
Users interested in MegEngine are comparing it to the libraries listed below.
- Various mainstream deep learning models implemented with MegEngine ☆304 · Updated 2 years ago
- Jittor is a high-performance deep learning framework based on JIT compiling and meta-operators. ☆3,138 · Updated last month
- MindSpore is a new open source deep learning training/inference framework that could be used for mobile, edge and cloud scenarios. ☆4,464 · Updated 8 months ago
- TNN: developed by Tencent Youtu Lab and Guangying Lab, a uniform deep learning inference framework for mobile, desktop and server. TNN is … ☆4,494 · Updated 3 weeks ago
- FeatherCNN is a high performance inference engine for convolutional neural networks. ☆1,217 · Updated 5 years ago
- Bolt is a deep learning library with high performance and heterogeneous flexibility. ☆940 · Updated last week
- MNN is a blazing fast, lightweight deep learning framework, battle-tested by business-critical use cases in Alibaba. Full multimodal LLM … ☆10,283 · Updated this week
- Deep learning model converter for PaddlePaddle. ☆751 · Updated last month
- A high performance and generic framework for distributed DNN training ☆3,676 · Updated last year
- An Automatic Model Compression (AutoMC) framework for developing smaller and faster AI applications. ☆2,864 · Updated 2 years ago
- Quantized Neural Network PACKage - mobile-optimized implementation of quantized neural network operators ☆1,539 · Updated 5 years ago
- Homepage for the joint course of Megvii Inc. and Peking University on Deep Learning. ☆445 · Updated 2 years ago
- Source-code analysis of the darknet deep learning framework, with detailed Chinese comments covering the framework's principles and implementation ☆1,607 · Updated 6 years ago
- PaddlePaddle High Performance Deep Learning Inference Engine for Mobile and Edge ☆7,069 · Updated 3 months ago
- PaddleSlim is an open-source library for deep model compression and architecture search. ☆1,589 · Updated 4 months ago
- Tengine is a lightweight, high-performance, modular inference engine for embedded devices ☆4,458 · Updated last month
- A primitive library for neural networks ☆1,331 · Updated 4 months ago
- micronet, a model compression and deploy lib. compression: 1) quantization: quantization-aware-training(QAT), High-Bit(>2b)(DoReFa/Quantiz… ☆2,239 · Updated this week
- ONNX-TensorRT: TensorRT backend for ONNX ☆3,060 · Updated last month
- 🔥 (yolov3 yolov4 yolov5 unet ...) A mini PyTorch inference framework inspired by darknet. ☆744 · Updated last year
- High-performance cross-platform inference engine; you can run Anakin on x86 CPU, ARM, NVIDIA GPU, AMD GPU, Bitmain, and Cambricon devices. ☆533 · Updated 2 years ago
- MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms. ☆5,009 · Updated 10 months ago
- MegEngine Documentation ☆44 · Updated 4 years ago
- Code for the paper "AdderNet: Do We Really Need Multiplications in Deep Learning?" ☆958 · Updated 3 years ago
- ncnn is a high-performance neural network inference framework optimized for the mobile platform ☆21,308 · Updated last week
- Officially maintained models supported by PaddlePaddle, including CV, NLP, Speech, Rec, TS, big models and so on. ☆6,922 · Updated 3 months ago
- Simplify your ONNX model (see the usage sketch after this list) ☆4,052 · Updated 7 months ago
- AutoML toolchain ☆850 · Updated 2 years ago
- MegCC is a deep learning model compiler with an ultra-lightweight runtime, high efficiency, and easy portability ☆482 · Updated 5 months ago
- PArallel Distributed Deep LEarning: Machine Learning Framework from Industrial Practice (the PaddlePaddle core framework: high-performance single-machine and distributed deep learning and machine learning training, plus cross-platform deployment) ☆22,688 · Updated this week
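For the onnx-simplifier entry above ("Simplify your ONNX model"), here is a minimal usage sketch, assuming the `onnxsim.simplify` Python API; the file names are hypothetical.

```python
import onnx
from onnxsim import simplify

# Hypothetical paths, for illustration only.
model = onnx.load("model.onnx")

# simplify() returns the simplified model and a flag indicating
# whether its outputs still match the original model's.
model_simplified, check_ok = simplify(model)
assert check_ok, "Simplified model failed the consistency check"

onnx.save(model_simplified, "model_simplified.onnx")
```

A typical workflow runs this kind of simplification before converting the model for an inference engine such as TNN, MNN, or ncnn from the list above.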