torch::deploy (packaged as MultiPy for non-PyTorch use cases) is a system that works around the GIL by running multiple Python interpreters inside a single C++ process.
☆179 · Updated Dec 16, 2025
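The GIL limitation that torch::deploy is built around can be seen in plain CPython: even with multiple threads, CPU-bound bytecode executes serially because only one thread holds the interpreter lock at a time. A minimal illustration (timings vary by machine, so only the results are checked here):

```python
import threading
import time

def count(n):
    # CPU-bound loop; in CPython, only one thread runs bytecode at a time
    total = 0
    for i in range(n):
        total += i
    return total

N = 2_000_000
results = []

# Sequential: two passes back to back
t0 = time.perf_counter()
results.append(count(N))
results.append(count(N))
seq = time.perf_counter() - t0

# Threaded: two threads, but the GIL serializes their bytecode,
# so wall time is roughly the same as (or worse than) sequential
threaded_results = []

def worker():
    threaded_results.append(count(N))

threads = [threading.Thread(target=worker) for _ in range(2)]
t0 = time.perf_counter()
for t in threads:
    t.start()
for t in threads:
    t.join()
thr = time.perf_counter() - t0

print(f"sequential: {seq:.2f}s, threaded: {thr:.2f}s")
```

torch::deploy's approach is to give each worker its own interpreter (and therefore its own GIL) inside one C++ process, so CPU-bound Python work can actually run in parallel.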
Alternatives and similar repositories for multipy
Users interested in multipy are comparing it to the libraries listed below.
- A Python-level JIT compiler designed to make unmodified PyTorch programs faster. ☆1,077 · Updated Apr 17, 2024
- A performant, memory-efficient checkpointing library for PyTorch applications, designed with large, complex distributed workloads in mind… ☆163 · Updated Jan 12, 2026
- TorchX is a universal job launcher for PyTorch applications. TorchX is designed to have fast iteration time for training/research and sup… ☆420 · Updated this week
- Continuous builder and binary build scripts for pytorch. ☆357 · Updated Aug 12, 2025
- Torch Distributed Experimental. ☆117 · Updated Aug 5, 2024
- e3nn tutorial for the Materials Research Society Fall Meeting 2021. ☆14 · Updated Nov 29, 2021
- TorchBench is a collection of open-source benchmarks used to evaluate PyTorch performance. ☆1,024 · Updated this week
- functorch: JAX-like composable function transforms for PyTorch. ☆1,436 · Updated Aug 21, 2025
- High-performance model preprocessing library on PyTorch. ☆647 · Updated Mar 29, 2024
- PyTorch/TorchScript/FX compiler for NVIDIA GPUs using TensorRT. ☆2,962 · Updated this week
- A tensor-aware point-to-point communication primitive for machine learning