AminRezaei0x443 / PyTorch-LIT
Lite Inference Toolkit (LIT) for PyTorch
☆161 · Updated 2 years ago
Related projects
Alternatives and complementary repositories for PyTorch-LIT
- Check if you have training samples in your test set ☆64 · Updated 2 years ago
- Library for 8-bit optimizers and quantization routines. ☆714 · Updated 2 years ago
- The "tl;dr" on a few notable transformer papers (pre-2022). ☆189 · Updated last year
- Memory Efficient Attention (O(sqrt(n))) for Jax and PyTorch ☆179 · Updated last year
- XtremeDistil framework for distilling/compressing massive multilingual neural network models to tiny and efficient models for AI at scale ☆153 · Updated 10 months ago
- An alternative to convolution in neural networks ☆250 · Updated 7 months ago
- Python Research Framework ☆107 · Updated 2 years ago
- Accelerated NLP pipelines for fast inference on CPU. Built with Transformers and ONNX runtime. ☆126 · Updated 3 years ago
- My implementation of DeepMind's Perceiver ☆63 · Updated 3 years ago
- FasterAI: Prune and Distill your models with FastAI and PyTorch ☆243 · Updated last week
- Official code for "Distributed Deep Learning in Open Collaborations" (NeurIPS 2021) ☆116 · Updated 2 years ago
- DiffQ performs differentiable quantization using pseudo quantization noise. It can automatically tune the number of bits used per weight … ☆234 · Updated last year
- A fastai/PyTorch package for unpaired image-to-image translation. ☆133 · Updated last year
- Implementation of the Adan (ADAptive Nesterov momentum algorithm) Optimizer in Pytorch ☆247 · Updated 2 years ago
- Open Source Photos Platform Powered by PyTorch ☆137 · Updated 2 years ago
- A library to synthesize text datasets using Large Language Models (LLM) ☆151 · Updated last year
- Code for the Shortformer model, from the ACL 2021 paper by Ofir Press, Noah A. Smith and Mike Lewis. ☆145 · Updated 3 years ago
- A library to inspect and extract intermediate layers of PyTorch models. ☆470 · Updated 2 years ago
- A simple library that implements CLIP guided loss in PyTorch. ☆77 · Updated 2 years ago
- Accelerated NLP pipelines for fast inference on CPU and GPU. Built with Transformers, Optimum and ONNX Runtime. ☆126 · Updated 2 years ago
- Implementation of Feedback Transformer in Pytorch ☆104 · Updated 3 years ago
- Functional deep learning ☆106 · Updated last year
- Learning to Initialize Neural Networks for Stable and Efficient Training ☆135 · Updated 2 years ago
- Deep Learning project template best practices with Pytorch Lightning, Hydra, Tensorboard. ☆155 · Updated 3 years ago
- Babysit your preemptible TPUs ☆84 · Updated last year
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in Jax (Equinox framework) ☆185 · Updated 2 years ago
- ADAS is short for Adaptive Step Size; it's an optimizer that, unlike other optimizers that just normalize the derivative, fine-tunes th… ☆85 · Updated 3 years ago
- HetSeq: Distributed GPU Training on Heterogeneous Infrastructure ☆106 · Updated last year
- Swarm training framework using Haiku + JAX + Ray for layer parallel transformer language models on unreliable, heterogeneous nodes ☆239 · Updated last year
- Recipes are a standard, well supported set of blueprints for machine learning engineers to rapidly train models using the latest research… ☆293 · Updated this week