kingoflolz / CLIP_JAX
Contrastive Language-Image Pretraining
☆141 · Updated 2 years ago
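CLIP learns a joint image-text embedding space and scores an image against candidate captions by cosine similarity between the two embeddings. The sketch below illustrates that scoring step in JAX; the embedding arrays and the `clip_logits` helper are hypothetical placeholders for illustration, not CLIP_JAX's actual loading or encoding API.

```python
# Minimal sketch of CLIP-style contrastive scoring (hypothetical inputs,
# not the CLIP_JAX API): L2-normalize image and text embeddings, then use
# a scaled cosine-similarity matrix as the image-text logits.
import jax
import jax.numpy as jnp

def clip_logits(image_features, text_features, logit_scale=100.0):
    # Normalize so the dot product equals cosine similarity.
    image_features = image_features / jnp.linalg.norm(image_features, axis=-1, keepdims=True)
    text_features = text_features / jnp.linalg.norm(text_features, axis=-1, keepdims=True)
    # [num_images, num_texts] similarities, scaled by the learned temperature.
    return logit_scale * image_features @ text_features.T

# Toy usage: 2 images vs. 3 candidate captions, 512-d embeddings.
image_emb = jnp.ones((2, 512))
text_emb = jnp.ones((3, 512))
caption_probs = jax.nn.softmax(clip_logits(image_emb, text_emb), axis=-1)
```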
Alternatives and similar repositories for CLIP_JAX
Users interested in CLIP_JAX are comparing it to the repositories listed below
- CLOOB training (JAX) and inference (JAX and PyTorch) ☆71 · Updated 3 years ago
- ESGD-M is a stochastic second-order optimizer for non-convex problems, suitable for training deep learning models with PyTorch. ☆56 · Updated 2 years ago
- v objective diffusion inference code for JAX. ☆214 · Updated 3 years ago
- Pretrained deep learning models for Jax/Flax: StyleGAN2, GPT2, VGG, ResNet, etc. ☆253 · Updated 2 months ago
- ☆63 · Updated 3 years ago
- Implementation of a Transformer that Ponders, using the scheme from the PonderNet paper ☆81 · Updated 3 years ago
- ☆88 · Updated 2 years ago
- JAX implementation of ViT-VQGAN ☆83 · Updated 2 years ago
- Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in PyTorch ☆88 · Updated 3 years ago
- GPU tester that detects broken and slow GPUs in a cluster ☆70 · Updated 2 years ago
- A JAX implementation of the continuous time formulation of Consistency Models ☆84 · Updated 2 years ago
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in Jax (Equinox framework) ☆187 · Updated 2 years ago
- Simple python template ☆41 · Updated last year
- Easy Hypernetworks in Pytorch and Jax ☆100 · Updated 2 years ago
- Simple and efficient RevNet-Library for PyTorch with XLA and DeepSpeed support and parameter offload ☆127 · Updated 2 years ago
- CLOOB Conditioned Latent Diffusion training and inference code ☆112 · Updated 3 years ago
- Optimized library for large-scale extraction of frames and audio from video. ☆205 · Updated last year
- Another attempt at a long-context / efficient transformer by me ☆38 · Updated 3 years ago
- ☆150 · Updated last year
- LoRA for arbitrary JAX models and functions ☆135 · Updated last year
- A new plug-and-play method for controlling an existing generative model with conditioning attributes and their compositions. ☆73 · Updated 3 years ago
- Feed-forward VQGAN-CLIP model, where the goal is to eliminate the need for optimizing the latent space of VQGAN for each input prompt ☆137 · Updated last year
- HomebrewNLP in JAX flavour for maintainable TPU training ☆50 · Updated last year
- JAX implementation of VQGAN ☆93 · Updated 2 years ago
- A GPT, made only of MLPs, in Jax ☆58 · Updated 3 years ago
- Refactoring dalle-pytorch and taming-transformers for TPU VM ☆60 · Updated 3 years ago
- Automatically take good care of your preemptible TPUs ☆36 · Updated 2 years ago
- ☆57 · Updated 2 years ago
- Implementation of Token Shift GPT - An autoregressive model that solely relies on shifting the sequence space for mixing ☆50 · Updated 3 years ago
- Train vision models using JAX and 🤗 transformers ☆97 · Updated last month