Zasder3 / open_clip_juwels
An open source implementation of CLIP.
☆32 · Updated 2 years ago
Alternatives and similar repositories for open_clip_juwels
Users interested in open_clip_juwels are comparing it to the libraries listed below.
- Another attempt at a long-context / efficient transformer by me ☆38 · Updated 3 years ago
- ☆27 · Updated 4 years ago
- Contains my experiments with the `big_vision` repo to train ViTs on ImageNet-1k. ☆22 · Updated 2 years ago
- Implementation of "compositional attention" from MILA, a multi-head attention variant that is reframed as a two-step attention process wi… ☆51 · Updated 3 years ago
- CLIP retrieval benchmark ☆17 · Updated 3 years ago
- A GPT, made only of MLPs, in JAX ☆58 · Updated 3 years ago
- Implementation of Kronecker Attention in PyTorch ☆19 · Updated 4 years ago
- Implementation / replication of DALL-E, OpenAI's text-to-image transformer, in PyTorch ☆59 · Updated 4 years ago
- Implementation of the Remixer block from the Remixer paper, in PyTorch ☆36 · Updated 3 years ago
- (unofficial) Customized fork of DETR, optimized for intelligent object detection on real-world custom datasets ☆12 · Updated 4 years ago
- ☆21 · Updated 4 years ago
- A dashboard for exploring timm learning rate schedulers ☆19 · Updated 6 months ago
- ImageNet-12k subset of ImageNet-21k (fall11) ☆21 · Updated last year
- A non-JIT implementation / replication of OpenAI's CLIP in PyTorch ☆34 · Updated 4 years ago
- A short article showing how to load PyTorch models with linear memory consumption ☆34 · Updated 2 years ago
- Implementation of a Transformer using ReLA (Rectified Linear Attention) from https://arxiv.org/abs/2104.07012 ☆50 · Updated 3 years ago
- GAN models implemented with PyTorch Lightning and Hydra configuration ☆34 · Updated 2 years ago
- Implementation of Multistream Transformers in PyTorch