LAION-AI / CLIP_benchmark
CLIP-like model evaluation
☆785 · Updated last week
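CLIP_benchmark provides evaluation of CLIP-like models on tasks such as zero-shot classification, zero-shot retrieval, and linear probing. As a rough illustration of the kind of zero-shot scoring it automates, here is a minimal sketch using open_clip (one of the model sources the benchmark supports); the model name `ViT-B-32`, the pretrained tag `laion2b_s34b_b79k`, the image path, and the prompt strings are illustrative assumptions, not taken from this listing.

```python
# Minimal zero-shot classification sketch with open_clip (illustrative only).
# Model name, pretrained tag, image path, and prompts are assumptions for the example.
import torch
import open_clip
from PIL import Image

model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-B-32", pretrained="laion2b_s34b_b79k"
)
tokenizer = open_clip.get_tokenizer("ViT-B-32")
model.eval()

image = preprocess(Image.open("example.jpg")).unsqueeze(0)   # 1 x 3 x H x W tensor
text = tokenizer(["a photo of a cat", "a photo of a dog"])   # class prompts

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    # Cosine similarity between the image and each prompt, softmaxed into class probabilities
    image_features = image_features / image_features.norm(dim=-1, keepdim=True)
    text_features = text_features / text_features.norm(dim=-1, keepdim=True)
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

print(probs)  # e.g. tensor([[0.98, 0.02]]) if the image shows a cat
```

CLIP_benchmark runs this kind of loop over entire datasets and reports aggregate metrics; consult the repository README for its exact CLI and supported datasets.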
Alternatives and similar repositories for CLIP_benchmark
Users interested in CLIP_benchmark are comparing it to the repositories listed below
- DataComp: In search of the next generation of multimodal datasets ☆745 · Updated 6 months ago
- Robust fine-tuning of zero-shot models ☆748 · Updated 3 years ago
- Implementation of 🦩 Flamingo, state-of-the-art few-shot visual question answering attention net out of DeepMind, in PyTorch ☆1,266 · Updated 3 years ago
- Supervision Exists Everywhere: A Data Efficient Contrastive Language-Image Pre-training Paradigm ☆669 · Updated 3 years ago
- GIT: A Generative Image-to-text Transformer for Vision and Language ☆575 · Updated last year
- Implementation of CoCa, Contrastive Captioners are Image-Text Foundation Models, in PyTorch ☆1,180 · Updated last year
- A PyTorch Lightning solution to training OpenAI's CLIP from scratch ☆715 · Updated 3 years ago
- 🧀 Code and models for the ICML 2023 paper "Grounding Language Models to Images for Multimodal Inputs and Outputs" ☆483 · Updated 2 years ago
- ICLR 2024 Spotlight: curation/training code, metadata, distribution and pre-trained models for MetaCLIP; CVPR 2024: MoDE: CLIP Data Expert… ☆1,704 · Updated last month
- [NeurIPS 2023] This repository includes the official implementation of our paper "An Inverse Scaling Law for CLIP Training" ☆319 · Updated last year
- When do we not need larger vision models? ☆412 · Updated 9 months ago
- [CVPR 2022] Official code for "Unified Contrastive Learning in Image-Text-Label Space" ☆402 · Updated 2 years ago
- A concise but complete implementation of CLIP with various experimental improvements from recent papers ☆718 · Updated 2 years ago
- A method to increase the speed and lower the memory footprint of existing vision transformers ☆1,115 · Updated last year
- Conceptual 12M is a dataset containing (image-URL, caption) pairs collected for vision-and-language pre-training ☆405 · Updated 4 months ago
- Code release for SLIP: Self-supervision meets Language-Image Pre-training