EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc in JAX w/ Flax Linen and Objax
☆130 · Jan 4, 2024 · Updated 2 years ago
Alternatives and similar repositories for efficientnet-jax
Users that are interested in efficientnet-jax are comparing it to the libraries listed below.
- ☆775 · Jan 27, 2024 · Updated 2 years ago
- ☆13 · Aug 17, 2020 · Updated 5 years ago
- ImageNet-12k subset of ImageNet-21k (fall11) · ☆22 · Jun 13, 2023 · Updated 2 years ago
- ☆91 · Sep 19, 2022 · Updated 3 years ago
- Unofficial JAX implementations of deep learning research papers · ☆162 · Jun 25, 2022 · Updated 3 years ago
- Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax). · ☆119 · Jun 5, 2022 · Updated 3 years ago
- Pretrained deep learning models for Jax/Flax: StyleGAN2, GPT2, VGG, ResNet, etc. · ☆265 · Mar 21, 2025 · Updated last year
- Implementation of Vision Transformers in Flax · ☆18 · Oct 12, 2020 · Updated 5 years ago
- Anytime Dense Prediction with Confidence Adaptivity (ICLR 2022) · ☆51 · Aug 23, 2024 · Updated last year
- Implementation of numerous Vision Transformers in Google's JAX and Flax. · ☆22 · Aug 30, 2022 · Updated 3 years ago
- A GPT, made only of MLPs, in Jax · ☆59 · Jun 23, 2021 · Updated 4 years ago
- Contrastive Language-Image Pretraining · ☆146 · Sep 6, 2022 · Updated 3 years ago
- JMP is a Mixed Precision library for JAX. · ☆212 · Jan 30, 2025 · Updated last year
- A Pytree Module system for Deep Learning in JAX · ☆212 · Feb 26, 2023 · Updated 3 years ago
- JAX-based neural network library · ☆3,210 · Mar 31, 2026 · Updated 2 weeks ago
- Unofficial implementation of NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis, using Flax with the Linen API · ☆13 · Sep 25, 2021 · Updated 4 years ago
- Optax is a gradient processing and optimization library for JAX. · ☆2,232 · Apr 3, 2026 · Updated last week
- CLU lets you write beautiful training loops in JAX. · ☆368 · Mar 3, 2026 · Updated last month
- A PyTorch baseline defense example for the NIPS 2017 adversarial competition · ☆11 · Aug 3, 2017 · Updated 8 years ago
- Contains code for the paper "Vision Transformers are Robust Learners" (AAAI 2022). · ☆122 · Dec 3, 2022 · Updated 3 years ago
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in Jax (Equinox framework) · ☆190 · Jun 24, 2022 · Updated 3 years ago
- Simple, extensible implementations of some meta-learning algorithms in Jax · ☆11 · Oct 6, 2020 · Updated 5 years ago
- 🧀 PyTorch code for the Fromage optimiser. · ☆128 · Jul 14, 2024 · Updated last year
- Implementation of the Remixer Block from the Remixer paper, in PyTorch · ☆36 · Sep 27, 2021 · Updated 4 years ago
- Models and code for the ICLR 2020 workshop paper "Towards Understanding Normalization in Neural ODEs" · ☆16 · Apr 27, 2020 · Updated 5 years ago
- 1st Place Solution to ECCV-TAO-2020: Detect and Represent Any Object for Tracking · ☆82 · Mar 16, 2021 · Updated 5 years ago
- ☆20 · Nov 8, 2022 · Updated 3 years ago
- JAX - A curated list of resources https://github.com/google/jax · ☆2,082 · Jan 20, 2026 · Updated 2 months ago
- ☆25 · May 3, 2024 · Updated last year
- functorch is JAX-like composable function transforms for PyTorch. · ☆1,436 · Aug 21, 2025 · Updated 7 months ago
- Supplementary code for the paper "Meta-Solver for Neural Ordinary Differential Equations" https://arxiv.org/abs/2103.08561 · ☆25 · Mar 30, 2021 · Updated 5 years ago
- Flax is a neural network library for JAX that is designed for flexibility. · ☆7,152 · Updated this week
- The official implementation of the paper "Drop-Activation: Implicit Parameter Reduction and Harmonious Regularization". · ☆10 · May 30, 2019 · Updated 6 years ago
- 100 exercises to learn JAX · ☆607 · Jun 11, 2022 · Updated 3 years ago
- Training code of the Cornell Birdcall Identification Challenge 6th place solution · ☆50 · Oct 12, 2020 · Updated 5 years ago
- Swarm training framework using Haiku + JAX + Ray for layer-parallel transformer language models on unreliable, heterogeneous nodes · ☆242 · May 12, 2023 · Updated 2 years ago
- Contains my experiments with the `big_vision` repo to train ViTs on ImageNet-1k. · ☆22 · Jan 16, 2023 · Updated 3 years ago
- Pretrained models for Jax/Haiku: MobileNet, ResNet, VGG, Xception. · ☆24 · Apr 22, 2022 · Updated 3 years ago
- Pretrained EfficientNet, EfficientNet-Lite, MixNet, MobileNetV3 / V2, MNASNet A1 and B1, FBNet, Single-Path NAS · ☆1,584 · Jun 13, 2024 · Updated last year