sssingh / mnist-digit-generation-gan
A Generative Adversarial Network (GAN) trained on the MNIST dataset, capable of generating fake but realistic-looking digit images that appear to be drawn from the original dataset.
☆9 · Updated last year
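The core adversarial training loop behind a repo like this can be sketched in a few lines of PyTorch. This is a minimal illustrative sketch, not the repository's actual code: all layer sizes, names, and hyperparameters here are assumptions.

```python
import torch
import torch.nn as nn

latent_dim = 64  # illustrative noise dimension, not taken from the repo

# Generator: noise vector -> flattened 28x28 image in [-1, 1]
generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 28 * 28), nn.Tanh(),
)
# Discriminator: flattened image -> raw real/fake logit
discriminator = nn.Sequential(
    nn.Linear(28 * 28, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

bce = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(real_images):
    """One adversarial update: discriminator first, then generator."""
    batch = real_images.size(0)
    noise = torch.randn(batch, latent_dim)
    fake_images = generator(noise)

    # Discriminator: push real images toward label 1, fakes toward 0.
    opt_d.zero_grad()
    d_loss = (bce(discriminator(real_images), torch.ones(batch, 1)) +
              bce(discriminator(fake_images.detach()), torch.zeros(batch, 1)))
    d_loss.backward()
    opt_d.step()

    # Generator: try to make the discriminator output 1 for fakes.
    opt_g.zero_grad()
    g_loss = bce(discriminator(fake_images), torch.ones(batch, 1))
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()

real = torch.rand(8, 28 * 28) * 2 - 1  # stand-in batch in [-1, 1]
d, g = train_step(real)
```

In practice `real` would come from a `DataLoader` over MNIST, normalized to match the generator's `Tanh` output range.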
Alternatives and similar repositories for mnist-digit-generation-gan:
Users interested in mnist-digit-generation-gan are comparing it to the libraries listed below.
- Notes on quantization in neural networks ☆66 · Updated last year
- A curated list of resources for learning and exploring Triton, OpenAI's programming language for writing efficient GPU code. ☆175 · Updated this week
- ☆110 · Updated 3 weeks ago
- Diffusion Reading Group at EleutherAI ☆317 · Updated last year
- 94% on CIFAR-10 in 2.6 seconds 💨 96% in 27 seconds ☆197 · Updated 2 months ago
- LoRA and DoRA from Scratch Implementations ☆195 · Updated 10 months ago
- NeurIPS Large Language Model Efficiency Challenge: 1 LLM + 1 GPU + 1 Day ☆255 · Updated last year
- ☆140 · Updated 11 months ago
- This repo contains my solutions to "Introduction to Machine Learning Interviews" by Chip Huyen. ☆144 · Updated 6 months ago
- List of papers related to neural network quantization in recent AI conferences and journals. ☆516 · Updated last month
- Annotated version of the Mamba paper ☆471 · Updated 11 months ago
- A library for researching neural network compression and acceleration methods. ☆139 · Updated 5 months ago
- A simple byte pair encoding (BPE) mechanism for tokenization, written purely in C ☆122 · Updated 2 months ago
- This handbook helps demystify the PhD admission process in Computer Science at US universities. ☆63 · Updated last week
- 🎓 Advice and resources for thriving and surviving graduate school ☆347 · Updated 3 months ago
- Building blocks for foundation models. ☆440 · Updated last year
- PyTorch Lightning implementation of the paper Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and H… ☆27 · Updated last month
- ☆301 · Updated this week
- LORA: Low-Rank Adaptation of Large Language Models implemented using PyTorch ☆95 · Updated last year
- Implementation of Diffusion Transformer (DiT) in JAX ☆261 · Updated 7 months ago
- ☆413 · Updated 3 months ago
- ☆296 · Updated 7 months ago
- UNet diffusion model in pure CUDA ☆596 · Updated 7 months ago
- Everything you want to know about Google Cloud TPU ☆506 · Updated 6 months ago
- An implementation of the transformer architecture as an Nvidia CUDA kernel ☆167 · Updated last year
- Documentation, notes, links, etc. for streams. ☆77 · Updated 11 months ago
- Creating a diffusion model from scratch in PyTorch to learn exactly how they work. ☆344 · Updated 8 months ago
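One of the entries above is a byte pair encoding tokenizer in C. The core BPE idea (repeatedly merge the most frequent adjacent symbol pair) can be sketched in a few lines of Python. This is a toy illustration under my own assumptions, not that repository's implementation:

```python
from collections import Counter

def bpe_merges(word, num_merges):
    """Toy BPE: repeatedly merge the most frequent adjacent symbol pair."""
    symbols = list(word)
    merges = []
    for _ in range(num_merges):
        pairs = Counter(zip(symbols, symbols[1:]))
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent adjacent pair
        merges.append(best)
        merged, i = [], 0
        while i < len(symbols):
            # Replace each occurrence of the best pair with its concatenation.
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                merged.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                merged.append(symbols[i])
                i += 1
        symbols = merged
    return symbols, merges
```

A real tokenizer learns merges over a whole corpus and stores them as a ranked merge table; this sketch only shows the merge loop on a single string.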