MaxRobinsonTheGreat / mandelbrotnn
Torturing neural networks by forcing them to learn the Mandelbrot set.
☆132 · Updated last year
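The tagline above sums up the idea: sample points of the complex plane, label them by whether they stay bounded under z → z² + c, and fit a network to that labeling. Below is a minimal sketch of such a setup, assuming a plain PyTorch MLP and illustrative hyperparameters; it is not the repo's actual code.

```python
# Minimal sketch (not the repo's actual code): train a small MLP to
# approximate Mandelbrot-set membership from (x, y) coordinates.
# All layer sizes, sample counts, and step counts here are illustrative.
import torch
import torch.nn as nn

def mandelbrot_label(c: torch.Tensor, max_iter: int = 50) -> torch.Tensor:
    """Return 1.0 where c stays bounded after max_iter iterations of z -> z^2 + c."""
    z = torch.zeros_like(c)
    mask = torch.ones(c.shape[0], dtype=torch.bool)  # True while still bounded
    for _ in range(max_iter):
        z = torch.where(mask, z * z + c, z)          # stop updating escaped points
        mask = mask & (z.abs() <= 2.0)
    return mask.float()

# Sample random points from a rectangle around the set: x in [-2, 1], y in [-1.25, 1.25].
n = 20_000
xy = torch.rand(n, 2) * torch.tensor([3.0, 2.5]) + torch.tensor([-2.0, -1.25])
labels = mandelbrot_label(torch.complex(xy[:, 0], xy[:, 1]))

# A small fully connected network mapping (x, y) -> membership logit.
model = nn.Sequential(
    nn.Linear(2, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(xy).squeeze(-1), labels)
    loss.backward()
    opt.step()
    if step % 500 == 0:
        print(f"step {step}: loss {loss.item():.4f}")
```

Rendering the model's predictions over a dense grid then shows how closely (or how poorly) the network traces the fractal boundary, which is where the "torture" comes in.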
Related projects
Alternatives and complementary repositories for mandelbrotnn
- ☆69 · Updated last year
- Visualizing some of the internals of a neural network during training and inference. ☆70 · Updated 9 months ago
- A simplistic linear and multiprocessed approach to sentiment analysis using Gzip Normalized Compression Distances with k nearest neighbor… (see the sketch after this list) ☆142 · Updated last year
- Material for the Systems and Cognitive NeuroScience online course ☆119 · Updated 2 years ago
- My writings about ARC (Abstraction and Reasoning Corpus) ☆59 · Updated last week
- Simple Mandelbrot ☆58 · Updated 5 months ago
- Code used in creating YouTube videos ☆90 · Updated last year
- Training small GPT-2 style models using Kolmogorov-Arnold networks. ☆108 · Updated 5 months ago
- Some helpers and examples for creating an LLM fine-tuning dataset ☆63 · Updated 8 months ago
- 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. ☆109 · Updated 2 years ago
- Because tinygrad got out of hand with line count ☆146 · Updated last month
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆35 · Updated 4 years ago
- Variations of Kolmogorov-Arnold Networks ☆111 · Updated 6 months ago
- documentation for content creation ☆140 · Updated this week
- The Tensor (or Array) ☆411 · Updated 3 months ago
- Understanding Kolmogorov-Arnold Networks: A Tutorial Series on KAN using Toy Examples ☆166 · Updated last month
- Machine Learning library for educational purpose. ☆296 · Updated 5 months ago
- Repo where I recreate some popular machine learning models from scratch in Python ☆102 · Updated last month
- The Multilayer Perceptron Language Model ☆523 · Updated 3 months ago
- ☆32 · Updated 5 months ago
- Swarming algorithms like PSO, Ant Colony, Sakana, and more in PyTorch 😊 ☆111 · Updated last week
- The boundary of neural network trainability is fractal ☆161 · Updated 9 months ago
- ☆62 · Updated 4 months ago
- Domain Specific Language for the Abstraction and Reasoning Corpus ☆214 · Updated last month
- ☆34 · Updated last week
- Brain-Inspired Modular Training (BIMT), a method for making neural networks more modular and interpretable. ☆163 · Updated last year
- Simple, minimal implementation of the Mamba SSM in one pytorch file. Using logcumsumexp (Heisen sequence). ☆102 · Updated last month
- This is the source code for the animations in the series "Visualizing Deep Learning" ☆194 · Updated 5 months ago
- ☆105 · Updated last month
- Reverse Engineering the Abstraction and Reasoning Corpus ☆196 · Updated last month
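For the gzip-based entry above, a minimal sketch of Normalized Compression Distance plus k-nearest-neighbour voting might look like the following; the function names and toy data are illustrative assumptions, not that repo's code.

```python
# Sketch of gzip-based text classification: NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
# then a majority vote over the k nearest training texts. Illustrative only.
import gzip
from collections import Counter

def ncd(x: str, y: str) -> float:
    """Normalized Compression Distance using gzip-compressed lengths."""
    cx = len(gzip.compress(x.encode()))
    cy = len(gzip.compress(y.encode()))
    cxy = len(gzip.compress((x + y).encode()))
    return (cxy - min(cx, cy)) / max(cx, cy)

def knn_predict(query: str, train: list[tuple[str, str]], k: int = 3) -> str:
    """Label the query by majority vote over its k nearest training texts."""
    nearest = sorted(train, key=lambda pair: ncd(query, pair[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [
    ("I loved this movie, great acting", "pos"),
    ("Absolutely wonderful and fun", "pos"),
    ("Terrible plot and boring pacing", "neg"),
    ("I hated every minute of it", "neg"),
]
print(knn_predict("What a wonderful, fun film", train, k=3))
```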