sdascoli / boolformer
☆164 · Updated last year
Alternatives and similar repositories for boolformer
Users interested in boolformer are comparing it to the repositories listed below.
- Evaluation of neuro-symbolic engines ☆40 · Updated last year
- Brain-Inspired Modular Training (BIMT), a method for making neural networks more modular and interpretable. ☆174 · Updated 2 years ago
- ☆69 · Updated last year
- Learning Universal Predictors ☆81 · Updated last year
- Swarming algorithms like PSO, Ant Colony, Sakana, and more in PyTorch 😊 ☆134 · Updated last month
- Automatic gradient descent ☆215 · Updated 2 years ago
- ☆82 · Updated last year
- Examining how large language models (LLMs) perform across various synthetic regression tasks when given (input, output) examples in their… ☆157 · Updated last month
- ☆53 · Updated last year
- Code to reproduce "Transformers Can Do Arithmetic with the Right Embeddings", McLeish et al. (NeurIPS 2024) ☆195 · Updated last year
- Repo for solving ARC problems with a Neural Cellular Automaton ☆22 · Updated 6 months ago
- Repository for code used in the xVal paper ☆145 · Updated last year
- ☆53 · Updated last year
- ☆231 · Updated last week
- Explorations into the proposal from the paper "Grokfast: Accelerated Grokking by Amplifying Slow Gradients" ☆103 · Updated 11 months ago
- ☆105 · Updated 11 months ago
- Code for the paper "What's the Magic Word? A Control Theory of LLM Prompting" ☆110 · Updated last year
- The GeoV model is a large language model designed by Georges Harik and uses Rotary Positional Embeddings with Relative distances (RoPER).… ☆121 · Updated 2 years ago
- ☆62 · Updated 2 years ago
- σ-GPT: A New Approach to Autoregressive Models ☆70 · Updated last year
- ☆72 · Updated last year
- gzip Predicts Data-dependent Scaling Laws ☆34 · Updated last year
- Q-Probe: A Lightweight Approach to Reward Maximization for Language Models ☆41 · Updated last year
- Code repository for Black Mamba ☆260 · Updated last year
- Memoria is a human-inspired memory architecture for neural networks. ☆78 · Updated last year
- JAX codebase for Evolutionary Strategies at the Hyperscale ☆181 · Updated 3 weeks ago
- Predicting the Future of AI with AI: High-quality link prediction in an exponentially growing knowledge network ☆79 · Updated 2 years ago
- Functional Benchmarks and the Reasoning Gap ☆90 · Updated last year
- ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward exp… ☆226 · Updated 2 months ago
- ☆105 · Updated 4 months ago