sdascoli / boolformer
☆162 · Updated last year
Alternatives and similar repositories for boolformer
Users interested in boolformer are comparing it to the libraries listed below.
- Learning Universal Predictors ☆73 · Updated 9 months ago
- ☆81 · Updated 4 months ago
- Evaluation of neuro-symbolic engines ☆35 · Updated 9 months ago
- A package for defining deep learning models using categorical algebraic expressions. ☆60 · Updated 9 months ago
- Genetics for Language Models ☆13 · Updated 10 months ago
- ☆30 · Updated last year
- Code to reproduce "Transformers Can Do Arithmetic with the Right Embeddings", McLeish et al. (NeurIPS 2024) ☆189 · Updated 11 months ago
- Brain-Inspired Modular Training (BIMT), a method for making neural networks more modular and interpretable. ☆170 · Updated last year
- History files recorded from human interactions while solving ARC tasks ☆109 · Updated 2 weeks ago
- Swarming algorithms like PSO, Ant Colony, Sakana, and more in PyTorch 😊 ☆122 · Updated last month
- ☆53 · Updated last year
- ☆81 · Updated last year
- Automatic gradient descent ☆207 · Updated last year
- ☆111 · Updated 4 months ago
- Functional Benchmarks and the Reasoning Gap ☆86 · Updated 7 months ago
- Code for Fooling Contrastive Language-Image Pre-trained Models with CLIPMasterPrints ☆15 · Updated 6 months ago
- PyTorch Implementation of the paper "A Neuro-vector-symbolic architecture for Solving Raven's Progressive Matrices", published in Nature Machine Intelligence ☆92 · Updated last year
- ☆61 · Updated last year
- ☆68 · Updated 9 months ago
- Materials for the ConceptARC paper ☆92 · Updated 6 months ago
- Repository for the paper "Stream of Search: Learning to Search in Language" ☆146 · Updated 3 months ago
- Q-Probe: A Lightweight Approach to Reward Maximization for Language Models ☆41 · Updated 11 months ago
- Neural Networks and the Chomsky Hierarchy ☆206 · Updated last year
- Explorations into the proposal from the paper "Grokfast: Accelerated Grokking by Amplifying Slow Gradients" ☆99 · Updated 4 months ago
- TART: A plug-and-play Transformer module for task-agnostic reasoning ☆197 · Updated last year
- Latent Program Network (from the "Searching Latent Program Spaces" paper) ☆82 · Updated 2 months ago
- ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts ☆220 · Updated last year
- Bootstrapping ARC ☆115 · Updated 5 months ago
- Certified Reasoning with Language Models ☆31 · Updated last year
- σ-GPT: A New Approach to Autoregressive Models ☆64 · Updated 9 months ago