google-deepmind / neural_networks_solomonoff_induction
Learning Universal Predictors
☆81 · Updated last year
Alternatives and similar repositories for neural_networks_solomonoff_induction
Users interested in neural_networks_solomonoff_induction are comparing it to the libraries listed below.
- A domain-specific probabilistic programming language for modeling and inference with language models ☆137 · Updated 7 months ago
- Evaluation of neuro-symbolic engines ☆40 · Updated last year
- Materials for ConceptARC paper ☆108 · Updated last year
- Probabilistic programming with large language models ☆145 · Updated 3 weeks ago
- Language-annotated Abstraction and Reasoning Corpus ☆98 · Updated 2 years ago
- Notebooks accompanying Anthropic's "Toy Models of Superposition" paper ☆130 · Updated 3 years ago
- The Energy Transformer block, in JAX ☆62 · Updated last year
- Latent Program Network (from the "Searching Latent Program Spaces" paper) ☆106 · Updated 2 weeks ago
- Harmonic Datasets ☆52 · Updated last year
- ☆105 · Updated 4 months ago
- ☆72 · Updated last year
- Neural theorem proving tutorial, version II ☆40 · Updated last year
- Code to reproduce "Transformers Can Do Arithmetic with the Right Embeddings", McLeish et al. (NeurIPS 2024) ☆195 · Updated last year
- Unofficial but Efficient Implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX ☆92 · Updated last year
- Brain-Inspired Modular Training (BIMT), a method for making neural networks more modular and interpretable ☆174 · Updated 2 years ago
- ☆53 · Updated last year
- ☆201 · Updated 3 months ago
- ☆62 · Updated last year
- Neural Networks and the Chomsky Hierarchy ☆211 · Updated last year
- ☆53 · Updated last year
- Implementation of the RASP transformer programming language (https://arxiv.org/pdf/2106.06981.pdf) ☆59 · Updated last month
- Universal Neurons in GPT2 Language Models ☆31 · Updated last year
- Code for minimum-entropy coupling ☆32 · Updated 2 weeks ago
- ☆105 · Updated 11 months ago
- ☆31 · Updated 2 years ago
- Code Release for "Broken Neural Scaling Laws" (BNSL) paper ☆59 · Updated 2 years ago
- Scaling is a distributed training library and installable dependency designed to scale up neural networks, with a dedicated module for tr… ☆66 · Updated 3 weeks ago
- Emergent world representations: Exploring a sequence model trained on a synthetic task ☆191 · Updated 2 years ago
- A MAD laboratory to improve AI architecture designs 🧪 ☆135 · Updated 11 months ago
- Code implementing "Efficient Parallelization of a Ubiquitous Sequential Computation" (Heinsen, 2023) ☆97 · Updated last year