pentagonalize / Transformer-Cookbook
☆13 · Updated 3 months ago
Alternatives and similar repositories for Transformer-Cookbook:
Users who are interested in Transformer-Cookbook are comparing it to the libraries listed below.
- Minimum Description Length Recurrent Neural Networks · ☆18 · Updated last year
- ☆11 · Updated 5 years ago
- Silly twitter torch implementations · ☆46 · Updated 2 years ago
- ☆13 · Updated last year
- This repository holds the code for the NeurIPS 2022 paper "Semantic Probabilistic Layers" · ☆27 · Updated last year
- A scalable abstraction learning library · ☆78 · Updated last year
- ☆43 · Updated 2 years ago
- EMNLP 2020: On the Ability and Limitations of Transformers to Recognize Formal Languages · ☆23 · Updated 4 years ago
- Language-annotated Abstraction and Reasoning Corpus · ☆86 · Updated last year
- General-purpose program synthesiser · ☆45 · Updated 6 months ago
- ☆61 · Updated 4 months ago
- An environment for learning formal mathematical reasoning from scratch · ☆66 · Updated 8 months ago
- Neural Networks and the Chomsky Hierarchy · ☆206 · Updated last year
- NaturalProofs: Mathematical Theorem Proving in Natural Language (NeurIPS 2021 Datasets & Benchmarks) · ☆129 · Updated 2 years ago
- ☆34 · Updated 4 months ago
- Implementation of the RASP transformer programming language (https://arxiv.org/pdf/2106.06981.pdf) · ☆53 · Updated 3 years ago
- A library for research in unnatural language semantics · ☆11 · Updated this week
- ☆29 · Updated 3 years ago
- Codebase for VAEL: Bridging Variational Autoencoders and Probabilistic Logic Programming · ☆20 · Updated last year
- Code for the paper "The Surprising Computational Power of Nondeterministic Stack RNNs" (DuSell and Chiang, 2023) · ☆18 · Updated last year
- Python package for Sentential Decision Diagrams (SDD) · ☆59 · Updated 2 months ago
- A set of tools for analyzing languages via logic and automata · ☆24 · Updated 2 weeks ago
- Mechanistic Interpretability for Transformer Models · ☆50 · Updated 2 years ago
- Materials for the ConceptARC paper · ☆92 · Updated 6 months ago
- ☆28 · Updated last year
- NaturalProver: Grounded Mathematical Proof Generation with Language Models · ☆37 · Updated 2 years ago
- ☆13 · Updated last month
- How do transformer LMs encode relations? · ☆48 · Updated last year
- ☆28 · Updated 4 months ago
- Minimum Description Length Recurrent Neural Networks (MDLRNNs) in PyTorch · ☆21 · Updated this week