PrimeIntellect-ai / prime
Official CLI and Python SDK for Prime Intellect - access GPU compute, remote sandboxes, RL environments, and distributed training infrastructure for AI development at scale.
☆106 · Updated this week
Alternatives and similar repositories for prime
Users interested in prime are comparing it to the libraries listed below.
- Plotting (entropy, varentropy) for small LMs ☆98 · Updated 5 months ago
- look how they massacred my boy ☆63 · Updated last year
- Training-Ready RL Environments + Evals ☆164 · Updated this week
- A tree-based prefix cache library that allows rapid creation of looms: hierarchical branching pathways of LLM generations. ☆72 · Updated 9 months ago
- SIMD quantization kernels ☆92 · Updated 2 months ago
- peer-to-peer compute and intelligence network that enables decentralized AI development at scale ☆131 · Updated this week
- ☆89 · Updated 9 months ago
- MLX port of xjdr's entropix sampler (mimics the JAX implementation) ☆62 · Updated last year
- ☆14 · Updated 6 months ago
- Super basic implementation (gist-like) of RLMs with REPL environments. ☆242 · Updated 3 weeks ago
- Modify Entropy Based Sampling to work with Mac Silicon via MLX ☆49 · Updated last year
- ☆135 · Updated 7 months ago
- EXO Gym is an open-source Python toolkit that facilitates distributed AI research. ☆84 · Updated 2 months ago
- Modded vLLM to run pipeline parallelism over public networks ☆39 · Updated 5 months ago
- explore token trajectory trees on instruct and base models ☆148 · Updated 5 months ago
- ☆40 · Updated last year
- an open-source reproduction of NVIDIA's nGPT (Normalized Transformer with Representation Learning on the Hypersphere) ☆107 · Updated 8 months ago
- they've simulated websites, worlds, and imaginary CLIs... but what if they simulated *you*? ☆126 · Updated last month
- ☆68 · Updated 5 months ago
- A simple MLX implementation for pretraining LLMs on Apple Silicon. ☆84 · Updated 2 months ago
- Train your own SOTA deductive reasoning model ☆108 · Updated 8 months ago
- Approximating the joint distribution of language models via MCTS ☆22 · Updated last year
- train entropix like a champ! ☆20 · Updated last year
- Synthetic data derived by templating, few-shot prompting, transformations on public domain corpora, and Monte Carlo tree search. ☆32 · Updated last month
- Claude Deep Research config for Claude Code. ☆224 · Updated 7 months ago
- Letting Claude Code develop his own MCP tools :) ☆123 · Updated 8 months ago
- An MCP Server that's also an MCP Client. Useful for letting Claude develop and test MCPs without needing to reset the application. ☆124 · Updated 8 months ago
- j1-micro (1.7B) & j1-nano (600M) are absurdly tiny but mighty reward models. ☆98 · Updated 3 months ago
- Automated Capability Discovery via Foundation Model Self-Exploration ☆65 · Updated 9 months ago
- smol models are fun too ☆93 · Updated last year
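Several of the entries above (the entropix ports and the entropy/varentropy plotting tool) revolve around entropy-based sampling, which decides how to sample a next token based on the entropy and varentropy (variance of surprisal) of the model's next-token distribution. As a minimal sketch of the two quantities involved (the helper name `entropy_varentropy` is hypothetical, not from any of these repos):

```python
import math

def entropy_varentropy(probs):
    """Return (entropy, varentropy) of a next-token probability
    distribution given as a list of probabilities summing to 1."""
    # Surprisal of each token with nonzero probability: -log p
    ps = [p for p in probs if p > 0]
    surprisals = [-math.log(p) for p in ps]
    # Entropy: expected surprisal under the distribution
    entropy = sum(p * s for p, s in zip(ps, surprisals))
    # Varentropy: variance of surprisal around the entropy
    varentropy = sum(p * (s - entropy) ** 2 for p, s in zip(ps, surprisals))
    return entropy, varentropy

# A uniform distribution over 4 tokens has entropy log(4) and
# varentropy 0 (every token is equally surprising).
h, v = entropy_varentropy([0.25, 0.25, 0.25, 0.25])
```

Samplers in this family typically branch on these values, for example sampling greedily when both are low and injecting exploration (or a "pause" token) when entropy is high.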