katsumiok / pyaskit
AskIt: Unified programming interface for programming with LLMs (GPT-3.5, GPT-4, Gemini, Claude, Cohere, Llama 2)
☆80 · Updated last year
Alternatives and similar repositories for pyaskit
Users interested in pyaskit are comparing it to the libraries listed below.
- ReLM is a Regular Expression engine for Language Models ☆107 · Updated 2 years ago
- Tutorial to get started with SkyPilot! ☆58 · Updated last year
- Code for the paper "CodeTree: Agent-guided Tree Search for Code Generation with Large Language Models" ☆30 · Updated 9 months ago
- ☆28 · Updated 9 months ago
- Pre-training code for CrystalCoder 7B LLM ☆57 · Updated last year
- [ICLR 2024] Skeleton-of-Thought: Prompting LLMs for Efficient Parallel Generation ☆184 · Updated last year
- Open Implementations of LLM Analyses ☆107 · Updated last year
- Estimating hardware and cloud costs of LLMs and transformer projects ☆20 · Updated last week
- [ICML 2023] "Outline, Then Details: Syntactically Guided Coarse-To-Fine Code Generation", Wenqing Zheng, S P Sharan, Ajay Kumar Jaiswal, … ☆43 · Updated 2 years ago
- Official repo for the NAACL 2024 Findings paper "LeTI: Learning to Generate from Textual Interactions" ☆66 · Updated 2 years ago
- vLLM: A high-throughput and memory-efficient inference and serving engine for LLMs ☆93 · Updated this week
- Formal-LLM: Integrating Formal Language and Natural Language for Controllable LLM-based Agents ☆132 · Updated last year
- Code repository for the paper "AdANNS: A Framework for Adaptive Semantic Search" ☆66 · Updated 2 years ago
- Data preparation code for CrystalCoder 7B LLM ☆45 · Updated last year
- Experiments to assess SPADE on different LLM pipelines ☆17 · Updated last year
- ☆85 · Updated 2 years ago
- Code for the paper "ROUTERBENCH: A Benchmark for Multi-LLM Routing System" ☆153 · Updated last year
- [ICML '24] R2E: Turn any GitHub Repository into a Programming Agent Environment ☆138 · Updated 9 months ago
- LLM Optimize is a proof-of-concept library for LLM (large language model) guided blackbox optimization ☆61 · Updated 2 years ago
- ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward exp… ☆226 · Updated 4 months ago
- ToK aka Tree of Knowledge for Large Language Models LLM. It's a novel dataset that inspires knowledge symbolic correlation in simple inpu… ☆55 · Updated 2 years ago
- Official code for "SWARM Parallelism: Training Large Models Can Be Surprisingly Communication-Efficient" ☆148 · Updated 2 years ago
- Repository for CPU Kernel Generation for LLM Inference ☆27 · Updated 2 years ago
- Advanced Reasoning Benchmark Dataset for LLMs ☆47 · Updated 2 years ago
- AI Evaluation Platform ☆47 · Updated 7 months ago
- Benchmark suite for LLMs from Fireworks.ai ☆85 · Updated last week
- ☆74 · Updated 2 years ago
- ☆83 · Updated last year
- ☆120 · Updated last year
- Bamboo-7B Large Language Model ☆93 · Updated last year