smallcloudai / refact-lsp
Rust executable for Refact Agent. It lives inside your IDE, keeps the AST and VecDB indexes up to date, and offers agentic tools for an AI model to call. From the IDE's point of view, it works as an LSP server.
☆59 · Updated 2 months ago
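Because the executable presents itself as an LSP server, any editor or script that speaks JSON-RPC with Content-Length framing can talk to it. The sketch below is a minimal illustration, not taken from the repository's docs: it assumes a `refact-lsp` binary is available on PATH and speaks standard LSP over stdin/stdout (in practice the Refact IDE plugins launch and configure the binary themselves), and it uses only the Rust standard library.

```rust
// Minimal sketch: sending an LSP `initialize` request to refact-lsp over stdio.
// Assumptions (for illustration only): the binary is on PATH as `refact-lsp`
// and uses standard LSP framing (Content-Length header + JSON-RPC body).
use std::io::{BufRead, BufReader, Read, Write};
use std::process::{Command, Stdio};

fn main() -> std::io::Result<()> {
    // Spawn the server; the binary name/flags here are assumptions, not docs.
    let mut child = Command::new("refact-lsp")
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .spawn()?;

    let mut stdin = child.stdin.take().expect("child stdin");
    let stdout = child.stdout.take().expect("child stdout");
    let mut reader = BufReader::new(stdout);

    // Standard LSP `initialize` request, hand-written JSON to avoid dependencies.
    let body = r#"{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"processId":null,"rootUri":null,"capabilities":{}}}"#;
    write!(stdin, "Content-Length: {}\r\n\r\n{}", body.len(), body)?;
    stdin.flush()?;

    // Read the response headers, then the JSON payload of the announced length.
    let mut content_length = 0usize;
    loop {
        let mut line = String::new();
        reader.read_line(&mut line)?;
        let line = line.trim_end();
        if line.is_empty() {
            break; // blank line terminates the header section
        }
        if let Some(value) = line.strip_prefix("Content-Length:") {
            content_length = value.trim().parse().unwrap_or(0);
        }
    }
    let mut payload = vec![0u8; content_length];
    reader.read_exact(&mut payload)?;
    println!("initialize response: {}", String::from_utf8_lossy(&payload));

    let _ = child.kill();
    Ok(())
}
```

A real client would follow this with an `initialized` notification and then the usual document-sync and request traffic, which is exactly what the IDE plugins handle for you.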
Alternatives and similar repositories for refact-lsp
Users interested in refact-lsp are comparing it to the libraries listed below.
- Refact - Open-Source AI Agent, Code Generator & Chat for JavaScript, Python, TypeScript, Java, PHP, Go, and more. ☆104 · Updated this week
- Multi-language code navigation API in a container ☆76 · Updated last week
- Split code into semantic chunks ☆24 · Updated 7 months ago
- Using Large Language Models for Repo-wide Type Prediction ☆109 · Updated last year
- Create chatbot and AI agent workflows with unified access. ☆49 · Updated this week
- LLM finetuning ☆42 · Updated last year
- "Zero setup" & "Blazingly fast" general code file relationship analysis. With Python & Rust. Based on tree-sitter and git analysis. Suppo… ☆63 · Updated this week
- Refact AI: Open-source AI Code assistant with autocompletion, chat, refactoring and more for IntelliJ JetBrains IDEs ☆60 · Updated this week
- Python module that creates a context map for AI code generation ☆22 · Updated 9 months ago
- Super-simple, fully Rust powered "memory" (doc store + semantic search) for LLM projects, semantic search, etc. ☆59 · Updated last year
- proof-of-concept of Cursor's Instant Apply feature ☆80 · Updated 8 months ago
- An LLM-powered (CodeLlama or OpenAI) local diff code review tool. ☆37 · Updated 9 months ago
- Built for demanding AI workflows, this gateway offers low-latency, provider-agnostic access, ensuring your AI applications run smoothly a… ☆64 · Updated 2 months ago
- The Self-Hosted Solution for Running AI-Generated Code Securely ☆58 · Updated this week
- Langchain Agent utilizing OpenAI Function Calls to execute Git commands using Natural Language ☆44 · Updated last year
- A new benchmark for measuring LLM's capability to detect bugs in large codebase. ☆30 · Updated 11 months ago
- Contains the prompts we use to talk to various LLMs for different utilities inside the editor ☆76 · Updated last year
- TensorRT-LLM server with Structured Outputs (JSON) built with Rust ☆52 · Updated 3 weeks ago
- 🛤️ Pathik - High-Performance Web Crawler ⚡ ☆26 · Updated last month
- Python client for accessing the turbopuffer API. ☆47 · Updated this week
- ☆55 · Updated 5 months ago
- ☆29 · Updated last year
- GPU accelerated client-side embeddings for vector search, RAG etc. ☆66 · Updated last year
- Run AI generated code in isolated sandboxes ☆71 · Updated 3 months ago
- A multi-language source code analyzer and docstrings parser ☆64 · Updated last year
- A list of flow functions ☆36 · Updated last year
- A high-performance constrained decoding engine based on context free grammar in Rust ☆51 · Updated 4 months ago
- Light WebUI for lm.rs ☆23 · Updated 7 months ago
- LLama implementations benchmarking framework ☆12 · Updated last year
- The DPAB-α Benchmark ☆21 · Updated 4 months ago