xNul / code-llama-for-vscode
Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot.
★ 569 · Updated 8 months ago
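The repository's role is to let the Continue extension talk to a Code Llama model served on your own machine instead of a hosted Copilot-style backend. As a rough, hedged illustration only (not code from this repository), the Python sketch below sends a chat request to a local server that exposes an OpenAI-compatible endpoint, such as llama.cpp's `llama-server`; the host, port, model name, and prompt are assumptions.

```python
# Minimal sketch: query a locally served Code Llama model through an
# OpenAI-compatible chat endpoint. This is the kind of local backend that
# editor tooling like Continue can be pointed at instead of a hosted service.
# The host, port, and model name below are assumptions, not values from the repo.
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",  # assumed llama-server default
    json={
        "model": "codellama",  # placeholder; local servers often ignore or map this
        "messages": [
            {"role": "user", "content": "Write a Python function that reverses a string."}
        ],
        "temperature": 0.2,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```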
Alternatives and similar repositories for code-llama-for-vscode:
Users who are interested in code-llama-for-vscode are comparing it to the repositories listed below.
- C++ implementation for StarCoder · ★ 453 · Updated last year
- Self-evaluating interview for AI coders · ★ 579 · Updated this week
- A llama.cpp drop-in replacement for OpenAI's GPT endpoints, allowing GPT-powered apps to run off local llama.cpp models instead of OpenAI… · ★ 598 · Updated last year
- An Autonomous LLM Agent that runs on Wizcoder-15B · ★ 335 · Updated 6 months ago
- INT4/INT5/INT8 and FP16 inference on CPU for RWKV language model · ★ 1,509 · Updated last month
- LLM powered development for VSCode · ★ 1,292 · Updated 9 months ago
- Python bindings for the Transformer models implemented in C/C++ using GGML library · ★ 1,859 · Updated last year
- Uses Auto-GPT with Llama.cpp · ★ 386 · Updated last year
- ggml implementation of BERT · ★ 487 · Updated last year
- The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs), allowing users to chat with LLM… · ★ 552 · Updated 2 months ago
- An AI assistant beyond the chat box · ★ 325 · Updated last year
- Make Llama 2 use code execution, debugging, code saving and reuse, and internet access · ★ 687 · Updated last year
- Customizable implementation of the self-instruct paper · ★ 1,043 · Updated last year
- MiniLLM is a minimal system for running modern LLMs on consumer-grade GPUs · ★ 903 · Updated last year
- ★ 1,468 · Updated last year
- Chat language model that can use tools and interpret the results