RapidAI / Open-Llama
Complete training code for an open-source, high-performance Llama model, covering the full pipeline from pre-training to RLHF.
☆68 · Updated 2 years ago
Alternatives and similar repositories for Open-Llama
Users interested in Open-Llama are comparing it to the libraries listed below.
- ☆106 · Updated last year
- The newest version of Llama 3, with source code explained line by line in Chinese ☆22 · Updated last year
- A Python package to access world-class generative models ☆129 · Updated last year
- CodeGPT: A Code-Related Dialogue Dataset Generated by GPT and for GPT ☆114 · Updated 2 years ago
- ☆92 · Updated last year
- A summary of open-source large language models and low-cost ChatGPT replication methods ☆137 · Updated 2 years ago
- Baidu QA dataset of 1,000,000 entries ☆48 · Updated last year
- Evaluation for AI apps and agents ☆43 · Updated last year
- SuperCLUE Langya Board (琅琊榜): an anonymous head-to-head evaluation benchmark for general-purpose Chinese large language models ☆145 · Updated last year
- A light proxy solution for the Hugging Face Hub ☆47 · Updated last year
- A lightweight local website for displaying the performance of different chat models ☆87 · Updated last year
- Imitate OpenAI with local models ☆89 · Updated last year
- A survey of large language model training and serving ☆36 · Updated 2 years ago
- As the name suggests: a hand-rolled RAG implementation ☆127 · Updated last year
- ☆100 · Updated last year
- ☆194 · Updated 7 months ago
- An open-source chatbot built with ExpertPrompting that achieves 96% of ChatGPT's capability ☆300 · Updated 2 years ago
- ☆125 · Updated last year
- Gaokao Benchmark for AI ☆108 · Updated 3 years ago
- SearchGPT: building a quick conversation-based search engine with LLMs ☆47 · Updated 8 months ago
- ☆230 · Updated 2 years ago
- XVERSE-65B: a multilingual large language model developed by XVERSE Technology Inc. ☆140 · Updated last year
- The first Chinese Llama 2 13B model (base + Chinese dialogue SFT, enabling fluent multi-turn natural-language interaction) ☆91 · Updated 2 years ago
- ☆231 · Updated last year
- LLM Zoo collects information on various open- and closed-source LLMs ☆271 · Updated 2 years ago
- TianGong-AI-Unstructure ☆69 · Updated 2 months ago
- To try TextIn document parsing, visit https://cc.co/16YSIy ☆22 · Updated last year
- The official code for "Aurora: Activating chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning" ☆266 · Updated last year
- Deep learning ☆149 · Updated 4 months ago
- ChatGLM-6B-Slim: ChatGLM-6B with 20K image tokens pruned, identical performance with a smaller GPU memory footprint ☆127 · Updated 2 years ago