enze5088 / Chatterbox
Chinese large language model
☆117 · Updated last year
Alternatives and similar repositories for Chatterbox:
Users who are interested in Chatterbox are comparing it to the libraries listed below
- Code and data for crosstalk text generation tasks, exploring whether large models and pre-trained language models can understand humor. … ☆144 · Updated 2 years ago
- Chinese Llama 2, from pre-training through reinforcement learning ☆88 · Updated last year
- Outbound follow-up call bot for the insurance industry ☆62 · Updated 2 years ago
- The easiest zero-barrier ChatGLM3 & agent & LangChain project ☆220 · Updated last year
- "Intel Innovation Masters Cup" Deep Learning Challenge, Track 3: CCKS2021 Chinese NLP address relevance task ☆145 · Updated 2 years ago
- This tool (enhance_long) aims to enhance the Llama 2 long-context extrapolation capability in the lowest-cost approach, preferably without … ☆45 · Updated last year
- RASA Chinese task-oriented chatbot ☆98 · Updated 4 months ago
- CCKS 2021: "SGSum: A Human-Annotated Dataset for Sports Event Summarization" ☆21 · Updated 3 years ago
- [EMNLP 2023] FreeAL: Towards Human-Free Active Learning in the Era of Large Language Models ☆86 · Updated last year
- ☆50 · Updated last year
- GoGPT: Chinese-English enhanced large model trained on Llama/Llama 2 | Chinese-Llama2 ☆78 · Updated last year
- [ACL 2024] User-friendly evaluation framework: Eval Suite & Benchmarks: UHGEval, HaluEval, HalluQA, etc. ☆161 · Updated 4 months ago
- Train an LLM from scratch with DeepSpeed, going through the pretrain and SFT stages, to verify the LLM's ability to learn knowledge, understand language, and answer questions ☆147 · Updated 8 months ago
- ☆54 · Updated last year
- Deploy Qwen1.5 with the vLLM framework and stream its output (see the streaming sketch after this list) ☆91 · Updated 11 months ago
- Named entity recognition (NER) with BERT and RoBERTa ☆91 · Updated 3 years ago
- Notes on multi-hop reading comprehension and open-domain question answering ☆85 · Updated 2 years ago
- Text classification with BERT, RoBERTa, ERNIE, and other methods ☆88 · Updated last year
- DeepRetrieval - Hacking 🔥 Real Search Engines and Text/Data Retrievers with LLM + RL ☆196 · Updated this week
- Improve Llama-2's proficiency in comprehension, generation, and translation of Chinese. ☆450 · Updated last year
- A NanoGPT implementation written in PyTorch 2.0 around the time torch 2.0 was released; faster and simpler, and a good tutorial for learning GPT. ☆52 · Updated last year
- A pretraining-based sentence embedding generation tool ☆136 · Updated last year
- Focused on Chinese domain-specific large language models: applying them to a particular industry or field to become an industry-level or company-level domain model. ☆116 · Updated 3 weeks ago
- Fine-tuning the Qwen1.5-0.5B-Chat model for general information extraction, aiming to: verify how the generative approach compares with extractive NER; give beginners a simple fine-tuning workflow with as little code as possible; and cover data-format handling for LLM training. ☆11 · Updated 6 months ago
- Train a Chinese vocabulary with BPE in sentencepiece and use it in transformers (see the tokenizer sketch after this list). ☆116 · Updated last year
- Fine-tune Chinese large language models with QLoRA, covering ChatGLM, Chinese-LLaMA-Alpaca, and BELLE ☆85 · Updated last year
- A private NLP coding package that quickly implements SOTA solutions. ☆295 · Updated 2 years ago
- ChatGLM-6B fine-tuning. ☆135 · Updated last year
- Official code for the ACL 2023 paper "WebCPM: Interactive Web Search for Chinese Long-form Question Answering" ☆918 · Updated last year
- Implementation of a Chinese ChatGPT ☆287 · Updated last year
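
For the Qwen1.5 + vLLM entry above, a minimal streaming sketch assuming vLLM's OpenAI-compatible server and the `openai` Python client; the port, model name, and prompt are illustrative assumptions, not details taken from that repository:

```python
# Assumes a vLLM OpenAI-compatible server is already running, e.g.:
#   python -m vllm.entrypoints.openai.api_server --model Qwen/Qwen1.5-0.5B-Chat --port 8000
from openai import OpenAI

# vLLM exposes an OpenAI-style endpoint; the API key is unused but required by the client.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

stream = client.chat.completions.create(
    model="Qwen/Qwen1.5-0.5B-Chat",
    messages=[{"role": "user", "content": "用一句话介绍一下大语言模型。"}],
    stream=True,  # ask the server to stream tokens as they are generated
)

# Print each streamed chunk as it arrives.
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```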
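
For the sentencepiece BPE entry, a minimal sketch of training and reusing a Chinese BPE vocabulary; the corpus file name, vocabulary size, and model prefix are placeholder assumptions rather than that repository's settings:

```python
import sentencepiece as spm

# Train a BPE vocabulary on a plain-text Chinese corpus (one sentence per line).
# "zh_corpus.txt", the 32k vocab size, and the "zh_bpe" prefix are illustrative choices.
spm.SentencePieceTrainer.train(
    input="zh_corpus.txt",
    model_prefix="zh_bpe",
    model_type="bpe",
    vocab_size=32000,
    character_coverage=0.9995,  # near-full character coverage is usually needed for Chinese
)

# Load the trained model and tokenize a sample sentence.
sp = spm.SentencePieceProcessor(model_file="zh_bpe.model")
print(sp.encode("今天天气很好", out_type=str))
```

The resulting `zh_bpe.model` can typically be wrapped for transformers, for example via `LlamaTokenizer(vocab_file="zh_bpe.model")`, or merged into an existing Llama vocabulary, which is the common route for Chinese vocabulary extension.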