GioGioBond / NBCEonChatGLM6b
(NBCE) Naive Bayes-based Context Extension on ChatGLM-6b
☆14 Updated last year
Alternatives and similar repositories for NBCEonChatGLM6b:
Users interested in NBCEonChatGLM6b are comparing it to the repositories listed below
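For context, NBCE combines next-token predictions from several independently encoded context windows under a naive-Bayes independence assumption, so a stock model can read inputs longer than its training window without fine-tuning. Below is a minimal sketch of the pooling rule, not this repository's code; the `beta` weight and the lowest-entropy pooling choice are assumptions based on the published description of the method:

```python
import math

def nbce_combine(context_logits, uncond_logits, beta=0.25):
    """Sketch of NBCE pooling (assumed formulation, not the repo's code).

    Combines next-token log-probabilities from several independently
    encoded contexts:
        score(T) = (1 + beta) * log p(T | S_best) - beta * log p(T)
    where S_best is the context whose predictive distribution has the
    lowest entropy (the most confident one), and log p(T) comes from an
    unconditional (empty-context) forward pass.
    """
    def log_softmax(logits):
        m = max(logits)
        z = math.log(sum(math.exp(x - m) for x in logits))
        return [x - m - z for x in logits]

    def entropy(logp):
        return -sum(math.exp(lp) * lp for lp in logp)

    per_ctx = [log_softmax(l) for l in context_logits]
    # pick the most confident (lowest-entropy) context window
    best = min(range(len(per_ctx)), key=lambda i: entropy(per_ctx[i]))
    uncond = log_softmax(uncond_logits)
    return [(1 + beta) * c - beta * u
            for c, u in zip(per_ctx[best], uncond)]
```

In a decoding loop these pooled scores would replace the raw next-token logits before sampling; since the model itself is untouched, no fine-tuning is required.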
- Implementation of Dynamic NTK-ALiBi for Baichuan: inference over longer texts without fine-tuning ☆47 Updated last year
- Instruction fine-tuning of the BLOOM model ☆24 Updated last year
- NTK-scaled version of the ALiBi position encoding in Transformer. ☆67 Updated last year
- Tracking the latest state of AI roles in NLP, CV, search, recommendation, and related fields. ☆29 Updated 2 years ago
- Using LEAR for NER extraction ☆29 Updated 3 years ago
- Zero-shot learning evaluation benchmark, Chinese edition ☆56 Updated 3 years ago
- This repository open-sources our GEC system submitted by THU KELab (sz) in the CCL2023-CLTC Track 1: Multidimensional Chinese Learner Tex… ☆14 Updated last year
- NLU & NLG (zero-shot) based on the mengzi-t5-base-mt pretrained model ☆75 Updated 2 years ago
- ☆97 Updated last year
- GOAT is a Chinese-English large language model, built with SFT on LLaMA. ☆12 Updated 2 years ago
- ☆57 Updated 2 years ago
- LoRA fine-tuning of BLOOMZ, following BELLE ☆25 Updated 2 years ago
- Deep training task ☆29 Updated 2 years ago
- SOTA solution and online demo for the CTC2021 Chinese Text Correction competition ☆72 Updated last year
- Parameter-efficient fine-tuning of ChatGLM-6B based on LoRA and P-Tuning v2 ☆55 Updated last year
- Benchmark of KgCLUE, with different models and methods ☆27 Updated 3 years ago
- MOSS chat fine-tuning ☆50 Updated last year
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 ☆19 Updated last year
- 1.4B sLLM for Chinese and English - HammerLLM🔨 ☆44 Updated last year
- OPD: Chinese Open-Domain Pre-trained Dialogue Model ☆75 Updated last year
- GoGPT: Chinese-English enhanced large model trained on Llama/Llama 2 | Chinese-Llama2 ☆78 Updated last year
- SinglepassTextCluster, a text-clustering tool based on the single-pass clustering algorithm that uses TF-IDF vectors and doc2vec, which can be used for… ☆62 Updated 3 years ago
- ☆25 Updated last year
- 格物: multilingual and Chinese large-scale pretrained models, lightweight edition, covering pure-Chinese, knowledge-enhanced, and 113-language multilingual variants; built on the mainstream RoBERTa architecture, suitable for NLU and NLG tasks, with support for PyTorch, TensorFlow, UER, Hugging Face, and other frameworks. ☆28 Updated 2 years ago
- Baseline for the Chinese Text Correction competition (文本智能校对大赛) ☆67 Updated 2 years ago
- The baseline method for CCIR 22 https://www.datafountain.cn/competitions/573 ☆13 Updated 2 years ago
- ☆37 Updated 8 months ago
- Use ChatGLM to perform text embedding ☆45 Updated 2 years ago
- Qwen-WisdomVast is a large model trained on 1 million high-quality Chinese multi-turn SFT data, 200,000 English multi-turn SFT data, and … ☆18 Updated last year
- A repo for updating and debugging Mixtral-8x7B, MoE, ChatGLM3, LLaMA 2, Baichuan, Qwen and other LLM models, including new models mixtral, mixtral 8x7b, … ☆43 Updated last month