fudanchenjiahao / RLAlgorithm
☆11 Updated 2 years ago
Alternatives and similar repositories for RLAlgorithm
Users interested in RLAlgorithm are comparing it to the libraries listed below.
- A collection of extractive MRC (machine reading comprehension) datasets for Chinese released to date ☆122 Updated last year
- Implementation of Chinese ChatGPT ☆288 Updated 2 years ago
- A framework for cleaning Chinese dialog data ☆273 Updated 4 years ago
- [SIGIR 2022] Multi-CPR: A Multi Domain Chinese Dataset for Passage Retrieval ☆200 Updated 3 years ago
- Apply RLHF directly to ChatGLM to raise or lower the probability of target outputs (Modify ChatGLM output with only RLHF) ☆197 Updated 2 years ago
- ☆273 Updated last year
- Mengzi Pretrained Models ☆540 Updated 3 years ago
- CoSENT, STS, SentenceBERT ☆170 Updated 11 months ago
- OCNLI: Original Chinese Natural Language Inference task ☆163 Updated 4 years ago
- Finetuning LLaMA with RLHF (Reinforcement Learning with Human Feedback) based on DeepSpeed Chat ☆117 Updated 2 years ago
- Chinese machine reading comprehension datasets ☆109 Updated 4 years ago
- T2Ranking: A large-scale Chinese benchmark for passage ranking. ☆162 Updated 2 years ago
- This is the repository of the Ape210K dataset and baseline models. ☆199 Updated 6 years ago
- Sentence matching models, including unsupervised SimCSE, ESimCSE, and PromptBERT, and supervised SBERT and CoSENT. ☆98 Updated 3 years ago
- Fine-tuning of models such as LLaMA and ChatGLM ☆91 Updated last year
- ☆420 Updated last year
- A 3,000,000+ example semantic understanding and matching dataset, usable for unsupervised contrastive learning, semi-supervised learning, and similar methods to build best-performing Chinese pretrained models ☆311 Updated 3 years ago
- A Multi-modal Model Chinese Spell Checker Released on ACL2021. ☆161 Updated 2 years ago
- Simple experiments with P-tuning on Chinese ☆140 Updated 4 years ago
- Simple experiments with Pattern-Exploiting Training on Chinese ☆173 Updated 5 years ago
- Reproduction of supervised and unsupervised SimCSE experiments ☆151 Updated last year
- Efficient, Low-Resource, Distributed transformer implementation based on BMTrain ☆266 Updated 2 years ago
- Truly "Deep Learning for Humans" ☆141 Updated 4 years ago
- ☆277 Updated 3 years ago
- Zero-shot learning evaluation benchmark, Chinese version ☆59 Updated 4 years ago
- Template-based text error correction; Automatically Mining Error Templates for Grammatical Error Correction ☆44 Updated 3 years ago
- Some shared code practice examples. ☆22 Updated 5 years ago
- PyTorch distributed training ☆73 Updated 2 years ago
- 🙈 An unofficial implementation of SoftMaskedBert based on huggingface/transformers. ☆97 Updated 4 years ago
- Pretraining a BERT model with the masked LM (MLM) task to learn representations from vertical-domain corpora and improve downstream-task performance ☆48 Updated 2 years ago