AI-for-Science / MoZi
MoZi (墨子): the first intellectual-property large language model trained with full-parameter fine-tuning
☆21 · Updated last year
Alternatives and similar repositories for MoZi
Users who are interested in MoZi are comparing it to the repositories listed below
- [ACL 2024] IEPile: A Large-Scale Information Extraction Corpus ☆201 · Updated 8 months ago
- A native-Chinese benchmark for evaluating retrieval-augmented generation (RAG) ☆122 · Updated last year
- HanFei-1.0 (韩非): the first Chinese legal large language model trained with full-parameter fine-tuning ☆124 · Updated last year
- TianGong-AI-Unstructure ☆69 · Updated last week
- Official GitHub repo for ACLUE, an evaluation benchmark focused on ancient Chinese language comprehension ☆29 · Updated last year
- CIKM 2023 Best Demo Paper Award. HugNLP is a unified and comprehensive NLP library based on HuggingFace Transformer. Please hugging for NL… ☆389 · Updated last year
- SeqGPT: An Out-of-the-box Large Language Model for Open Domain Sequence Understanding ☆226 · Updated last year
- KBQA, LangChain, large language model, ChatGPT ☆81 · Updated 11 months ago
- ☆163 · Updated 2 years ago
- A project for fine-tuning large language models, including QLoRA fine-tuning of ChatGLM and LLaMA ☆27 · Updated 2 years ago
- ☆96 · Updated last year
- This is the repo which records the evolution of LM-based dialogue systems. More details can be found in our original survey paper: A Survey… ☆62 · Updated 5 months ago
- How to train an LLM tokenizer ☆152 · Updated 2 years ago
- ☆15 · Updated last year
- LAiW: A Chinese Legal Large Language Models Benchmark ☆83 · Updated last year
- ☆67 · Updated last year
- Fine-tuning of models such as LLaMA and ChatGLM ☆90 · Updated last year
- ChatGLM2-6B fine-tuning: SFT/LoRA, instruction fine-tuning ☆110 · Updated 2 years ago
- ☆147 · Updated last year
- ☆73 · Updated last year
- ☆98 · Updated last year
- YAYI information extraction large model: instruction-tuned on a million-scale set of manually constructed, high-quality information extraction data, developed by the 中科闻歌 (Wenge) algorithm team. (Repo for YAYI Unified Information Extraction Model) ☆311 · Updated last year
- Customized fine-tuning on top of open-source Chinese large language models, giving you your own dedicated language model. ☆50 · Updated 2 years ago
- ☆34 · Updated 2 months ago
- ChatGPT WebUI using Gradio: a simple, easy-to-use web UI for LLM chat and retrieval-augmented knowledge QA (RAG) ☆134 · Updated last year
- An LLM for NL2GQL (natural language to graph query language) with NebulaGraph or Neo4j ☆95 · Updated last year
- "桃李" (TaoLi): a large language model for international Chinese language education ☆182 · Updated last year
- ☆236 · Updated last year
- LLM for NER ☆82 · Updated last year
- Parameter-efficient fine-tuning of ChatGLM-6B based on LoRA and P-Tuning v2 (a minimal LoRA sketch follows this list) ☆55 · Updated 2 years ago
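
Several of the repositories above revolve around parameter-efficient fine-tuning of ChatGLM-style models with LoRA. For reference, the sketch below shows what a minimal LoRA setup looks like with the HuggingFace `peft` library; the base model id, the `query_key_value` target module, and the hyperparameters are illustrative assumptions, not taken from any of the listed projects.

```python
# Minimal LoRA fine-tuning setup sketch (assumptions: THUDM/chatglm-6b as the
# base checkpoint, "query_key_value" as the attention projection to adapt,
# and arbitrary r / alpha / dropout values).
from transformers import AutoModel, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

base_model = "THUDM/chatglm-6b"  # assumed base model
tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)
model = AutoModel.from_pretrained(base_model, trust_remote_code=True)

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                 # low-rank dimension
    lora_alpha=32,                       # scaling factor
    lora_dropout=0.1,
    target_modules=["query_key_value"],  # fused QKV projection in ChatGLM-style blocks
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the LoRA adapter weights are trainable
```

Training then proceeds with an ordinary supervised fine-tuning loop (e.g. `transformers.Trainer`) over instruction data; only the small adapter weights need to be saved and can later be merged into the base model for inference.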