infinigence / InfiniWebSearch
A demo built on Megrez-3B-Instruct that integrates a web search tool to enhance the model's question-answering capabilities.
☆39 · Updated last year
Alternatives and similar repositories for InfiniWebSearch
Users who are interested in InfiniWebSearch are comparing it to the libraries listed below
- Imitate OpenAI with Local Models ☆90 · Updated last year
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆140 · Updated last year
- ☆234 · Updated last year
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆141 · Updated last year
- A dataset template for guiding chat-models to self-cognition, including information about the model's identity, capabilities, usage, limi… ☆31 · Updated 2 years ago
- [ACL2025 demo track] ROGRAG: A Robustly Optimized GraphRAG Framework ☆194 · Updated last month
- gpt_server is an open-source framework for production-grade deployment of LLMs, embeddings, rerankers, ASR, TTS, text-to-image, image editing, and text-to-video ☆244 · Updated last week
- ☆175 · Updated last year
- The first Chinese Llama2 13B model (base + Chinese dialogue SFT, enabling fluent multi-turn human-machine natural-language interaction) ☆91 · Updated 2 years ago
- A Chinese dataset distilled from the full-strength DeepSeek-R1 ☆64 · Updated 11 months ago
- An open-source LLM based on an MoE (Mixture-of-Experts) structure ☆58 · Updated last year
- Code for the piccolo embedding model from SenseTime ☆145 · Updated last year
- An archive of NLP project records ☆62 · Updated 9 months ago
- A native-Chinese benchmark for evaluating retrieval-augmented generation ☆124 · Updated last year
- Alpaca Chinese Dataset -- a Chinese instruction fine-tuning dataset ☆216 · Updated last year
- GLM Series Edge Models ☆158 · Updated 7 months ago
- vLLM Documentation in Simplified Chinese / vLLM 中文文档 ☆154 · Updated last month
- 360zhinao ☆290 · Updated 8 months ago
- Mixture-of-Experts (MoE) Language Model ☆195 · Updated last year
- Large language model training in 3 stages, plus deployment ☆48 · Updated 2 years ago
- The official code for "Aurora: Activating chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning" ☆264 · Updated last year
- A line-by-line annotated walkthrough of the Baichuan2 code, suitable for beginners ☆213 · Updated 2 years ago
- A large language model instruction-tuning tool (with FlashAttention support) ☆177 · Updated 2 years ago
- (Work in progress) A tutorial-style repository that takes "adapting a model to Chinese" as a typical model-training problem to guide readers through hands-on LLM fine-tuning ☆36 · Updated last year
- XVERSE-MoE-A4.2B: A multilingual large language model developed by XVERSE Technology Inc. ☆39 · Updated last year
- Exploring the application potential of LLMs in the legal industry ☆96 · Updated last year
- Text deduplication ☆77 · Updated last year
- llama inference for tencentpretrain ☆99 · Updated 2 years ago
- ☆341 · Updated 3 months ago
- A survey of large language model training and serving ☆37 · Updated 2 years ago