infinigence / InfiniWebSearch
A demo built on Megrez-3B-Instruct, integrating a web search tool to enhance the model's question-answering capabilities.
☆37 · Updated 3 months ago
Alternatives and similar repositories for InfiniWebSearch:
Users interested in InfiniWebSearch are comparing it to the libraries listed below.
- Imitate OpenAI with Local Models ☆88 · Updated 6 months ago
- A native Chinese benchmark for evaluating retrieval-augmented generation ☆112 · Updated 11 months ago
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆132 · Updated 3 months ago
- An open-source LLM based on a Mixture-of-Experts (MoE) structure ☆58 · Updated 8 months ago
- Delta-CoMe achieves near-lossless 1-bit compression; accepted at NeurIPS 2024 ☆54 · Updated 4 months ago
- The first fully commercially usable role-play large language model ☆39 · Updated 7 months ago
- GLM Series Edge Models ☆132 · Updated last month
- A pure C++ cross-platform LLM acceleration library with Python bindings; supports baichuan, glm, llama, and moss base models; runs ChatGLM-6B-class models smoothly on mobile and reaches 10,000+ tokens/s on a single GPU ☆45 · Updated last year
- ☆134 · Updated 10 months ago
- ☆105 · Updated last year
- The first Chinese Llama2 13B model (base + Chinese dialogue SFT, enabling fluent multi-turn human-machine interaction in natural language) ☆90 · Updated last year
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆139 · Updated 11 months ago
- (Work in progress.) A tutorial-style repository that uses "Chinese-localizing a model" as a typical model-training scenario to guide readers through hands-on secondary fine-tuning of LLMs ☆32 · Updated 7 months ago
- ☆225 · Updated 10 months ago
- A survey of large language model training and serving ☆37 · Updated last year
- A dataset template for guiding chat-models to self-cognition, including information about the model's identity, capabilities, usage, limi… ☆27 · Updated last year
- zero: training and tuning LLMs from scratch ☆31 · Updated last year
- Python3 package for Chinese/English OCR, with paddleocr-v4 onnx model (~14MB). Based on ppocr-v4-onnx model inference, it delivers millisecond-level, accurate OCR on CPU; for general-scenario Chinese/English OCR it reaches open-source SO… ☆63 · Updated last month
- Text deduplication ☆69 · Updated 9 months ago
- Alpaca Chinese Dataset -- a Chinese instruction fine-tuning dataset ☆193 · Updated 5 months ago
- The newest version of Llama 3, with source code explained line by line in Chinese ☆22 · Updated 11 months ago
- XVERSE-MoE-A4.2B: A multilingual large language model developed by XVERSE Technology Inc. ☆36 · Updated 10 months ago
- HanFei-1.0 (韩非), the first legal large language model in China trained with full-parameter training ☆114 · Updated last year
- The official codes for "Aurora: Activating chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning" ☆260 · Updated 10 months ago
- SearchGPT: Building a quick conversation-based search engine with LLMs. ☆45 · Updated 2 months ago
- DSPy Chinese documentation ☆24 · Updated 9 months ago
- ☆29 · Updated 6 months ago
- Mixture-of-Experts (MoE) Language Model ☆185 · Updated 6 months ago
- Luotuo QA (骆驼QA), a Chinese large language model for reading comprehension. ☆75 · Updated last year