xverse-ai / XVERSE-MoE-A36B
XVERSE-MoE-A36B: A multilingual large language model developed by XVERSE Technology Inc.
☆36 · Updated 6 months ago
Alternatives and similar repositories for XVERSE-MoE-A36B:
Users interested in XVERSE-MoE-A36B are comparing it to the libraries listed below.
- imagetokenizer is a Python package that helps you encode visuals and generate visual token IDs from a codebook; supports both image and video… ☆31 · Updated 9 months ago
- ☆32 · Updated 2 months ago
- Fast LLM training codebase with dynamic strategy selection [DeepSpeed+Megatron+FlashAttention+CudaFusionKernel+Compiler] ☆36 · Updated last year
- ☆27 · Updated last month
- XVERSE-MoE-A4.2B: A multilingual large language model developed by XVERSE Technology Inc. ☆37 · Updated 11 months ago
- The simplest reproduction of R1-style results on small models, illustrating the essential core shared by O1-like models and DeepSeek R1: "Think is all you need." Experiments support the claim that, for strong reasoning ability, the think-process content is the core of AGI/ASI. ☆42 · Updated 2 months ago
- Empirical Study Towards Building An Effective Multi-Modal Large Language Model ☆23 · Updated last year
- ☆16 · Updated 2 months ago
- ☆29 · Updated 7 months ago
- Our 2nd-gen LMM ☆33 · Updated 10 months ago
- PyTorch implementation of HyperLLaVA: Dynamic Visual and Language Expert Tuning for Multimodal Large Language Models ☆28 · Updated last year
- It's an open-source LLM based on an MoE structure. ☆58 · Updated 9 months ago
- SELF-GUIDE: Better Task-Specific Instruction Following via Self-Synthetic Finetuning. COLM 2024 accepted paper ☆30 · Updated 10 months ago
- [NAACL 2025] Representing Rule-based Chatbots with Transformers ☆20 · Updated 2 months ago
- A Framework for Decoupling and Assessing the Capabilities of VLMs ☆41 · Updated 9 months ago
- ☆62 · Updated last week
- ☆13 · Updated 5 months ago
- Official repository for ICML 2024 paper "MoRe Fine-Tuning with 10x Fewer Parameters" ☆17 · Updated 3 weeks ago
- ☆36 · Updated 7 months ago
- SkyScript-100M: 1,000,000,000 Pairs of Scripts and Shooting Scripts for Short Drama: https://arxiv.org/abs/2408.09333v2 ☆118 · Updated 4 months ago
- ☆33 · Updated 4 months ago
- ☆20 · Updated 5 months ago
- FuseAI Project ☆84 · Updated 2 months ago
- Data preparation code for CrystalCoder 7B LLM ☆44 · Updated 11 months ago
- Skywork-MoE: A Deep Dive into Training Techniques for Mixture-of-Experts Language Models ☆130 · Updated 9 months ago
- Family of instruction-following LLMs powered by Evol-Instruct: WizardLM, WizardCoder ☆45 · Updated 11 months ago
- A Toolkit for Running On-device Large Language Models (LLMs) in APP ☆72 · Updated 9 months ago
- Web2Code: A Large-scale Webpage-to-Code Dataset and Evaluation Framework for Multimodal LLMs ☆76 · Updated 5 months ago
- ☆11 · Updated 7 months ago
- From Hours to Minutes: Lossless Acceleration of Ultra Long Sequence Generation ☆85 · Updated 3 weeks ago