XVERSE-MoE-A36B: A multilingual large language model developed by XVERSE Technology Inc.
☆39 · Sep 12, 2024 · Updated last year
Alternatives and similar repositories for XVERSE-MoE-A36B
Users interested in XVERSE-MoE-A36B are comparing it to the libraries listed below.
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆139 · Apr 9, 2024 · Updated 2 years ago
- FBI-LLM: Scaling Up Fully Binarized LLMs from Scratch via Autoregressive Distillation ☆52 · Aug 24, 2025 · Updated 8 months ago
- ☆44 · Sep 19, 2024 · Updated last year
- Code for the paper "Executing Arithmetic: Fine-Tuning Large Language Models as Turing Machines" ☆11 · Oct 11, 2024 · Updated last year
- Coursera Corpus Mining and Multistage Fine-Tuning for Improving Lectures Translation ☆15 · Aug 27, 2024 · Updated last year
- ☆14 · Oct 11, 2023 · Updated 2 years ago
- ☆85 · Oct 28, 2024 · Updated last year
- The code and data for the paper JiuZhang3.0 ☆49 · May 26, 2024 · Updated last year
- LongLLaVA: Scaling Multi-modal LLMs to 1000 Images Efficiently via Hybrid Architecture