LowinLi / transformers-stream-generator
A text generation method that returns a generator, streaming out each token in real time during inference, based on Huggingface/Transformers.
☆97 · Updated last year
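For reference, this kind of token-by-token streaming can also be reproduced with stock Transformers via `TextIteratorStreamer`. The sketch below is not this package's own API, just a minimal illustration of the streaming idea it implements; the model name and prompt are placeholders.

```python
# Minimal sketch of token-by-token streaming with stock Hugging Face Transformers.
# Uses transformers.TextIteratorStreamer, not transformers-stream-generator's own API.
from threading import Thread

from transformers import AutoModelForCausalLM, AutoTokenizer, TextIteratorStreamer

model_name = "gpt2"  # placeholder model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Streaming generation means", return_tensors="pt")
streamer = TextIteratorStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

# generate() blocks until finished, so run it in a background thread
# and consume decoded text pieces from the streamer as they arrive.
thread = Thread(
    target=model.generate,
    kwargs=dict(**inputs, max_new_tokens=40, streamer=streamer),
)
thread.start()
for text_piece in streamer:
    print(text_piece, end="", flush=True)
thread.join()
```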
Alternatives and similar repositories for transformers-stream-generator
Users interested in transformers-stream-generator are comparing it to the libraries listed below.
- A unified tokenization tool for Images, Chinese and English. ☆153 · Updated 2 years ago
- Train LLaMA on a single A100 80G node using 🤗 Transformers and 🚀 DeepSpeed pipeline parallelism ☆225 · Updated 2 years ago
- Imitate OpenAI with Local Models ☆89 · Updated last year
- Code for Scaling Laws of RoPE-based Extrapolation ☆73 · Updated 2 years ago
- Code used for sourcing and cleaning the BigScience ROOTS corpus ☆316 · Updated 2 years ago
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆131 · Updated last year
- Inference script for Meta's LLaMA models using a Hugging Face wrapper ☆110 · Updated 2 years ago
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 ☆68 · Updated 2 years ago
- Official repository for LongChat and LongEval ☆531 · Updated last year
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆138 · Updated 11 months ago
- Scalable PaLM implementation in PyTorch ☆188 · Updated 2 years ago
- A Fine-tuned LLaMA that is Good at Arithmetic Tasks ☆178 · Updated 2 years ago
- Inspired by Google C4, here is a series of colossal clean data cleaning scripts focused on CommonCrawl data processing. Including Chinese… ☆131 · Updated 2 years ago
- ☆123 · Updated last year
- Open efforts to implement ChatGPT-like models and beyond. ☆107 · Updated last year
- ☆172 · Updated 2 years ago
- Light local website for displaying the performance of different chat models. ☆87 · Updated 2 years ago
- Train LLMs (BLOOM, LLaMA, Baichuan2-7B, ChatGLM3-6B) with DeepSpeed pipeline mode. Faster than ZeRO/ZeRO++/FSDP. ☆98 · Updated last year
- The complete training code of the open-source high-performance Llama model, including the full process from pre-training to RLHF. ☆67 · Updated 2 years ago
- NTK-scaled version of ALiBi position encoding in Transformer. ☆69 · Updated 2 years ago
- Code implementation of Baichuan Dynamic NTK-ALiBi: inference over longer texts without fine-tuning ☆49 · Updated 2 years ago
- Open Instruction Generalist is an assistant trained on massive synthetic instructions to perform many millions of tasks ☆209 · Updated last year
- Source code for the ACL 2023 paper "Decoder Tuning: Efficient Language Understanding as Decoding" ☆51 · Updated 2 years ago
- Fast encoding detection and conversion for large numbers of text files, to assist data cleaning for the MNBVC corpus project ☆67 · Updated last month
- LongQLoRA: Extend Context Length of LLMs Efficiently ☆167 · Updated 2 years ago
- MultilingualShareGPT, the free multi-language corpus for LLM training ☆73 · Updated 2 years ago
- Naive Bayes-based Context Extension ☆325 · Updated 11 months ago
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆141 · Updated last year
- Multi-language Enhanced LLaMA ☆303 · Updated 2 years ago
- A Multi-Turn Dialogue Corpus based on Alpaca Instructions ☆177 · Updated 2 years ago