yuunnn-w / RWKV_Pytorch
This is an inference framework for the RWKV large language model implemented purely in native PyTorch. The official native implementation is overly complex and lacks extensibility. Let's join the flexible PyTorch ecosystem and open-source it together!
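For orientation, here is a minimal, hypothetical sketch of what "pure PyTorch" RWKV inference means; it is not the RWKV_Pytorch API. The RWKV-4 time-mixing ("WKV") term can be carried as a small per-channel recurrent state and updated one token at a time with ordinary tensor ops. Function and parameter names below are illustrative, and the max-trick that real implementations use for numerical stability is omitted.

```python
# Hypothetical sketch (not the RWKV_Pytorch API): the channel-wise RWKV-4
# time-mixing recurrence that makes single-token, state-carrying inference
# possible in plain PyTorch. Names (w, u, k_t, v_t, state) are illustrative,
# and the numerically stable max-trick is left out for brevity.
import torch

def rwkv_wkv_step(k_t, v_t, w, u, state):
    """One recurrent WKV step for a single token.

    k_t, v_t : (channels,) key/value projections for the current token
    w, u     : (channels,) learned per-channel decay and "bonus"
    state    : (a, b) running weighted sums accumulated over previous tokens
    """
    a, b = state
    # Output mixes the accumulated history with the current token (bonus u).
    wkv = (a + torch.exp(u + k_t) * v_t) / (b + torch.exp(u + k_t))
    # Decay the history by exp(-w) and fold in the current token.
    a = torch.exp(-w) * a + torch.exp(k_t) * v_t
    b = torch.exp(-w) * b + torch.exp(k_t)
    return wkv, (a, b)

if __name__ == "__main__":
    C = 8                                    # toy channel count
    w, u = torch.rand(C), torch.rand(C)
    state = (torch.zeros(C), torch.zeros(C))
    for _ in range(4):                       # feed four tokens one at a time
        k_t, v_t = torch.randn(C), torch.randn(C)
        out, state = rwkv_wkv_step(k_t, v_t, w, u, state)
    print(out.shape)                         # torch.Size([8])
```

Because the state is fixed-size, memory stays constant no matter how long generation runs, which is the main practical draw of RWKV-style inference compared with a growing KV cache.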
☆131 · Updated last year
Alternatives and similar repositories for RWKV_Pytorch
Users interested in RWKV_Pytorch are comparing it with the repositories listed below.
- ☆139 · Updated last month
- VisualRWKV is the visual-enhanced version of the RWKV language model, enabling RWKV to handle various visual tasks. ☆234 · Updated 2 months ago
- Reinforcement learning toolkit for RWKV (v6, v7, ARWKV): distillation, SFT, RLHF (DPO, ORPO), infinite-context training, aligning. Exploring the… ☆48 · Updated last month
- Efficient implementations of state-of-the-art linear attention models in PyTorch and Triton ☆41 · Updated last week
- Evaluating LLMs with Dynamic Data ☆91 · Updated last week
- RWKV infctx trainer, for training arbitrary context sizes, to 10k and beyond! ☆148 · Updated 11 months ago
- ☆18 · Updated 7 months ago
- Awesome RWKV Prompts: user-friendly, ready-to-use prompt examples for all users. ☆36 · Updated 6 months ago
- This project is established for real-time training of the RWKV model. ☆50 · Updated last year
- RWKV fine-tuning ☆36 · Updated last year
- RAG system for RWKV ☆50 · Updated 8 months ago
- Lightning Attention-2: A Free Lunch for Handling Unlimited Sequence Lengths in Large Language Models ☆323 · Updated 5 months ago
- State tuning tunes the state ☆35 · Updated 5 months ago
- Get down and dirty with FlashAttention-2 in PyTorch: plug and play, no complex CUDA kernels (a generic sketch of the tiling idea follows this list). ☆106 · Updated 2 years ago
- RWKV-LM-V7 (https://github.com/BlinkDL/RWKV-LM) under the Lightning framework ☆42 · Updated 2 weeks ago
- A quantization algorithm for LLMs ☆141 · Updated last year
- A large-scale RWKV v6, v7 (World, PRWKV, Hybrid-RWKV) inference. Capable of inference by combining multiple states (Pseudo MoE). Easy to de… ☆40 · Updated last week
- RWKV in nanoGPT style ☆191 · Updated last year
- Low-bit optimizers for PyTorch ☆130 · Updated last year
- ☆82 · Updated last year
- Continuous batching and parallel acceleration for RWKV6 ☆24 · Updated last year
- The WorldRWKV project aims to implement training and inference across various modalities using the RWKV7 architecture. By leveraging diff… ☆53 · Updated 2 weeks ago
- RWKV, in easy-to-read code ☆72 · Updated 4 months ago
- This project extends RWKV LM's capabilities, including sequence classification/embedding/PEFT/cross encoder/bi encoder/multi modaliti… ☆10 · Updated 11 months ago
- ☆13 · Updated 7 months ago
- ☆23 · Updated 7 months ago
- The homepage of the OneBit model quantization framework. ☆185 · Updated 6 months ago
- ☆10 · Updated last year
- A converter and basic tester for RWKV ONNX ☆42 · Updated last year
- ☆124 · Updated last year
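The FlashAttention-2 item above advertises plug-and-play attention in plain PyTorch. As a rough illustration of the underlying idea, here is a generic sketch of tiled attention with an online softmax; it is not the code of that repository, and the block size, shapes, and names are assumptions.

```python
# Generic sketch of the FlashAttention idea (tiled attention + online softmax)
# in plain PyTorch. Not the repository's code; block size and names are
# illustrative. Single head, no masking, no dropout.
import math
import torch

def tiled_attention(q, k, v, block=64):
    """softmax(q @ k.T / sqrt(d)) @ v, computed one key/value block at a time."""
    n, d = q.shape
    scale = 1.0 / math.sqrt(d)
    out = torch.zeros_like(q)                      # running (unnormalized) output
    row_max = torch.full((n, 1), float("-inf"))    # running row-wise score max
    row_sum = torch.zeros(n, 1)                    # running softmax denominator
    for start in range(0, n, block):
        k_blk = k[start:start + block]             # (b, d)
        v_blk = v[start:start + block]
        scores = (q @ k_blk.T) * scale             # (n, b), never the full (n, n)
        new_max = torch.maximum(row_max, scores.max(dim=-1, keepdim=True).values)
        p = torch.exp(scores - new_max)            # block probabilities, rescaled
        correction = torch.exp(row_max - new_max)  # rescale previous accumulators
        row_sum = correction * row_sum + p.sum(dim=-1, keepdim=True)
        out = correction * out + p @ v_blk
        row_max = new_max
    return out / row_sum                           # apply the softmax denominator

if __name__ == "__main__":
    torch.manual_seed(0)
    q, k, v = (torch.randn(256, 32) for _ in range(3))
    ref = torch.softmax(q @ k.T / math.sqrt(32), dim=-1) @ v
    print(torch.allclose(tiled_attention(q, k, v), ref, atol=1e-5))  # expected: True
```

Processing keys and values block by block means the full (n × n) score matrix is never materialized at once, which is the memory saving FlashAttention exploits; the dedicated CUDA kernels add fusion and shared-memory tiling on top of the same math.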