yuunnn-w / RWKV_Pytorch
This is an inference framework for the RWKV large language model, implemented purely in native PyTorch. The official implementation is overly complex and lacks extensibility. Join the flexible PyTorch ecosystem and build it in the open together!
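The appeal of a pure-PyTorch RWKV implementation is that inference is a simple per-token recurrence rather than full attention over the whole context. As a hedged illustration only (not this repo's actual API), here is a minimal sketch of the simplified WKV recurrence behind RWKV-style time mixing; all function names, shapes, and parameters below are illustrative assumptions:

```python
import torch

# Illustrative sketch of a simplified WKV recurrence (hypothetical names,
# not RWKV_Pytorch's API). Each channel keeps a running numerator/denominator
# state, so inference costs O(1) memory and compute per token.
def wkv_step(k, v, w, u, state):
    """One recurrent step of a simplified WKV mechanism.
    k, v:  (C,) key and value for the current token
    w:     (C,) per-channel decay rate (positive)
    u:     (C,) per-channel bonus applied to the current token
    state: (num, den) running weighted sums, each of shape (C,)
    """
    num, den = state
    # Output blends the decayed past with the current token's bonus term.
    out = (num + torch.exp(u + k) * v) / (den + torch.exp(u + k))
    # Decay the past state and fold in the current token for the next step.
    num = torch.exp(-w) * num + torch.exp(k) * v
    den = torch.exp(-w) * den + torch.exp(k)
    return out, (num, den)

C = 8  # illustrative channel count
state = (torch.zeros(C), torch.zeros(C))
for _ in range(4):  # feed a few random tokens through the recurrence
    k, v = torch.randn(C), torch.randn(C)
    out, state = wkv_step(k, v, w=torch.ones(C), u=torch.zeros(C), state=state)
```

Note that real RWKV kernels add a running-maximum shift to keep the exponentials numerically stable over long sequences; this sketch omits that for clarity.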
☆129 · Updated 11 months ago
Alternatives and similar repositories for RWKV_Pytorch
Users interested in RWKV_Pytorch are comparing it to the libraries listed below.
- ☆134 · Updated 3 weeks ago
- VisualRWKV is the visual-enhanced version of the RWKV language model, enabling RWKV to handle various visual tasks. ☆231 · Updated last month
- RAG system for RWKV ☆50 · Updated 7 months ago
- ☆18 · Updated 6 months ago
- RWKV fine-tuning ☆36 · Updated last year
- Evaluating LLMs with Dynamic Data ☆93 · Updated last month
- Efficient implementations of state-of-the-art linear attention models in PyTorch and Triton ☆38 · Updated this week
- RWKV infctx trainer, for training arbitrary context sizes, to 10k and beyond! ☆148 · Updated 11 months ago
- Reinforcement learning toolkit for RWKV (v6, v7, ARWKV): distillation, SFT, RLHF (DPO, ORPO), infinite-context training, alignment. Exploring the… ☆47 · Updated 2 weeks ago
- The WorldRWKV project aims to implement training and inference across various modalities using the RWKV7 architecture. By leveraging diff… ☆52 · Updated last week
- A project for real-time training of the RWKV model. ☆49 · Updated last year
- Lightning Attention-2: A Free Lunch for Handling Unlimited Sequence Lengths in Large Language Models ☆322 · Updated 4 months ago
- ☆23 · Updated 6 months ago
- Awesome RWKV Prompts: user-friendly, ready-to-use prompt examples for all users. ☆36 · Updated 5 months ago
- Get down and dirty with FlashAttention 2.0 in PyTorch; plug-and-play, no complex CUDA kernels. ☆105 · Updated last year
- ☆13 · Updated 6 months ago
- Continuous batching and parallel acceleration for RWKV6 ☆24 · Updated last year
- ☆82 · Updated last year
- The homepage of the OneBit model quantization framework. ☆184 · Updated 5 months ago
- RWKV in nanoGPT style ☆191 · Updated last year
- RWKV, in easy-to-read code ☆72 · Updated 3 months ago
- Low-bit optimizers for PyTorch ☆129 · Updated last year
- MiSS is a novel PEFT method that features a low-rank structure but introduces a new update mechanism distinct from LoRA, achieving an exc… ☆20 · Updated 3 weeks ago
- Inference RWKV with multiple supported backends. ☆51 · Updated this week
- ☆124 · Updated last year
- State tuning tunes the state ☆34 · Updated 5 months ago
- This project extends RWKV-LM's capabilities, including sequence classification/embedding/PEFT/cross encoder/bi encoder/multi modaliti… ☆10 · Updated 11 months ago
- RWKV-LM-V7 (https://github.com/BlinkDL/RWKV-LM) under the Lightning framework ☆35 · Updated this week
- Large-scale RWKV v6, v7 (World, PRWKV, Hybrid-RWKV) inference. Capable of inference combining multiple states (pseudo-MoE). Easy to de… ☆38 · Updated last week
- Official implementation of TransNormerLLM: A Faster and Better LLM ☆247 · Updated last year