Alternatives and similar repositories for wind_rwkv
wind_rwkv: ☆27 · updated Jul 28, 2025
Users interested in wind_rwkv are comparing it to the libraries listed below.
- Continuous batching and parallel acceleration for RWKV6 (☆22 · updated Jun 28, 2024)
- Efficient implementations of state-of-the-art linear attention models in PyTorch and Triton (☆48 · updated Aug 22, 2025)
- Reference implementation of "Softmax Attention with Constant Cost per Token" (Heinsen, 2024) (☆24 · updated Jun 6, 2024)
- This project demonstrates the computation process of the RWKV (Receptance Weighted Key Value) model through Excel spreadsheets. (☆18 · updated Jun 7, 2025)
- Experiments on the impact of depth in transformers and SSMs. (☆40 · updated Oct 23, 2025)
- Fast modular code to create and train cutting-edge LLMs (☆68 · updated May 16, 2024)
- Evaluating LLMs with Dynamic Data (☆111 · updated Feb 11, 2026)
- AGaLiTe: Approximate Gated Linear Transformers for Online Reinforcement Learning (published in TMLR) (☆23 · updated Oct 15, 2024)
- Official repository for Efficient Linear-Time Attention Transformers (☆18 · updated Jun 2, 2024)
- Here we will test various linear attention designs. (☆62 · updated Apr 25, 2024)
- Stick-breaking attention (☆62 · updated Jul 1, 2025)
- Awesome RWKV Prompts: user-friendly, ready-to-use prompt examples for all users (☆35 · updated Jan 24, 2025)
- RWKV-X is a Linear Complexity Hybrid Language Model based on the RWKV architecture, integrating Sparse Attention to improve the model's l… (☆54 · updated Jan 12, 2026)
- RADLADS training code (☆37 · updated May 7, 2025)
- Efficient Transformers with Dynamic Token Pooling (☆67 · updated May 20, 2023)
- PyTorch implementation for PaLM: A Hybrid Parser and Language Model (☆10 · updated Jan 7, 2020)
- Solving puzzles with RWKV locally in your browser (☆12 · updated Jan 3, 2026)
- Advanced Formal Language Theory (263-5352-00L; Spring 2023) (☆10 · updated Feb 21, 2023)
- Official Chinese documentation for RWKV (☆15 · updated Feb 20, 2026)
- Official implementation of ACL 2023: Don't Parse, Choose Spans! Continuous and Discontinuous Constituency Parsing via Autoregressive Span … (☆14 · updated Aug 25, 2023)
- GoldFinch and other hybrid transformer components (☆45 · updated Jul 20, 2024)
- APPy (Annotated Parallelism for Python) enables users to annotate loops and tensor expressions in Python with compiler directives akin to… (☆30 · updated Jan 28, 2026)
- BlackGoose Rimer: RWKV as a Superior Architecture for Large-Scale Time Series Modeling (☆32 · updated Jul 11, 2025)
- RWKV infctx trainer, for training arbitrary context sizes, to 10k and beyond! (☆148 · updated Aug 13, 2024)
- GoldFinch and other hybrid transformer components (☆12 · updated Dec 9, 2025)
- RWKV-7 mini (☆12 · updated Mar 29, 2025)
- RWKV6 in native PyTorch and Triton :) (☆11 · updated Aug 4, 2024)