princeton-nlp / ELIZA-Transformer
Representing Rule-based Chatbots with Transformers
☆18 · Updated 3 months ago
Related projects
Alternatives and complementary repositories for ELIZA-Transformer
- ☆29 · Updated this week
- Official repository for the ICML 2024 paper "MoRe Fine-Tuning with 10x Fewer Parameters" ☆16 · Updated this week
- SELF-GUIDE: Better Task-Specific Instruction Following via Self-Synthetic Finetuning (COLM 2024) ☆25 · Updated 5 months ago
- DPO, but faster 🚀 ☆20 · Updated last week
- ☆15 · Updated 3 months ago
- Enable next-sentence prediction for large language models with faster speed, higher accuracy, and longer context ☆16 · Updated 2 months ago
- In-Context Alignment: Chat with Vanilla Language Models Before Fine-Tuning ☆33 · Updated last year
- Source code for MMEvalPro, a more trustworthy and efficient benchmark for evaluating LMMs ☆22 · Updated last month
- ☆34 · Updated 2 months ago
- ☆57 · Updated last month
- Code implementation, evaluations, documentation, links, and resources for the Min P paper ☆18 · Updated last month
- Official repository for the paper "BaichuanSEED: Sharing the Potential of ExtensivE Data Collection and Deduplication by Introducing a Compet…" ☆18 · Updated 2 months ago
- imagetokenizer is a Python package that helps you encode visuals and generate visual token IDs from a codebook; supports both image and video… ☆27 · Updated 4 months ago
- ☆13 · Updated last year
- ☆17 · Updated 2 months ago
- Fast LLM training codebase with dynamic strategy selection [DeepSpeed+Megatron+FlashAttention+CudaFusionKernel+Compiler] ☆34 · Updated 10 months ago
- Implementation of the LDP module block, in PyTorch and Zeta, from the paper "MobileVLM: A Fast, Strong and Open Vision Language Assistant …" ☆14 · Updated 8 months ago
- ☆22 · Updated 2 months ago
- A public implementation of the ReLoRA pretraining method, built on Lightning AI's PyTorch Lightning suite ☆33 · Updated 8 months ago
- ☆21 · Updated 2 months ago
- Implementation of the model from "Reka Core, Flash, and Edge: A Series of Powerful Multimodal Language Models" in PyTorch ☆29 · Updated this week
- A framework for decoupling and assessing the capabilities of VLMs ☆38 · Updated 4 months ago
- XVERSE-MoE-A36B: a multilingual large language model developed by XVERSE Technology Inc. ☆36 · Updated last month
- A personal reimplementation of Google's Infini-Transformer using a small 2B model; the project includes both model and train… ☆52 · Updated 6 months ago
- [ICLR'24 spotlight] Tool-Augmented Reward Modeling ☆36 · Updated 8 months ago
- A huge dataset for Document Visual Question Answering ☆13 · Updated 3 months ago
- ☆17 · Updated last year
- Codebase for "Instruction Following without Instruction Tuning" ☆29 · Updated last month
- A data source for reasoning embodied agents ☆19 · Updated last year
- Implementation of the paper "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention" from Google in pyTO… ☆51 · Updated this week