☆19 · Updated Nov 6, 2023
Alternatives and similar repositories for Quantize-Watermark
Users interested in Quantize-Watermark are comparing it to the libraries listed below.
- [AAAI 2024] DenoSent: A Denoising Objective for Self-Supervised Sentence Representation Learning (☆15, updated Apr 29, 2024)
- ☆38, updated Oct 2, 2024
- SLMTokBench for the paper "SpeechTokenizer: Unified Speech Tokenizer for Speech Large Language Models" (☆37, updated Aug 29, 2023)
- ☆17, updated Apr 7, 2025
- [CVPR 2025 Highlight] FIMA-Q: Post-Training Quantization for Vision Transformers by Fisher Information Matrix Approximation (☆25, updated Jun 16, 2025)
- ☆25, updated Oct 31, 2024
- ☆31, updated Oct 12, 2023
- [ICLR 2025] BitStack: Any-Size Compression of Large Language Models in Variable Memory Environments (☆38, updated Feb 17, 2025)
- Multi-GPU-supported k-means clustering for cluser-clip (☆15, updated Jun 3, 2024)
- ☆12, updated Jul 30, 2025
- Code and pretrained models for "DUB: Discrete Unit Back-translation for Speech Translation" (ACL 2023 Findings) (☆28, updated Jun 28, 2023)
- ☆12, updated Jul 23, 2024
- GAOGAO-Bench-Updates is a supplement to GAOKAO-Bench, a dataset for evaluating large language models (☆39, updated Jan 7, 2025)
- ☆13, updated Jun 22, 2025
- Unofficial implementation of "Scalable-Softmax Is Superior for Attention" (☆20, updated May 30, 2025)
- Official code for "Dual Grained Quantization: Efficient Fine-Grained Quantization for LLM" (☆14, updated Dec 27, 2023)
- [ICLR 2025] Official PyTorch implementation of "Mix-LN: Unleashing the Power of Deeper Layers by Combining Pre-LN and Post-LN" by Pengxia… (☆29, updated Jul 24, 2025)
- ☆32, updated Jun 6, 2024
- Source code for the paper "An Unforgeable Publicly Verifiable Watermark for Large Language Models", accepted at ICLR 2024 (☆34, updated May 23, 2024)
- Code for the AAAI 2024 Oral paper "OWQ: Outlier-Aware Weight Quantization for Efficient Fine-Tuning and Inference of Large Language Model…" (☆69, updated Mar 7, 2024)
- PyTorch implementation of "Deep Transferring Quantization" (ECCV 2020) (☆18, updated Jun 22, 2022)
- [ICLR 2024] Official PyTorch implementation of "QLLM: Accurate and Efficient Low-Bitwidth Quantization for Large Language Mod…" (☆39, updated Mar 11, 2024)
- [ACL 2024 Findings] GAOKAO-MM: A Chinese Human-Level Benchmark for Multimodal Models Evaluation (☆78, updated Mar 13, 2024)
- Adversarial Lipschitz Regularization (☆10, updated Jun 10, 2021)
- FusedChat: a dialogue dataset containing sessions that fuse task-oriented and open-domain dialogues (☆29, updated Jul 20, 2022)
- SpeechAgents: Human-Communication Simulation with Multi-Modal Multi-Agent Systems (☆85, updated Jan 9, 2024)
- AFPQ code implementation (☆23, updated Nov 6, 2023)
- [ICML 2023] Protecting Language Generation Models via Invisible Watermarking (☆13, updated Sep 8, 2023)
- [COLM 2025] Official PyTorch implementation of "Quantization Hurts Reasoning? An Empirical Study on Quantized Reasoning Models" (☆72, updated Jul 8, 2025)
- A framework for comparing low-bit integer and floating-point formats (☆71, updated Feb 6, 2026)
- [ICML 2024] When Linear Attention Meets Autoregressive Decoding: Towards More Effective and Efficient Linearized Large Language Models (☆35, updated Jun 12, 2024)
- Official code repository for the AAAI 2021 paper "Finding Sparse Structures for Domain Specific Neural Machine Translation" (☆11, updated Apr 1, 2021)
- ☆17, updated Dec 7, 2023
- ☆38, updated Jul 25, 2022
- [ACL 2024] An easy-to-use hallucination detection framework for LLMs (☆39, updated Feb 25, 2025)
- Expert-specialization MoE solution based on CUTLASS (☆27, updated Jan 19, 2026)
- ☆25, updated Dec 11, 2021
- ☆30, updated Jul 22, 2024
- ☆13, updated Dec 28, 2023