vantienpham / Awesome-Tensor-Decomposition
A curated list of tensor decomposition resources for model compression.
☆90 · Updated 2 weeks ago
Alternatives and similar repositories for Awesome-Tensor-Decomposition
Users interested in Awesome-Tensor-Decomposition are comparing it to the libraries listed below.
- ☆290 · Updated last year
- [ICLR'24] "DeepZero: Scaling up Zeroth-Order Optimization for Deep Model Training" by Aochuan Chen*, Yimeng Zhang*, Jinghan Jia, James Di… ☆69 · Updated last year
- ☆13 · Updated 4 years ago
- Second-Order Fine-Tuning without Pain for LLMs: a Hessian Informed Zeroth-Order Optimizer ☆20 · Updated 11 months ago
- ☆45 · Updated 2 years ago
- Efficient Expert Pruning for Sparse Mixture-of-Experts Language Models: Enhancing Performance and Reducing Inference Costs ☆22 · Updated 2 months ago
- Reading list for research topics in state-space models ☆342 · Updated 7 months ago
- Neural Tangent Kernel Papers ☆120 · Updated last year
- (NeurIPS 2024) QuanTA: Efficient High-Rank Fine-Tuning of LLMs with Quantum-Informed Tensor Adaptation ☆34 · Updated last month
- A collection of optimizer-related papers, data, and repositories ☆99 · Updated last year
- Official code implementation for the ICLR 2025 accepted paper "Dobi-SVD: Differentiable SVD for LLM Compression and Some New Perspectives" ☆50 · Updated 2 months ago
- ☆11 · Updated 2 years ago
- [NAACL 24 Oral] LoRETTA: Low-Rank Economic Tensor-Train Adaptation for Ultra-Low-Parameter Fine-Tuning of Large Language Models ☆39 · Updated last year
- ☆38 · Updated last month
- Compressible Dynamics in Deep Overparameterized Low-Rank Learning & Adaptation (ICML'24 Oral) ☆13 · Updated last year
- [ICML'24] Official code for the paper "Revisiting Zeroth-Order Optimization for Memory-Efficient LLM Fine-Tuning: A Benchmark".