Theia-4869 / FasterVLM
Official code for the paper: [CLS] Attention is All You Need for Training-Free Visual Token Pruning: Make VLM Inference Faster.
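The paper title indicates that visual tokens are pruned, without any retraining, by using the vision encoder's [CLS] attention as an importance score. A minimal sketch of that idea follows; the function name `prune_visual_tokens`, its argument layout, and the `keep_ratio` parameter are illustrative assumptions rather than the repository's actual API.

```python
import torch

def prune_visual_tokens(visual_tokens: torch.Tensor,
                        cls_attention: torch.Tensor,
                        keep_ratio: float = 0.25) -> torch.Tensor:
    """Keep only the patch tokens that the [CLS] token attends to most.

    visual_tokens: (B, N, D) patch embeddings from the vision encoder ([CLS] excluded).
    cls_attention: (B, N) attention weights from [CLS] to each patch token,
                   e.g. averaged over heads in the encoder's final layer (assumption).
    keep_ratio:    fraction of tokens to retain (hypothetical knob, not from the repo).
    """
    B, N, D = visual_tokens.shape
    k = max(1, int(N * keep_ratio))
    # Indices of the k highest-attention patches per sample.
    topk = cls_attention.topk(k, dim=1).indices          # (B, k)
    # Restore original spatial ordering so positional structure is preserved.
    topk = topk.sort(dim=1).values                       # (B, k)
    idx = topk.unsqueeze(-1).expand(-1, -1, D)           # (B, k, D)
    return visual_tokens.gather(1, idx)                  # (B, k, D)
```

Because the ranking reuses attention weights the encoder already computes, a selection like this adds no trainable parameters, which is consistent with the "training-free" claim in the title.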
☆104 · Updated 7 months ago
Alternatives and similar repositories for FasterVLM
Users interested in FasterVLM are comparing it to the libraries listed below.