Theia-4869 / FasterVLM
Official code for the paper: [CLS] Attention is All You Need for Training-Free Visual Token Pruning: Make VLM Inference Faster.
112 stars · Jun 29, 2025 · Updated 10 months ago

Alternatives and similar repositories for FasterVLM

Users interested in FasterVLM are comparing it to the libraries listed below.
