Theia-4869 / FasterVLM (View on GitHub)
Official code for the paper: [CLS] Attention is All You Need for Training-Free Visual Token Pruning: Make VLM Inference Faster.
109 · Jun 29, 2025 · Updated 8 months ago
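The paper's title describes training-free visual token pruning driven by the attention a [CLS] token pays to each visual token. As a rough illustration of that general idea (not the paper's actual implementation), the sketch below keeps only the visual tokens with the highest [CLS] attention scores; the function name, array shapes, and `keep_ratio` parameter are assumptions for this example.

```python
import numpy as np

def prune_visual_tokens(tokens, cls_attn, keep_ratio=0.25):
    """Keep the visual tokens that receive the highest [CLS] attention.

    tokens:     (N, D) array of visual token embeddings
    cls_attn:   (N,) attention weights from the [CLS] token to each visual token
    keep_ratio: fraction of tokens to retain (at least one token is kept)
    """
    n_keep = max(1, int(len(cls_attn) * keep_ratio))
    # Indices of the top-n_keep tokens by [CLS] attention, restored to
    # their original order so positional structure is preserved.
    keep = np.sort(np.argsort(cls_attn)[-n_keep:])
    return tokens[keep], keep

# Hypothetical usage: 8 visual tokens of dimension 4, keep half of them.
tokens = np.arange(32, dtype=float).reshape(8, 4)
cls_attn = np.array([0.10, 0.90, 0.20, 0.80, 0.05, 0.30, 0.70, 0.10])
pruned, kept = prune_visual_tokens(tokens, cls_attn, keep_ratio=0.5)
# → kept == [1, 3, 5, 6], pruned has shape (4, 4)
```

Because the scores come from an attention map the model already computes, a scheme like this needs no retraining; the pruned token sequence is simply shorter, which is where the inference speedup comes from.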

Alternatives and similar repositories for FasterVLM

Users interested in FasterVLM are comparing it to the libraries listed below.

