Theia-4869 / FasterVLM

Official code for paper: [CLS] Attention is All You Need for Training-Free Visual Token Pruning: Make VLM Inference Faster.
78 · Updated 6 months ago

Alternatives and similar repositories for FasterVLM

Users interested in FasterVLM are comparing it to the libraries listed below.
