Theia-4869 / FasterVLM

Official code for paper: [CLS] Attention is All You Need for Training-Free Visual Token Pruning: Make VLM Inference Faster.
82 stars · Updated 2 weeks ago
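
The paper's core idea, ranking visual tokens by the attention they receive from the vision encoder's [CLS] token and keeping only the top-scoring ones before they reach the language model, can be sketched in a few lines. The snippet below is a minimal illustration under that reading of the title, not the repository's actual implementation; the function name `prune_visual_tokens`, the `keep_ratio` default, and the toy tensor shapes are assumptions.

```python
import torch

def prune_visual_tokens(visual_tokens: torch.Tensor,
                        cls_attention: torch.Tensor,
                        keep_ratio: float = 0.25) -> torch.Tensor:
    """Keep the visual tokens that receive the most [CLS] attention.

    visual_tokens: (B, N, D) patch embeddings from the vision encoder.
    cls_attention: (B, N) attention weights from the [CLS] token to each patch
                   (e.g. averaged over heads of the last encoder layer).
    keep_ratio:    fraction of tokens to retain (hypothetical default).
    """
    B, N, D = visual_tokens.shape
    k = max(1, int(N * keep_ratio))
    # Indices of the k most-attended patches per sample.
    topk_idx = cls_attention.topk(k, dim=1).indices           # (B, k)
    topk_idx, _ = topk_idx.sort(dim=1)                        # preserve spatial order
    gather_idx = topk_idx.unsqueeze(-1).expand(-1, -1, D)     # (B, k, D)
    return visual_tokens.gather(1, gather_idx)                # (B, k, D)

# Toy usage: 576 patches as in a CLIP ViT-L/14 @ 336px encoder (illustrative only).
tokens = torch.randn(2, 576, 1024)
attn = torch.rand(2, 576).softmax(dim=-1)
pruned = prune_visual_tokens(tokens, attn, keep_ratio=0.25)
print(pruned.shape)  # torch.Size([2, 144, 1024])
```

Because the pruning relies only on attention scores already produced by the vision encoder, no retraining is required, which is what makes the approach training-free.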

Alternatives and similar repositories for FasterVLM

Users interested in FasterVLM are comparing it to the libraries listed below.
