megvii-research / IntLLaMA (View on GitHub)
IntLLaMA: A fast and light quantization solution for LLaMA
18 · Jul 21, 2023 · Updated 2 years ago

Alternatives and similar repositories for IntLLaMA

Users who are interested in IntLLaMA are comparing it to the libraries listed below.

