kingabzpro / Deploying-Llama-3.3-70B

Serve Llama 3.3 70B (with AWQ quantization) using vLLM and deploy it on BentoCloud.
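As a minimal sketch of what the description implies, serving an AWQ-quantized Llama 3.3 70B with vLLM looks roughly like the command below. The Hugging Face model ID and flag values are assumptions for illustration, not taken from this repository:

```shell
# Sketch only: serve a pre-quantized AWQ build of Llama 3.3 70B with vLLM.
# The model ID below is an assumed example; substitute the checkpoint the
# repository actually uses.
vllm serve casperhansen/llama-3.3-70b-instruct-awq \
  --quantization awq \
  --max-model-len 4096
```

Deployment to BentoCloud is typically done separately through the BentoML CLI (e.g. `bentoml deploy` from a project defining a Bento service); the exact service definition used here is not shown on this page.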
Stars: 31 · Jan 27, 2025 · Updated last year

Alternatives and similar repositories for Deploying-Llama-3.3-70B

Users that are interested in Deploying-Llama-3.3-70B are comparing it to the libraries listed below
