AIAnytime / On-device-LLM-Inference-using-Mediapipe
On-device LLM inference using the MediaPipe LLM Inference API.
23 · Mar 30, 2024 · Updated 2 years ago

Alternatives and similar repositories for On-device-LLM-Inference-using-Mediapipe

Users interested in On-device-LLM-Inference-using-Mediapipe are comparing it to the libraries listed below.

