eigencore / Tlama_124M

Tlama (124M) is a language model based on LlaMa3 (127M), optimized by EigenCore. It is designed for computational efficiency and scalability, enabling use on resource-limited hardware without compromising performance.
★ 12 · Updated 2 months ago

Alternatives and similar repositories for Tlama_124M

Users interested in Tlama_124M are comparing it to the libraries listed below.
