eigencore / Tlama_124M

Tlama (124M) is a language model based on Llama 3 (127M), optimized by EigenCore. It is designed for computational efficiency and scalability, enabling use on resource-limited hardware without compromising performance.
