YuvrajSingh-mist / SmolLlama

So, I trained a 130M-parameter Llama architecture that I coded from the ground up to build a small instruct model from scratch. It was trained on the FineWeb dataset from HuggingFace, consisting of 15M texts (10BT snapshot), for a total of 3 full epochs.
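The description references the FineWeb 10BT snapshot on HuggingFace. A minimal sketch of pulling that snapshot with the `datasets` library is shown below; it assumes the public `HuggingFaceFW/fineweb` dataset with the `sample-10BT` config and is not the repo's actual data pipeline.

```python
# Minimal sketch: stream the FineWeb 10BT sample referenced above.
# Assumes the public "HuggingFaceFW/fineweb" dataset on the HuggingFace Hub;
# this is illustrative, not SmolLlama's own training code.
from datasets import load_dataset

fineweb = load_dataset(
    "HuggingFaceFW/fineweb",
    name="sample-10BT",   # the 10BT-token snapshot (~15M documents)
    split="train",
    streaming=True,       # avoid downloading the full snapshot up front
)

# Peek at one document's raw text, as it would be fed to a tokenizer.
first_doc = next(iter(fineweb))
print(first_doc["text"][:200])
```

Streaming keeps the example lightweight; a full training run would typically tokenize and shard this data ahead of time rather than iterating over raw text.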

Alternatives and similar repositories for SmolLlama

Users interested in SmolLlama are comparing it to the libraries listed below.
