hkproj / bert-from-scratch
BERT explained from scratch
☆ 13 · Updated last year
Alternatives and similar repositories for bert-from-scratch:
Users interested in bert-from-scratch are comparing it to the repositories listed below.
- Notes on quantization in neural networks ☆ 66 · Updated last year
- Prune transformer layers ☆ 67 · Updated 8 months ago
- A set of scripts and notebooks on LLM finetuning and dataset creation ☆ 101 · Updated 4 months ago
- Unofficial implementation of https://arxiv.org/pdf/2407.14679 ☆ 41 · Updated 4 months ago
- Complete implementation of Llama2 with/without KV cache & inference 🚀 ☆ 47 · Updated 8 months ago
- ☆ 77 · Updated 4 months ago
- A collection of LLM-related papers, theses, tools, datasets, courses, open-source models, and benchmarks ☆ 44 · Updated 3 months ago
- ☆ 110 · Updated 3 weeks ago
- Official repo for the paper PHUDGE: Phi-3 as Scalable Judge. Evaluate your LLMs with or without custom rubric, reference answer, absolute… ☆ 48 · Updated 6 months ago
- Notes about the LLaMA 2 model ☆ 52 · Updated last year
- I learn about and explain quantization ☆ 26 · Updated 9 months ago
- ☆ 22 · Updated 3 months ago
- Textbook on reinforcement learning from human feedback ☆ 154 · Updated this week
- Distributed training (multi-node) of a Transformer model ☆ 50 · Updated 9 months ago
- LLM_library is a comprehensive repository that serves as a one-stop resource for hands-on code and insightful summaries ☆ 68 · Updated last year
- Set of scripts to finetune LLMs