VHellendoorn / Code-LMs
Guide to using pre-trained large language models of source code
☆1,821 · Updated 9 months ago
Alternatives and similar repositories for Code-LMs:
Users interested in Code-LMs are also comparing it to the libraries listed below.
- Home of CodeT5: Open Code LLMs for Code Understanding and Generation ☆2,961 · Updated last year
- CodeGen is a family of open-source models for program synthesis. Trained on TPU-v4. Competitive with OpenAI Codex. ☆5,063 · Updated 2 months ago
- ☆1,468 · Updated last year
- Code for the paper "Evaluating Large Language Models Trained on Code" ☆2,688 · Updated 3 months ago
- CodeXGLUE ☆1,655 · Updated 11 months ago
- Large language models (LLMs) made easy: EasyLM is a one-stop solution for pre-training, finetuning, evaluating, and serving LLMs in JAX/Fl… ☆2,469 · Updated 8 months ago
- An implementation of model-parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ☆7,160 · Updated this week
- CodeTF: One-stop Transformer Library for State-of-the-art Code LLM ☆1,477 · Updated 2 months ago
- Full description can be found here: https://discuss.huggingface.co/t/pretrain-gpt-neo-for-open-source-github-copilot-model/7678?u=ncoop57 ☆3,297 · Updated 3 years ago
- Training and serving large-scale neural networks with auto parallelization. ☆3,123 · Updated last year
- The RedPajama-Data repository contains code for preparing large datasets for training large language models.