stas00 / porting
Helper scripts and notes that were used while porting various NLP models
⭐48 · Updated 3 years ago
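As a rough illustration of what such porting helpers typically do, here is a minimal sketch, not taken from the repo itself, of remapping parameter names from an original checkpoint into a new layout; `KEY_MAP` and the file paths are assumptions made up for this example:

```python
# Minimal sketch of a checkpoint-porting helper: rename parameter
# keys from a source model's layout into a target layout.
# KEY_MAP and the file names below are illustrative assumptions,
# not code from stas00/porting.
import torch

# hypothetical mapping from source parameter names to target names
KEY_MAP = {
    "encoder.embed_tokens.weight": "model.encoder.embed_tokens.weight",
    "decoder.embed_tokens.weight": "model.decoder.embed_tokens.weight",
}

def remap_state_dict(src: dict) -> dict:
    """Rename checkpoint keys; pass through anything unmapped."""
    return {KEY_MAP.get(k, k): v for k, v in src.items()}

if __name__ == "__main__":
    # load the original checkpoint on CPU and save the remapped one
    state = torch.load("original_checkpoint.pt", map_location="cpu")
    torch.save(remap_state_dict(state), "ported_checkpoint.pt")
```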
Alternatives and similar repositories for porting
Users interested in porting are comparing it to the libraries listed below
- A diff tool for language models ⭐44 · Updated last year
- 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. ⭐81 · Updated 3 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any Hugging Face text dataset. ⭐96 · Updated 2 years ago
- Official code and model checkpoints for our EMNLP 2022 paper "RankGen - Improving Text Generation with Large Ranking Models" (https://arx…) ⭐138 · Updated 2 years ago
- ⭐101 · Updated 2 years ago
- Official repository with code and data accompanying the NAACL 2021 paper "Hurdles to Progress in Long-form Question Answering" (https://a…) ⭐46 · Updated 3 years ago
- ⭐67 · Updated 3 years ago
- ⭐46 · Updated 3 years ago
- ⭐78 · Updated 2 years ago
- Embedding Recycling for Language models ⭐38 · Updated 2 years ago
- Implementation of Marge, Pre-training via Paraphrasing, in PyTorch ⭐76 · Updated 4 years ago
- Open source library for few-shot NLP ⭐78 · Updated 2 years ago
- ⭐30 · Updated 4 years ago
- Evaluation suite for large-scale language models. ⭐128 · Updated 4 years ago
- 🛠️ Tools for Transformers compression using PyTorch Lightning ⚡ ⭐85 · Updated last week
- ⭐75 · Updated 4 years ago
- Viewer for the 🤗 datasets library. ⭐86 · Updated 4 years ago
- Our open source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188) ⭐61 · Updated 2 years ago
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP ⭐58 · Updated 3 years ago
- ⭐21 · Updated 4 years ago
- Shared code for training sentence embeddings with Flax / JAX ⭐28 · Updated 4 years ago
- Seahorse is a dataset for multilingual, multi-faceted summarization evaluation. It consists of 96K summaries with human ratings along 6 q… ⭐89 · Updated last year
- A package for fine-tuning Transformers with TPUs, written in TensorFlow 2.0+ ⭐37 · Updated 4 years ago
- XtremeDistil framework for distilling/compressing massive multilingual neural network models to tiny and efficient models for AI at scale ⭐157 · Updated last year
- Training and evaluation code for the paper "Headless Language Models: Learning without Predicting with Contrastive Weight Tying" (https:/…) ⭐28 · Updated last year
- ⭐44 · Updated 5 years ago
- LM Pretraining with PyTorch/TPU ⭐137 · Updated 6 years ago
- ⭐54 · Updated 2 years ago
- The Python library with command-line tools to interact with Dynabench (https://dynabench.org/), such as uploading models. ⭐55 · Updated 3 years ago
- Load What You Need: Smaller Multilingual Transformers for PyTorch and TensorFlow 2.0. ⭐105 · Updated 3 years ago