microsoft / xtreme-distil-transformers
XtremeDistil framework for distilling/compressing massive multilingual neural network models to tiny and efficient models for AI at scale
☆155 · Updated last year
Alternatives and similar repositories for xtreme-distil-transformers
Users that are interested in xtreme-distil-transformers are comparing it to the libraries listed below
- This repository contains the code for "Generating Datasets with Pretrained Language Models". ☆188 · Updated 3 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data; it should work with any Hugging Face text dataset. ☆93 · Updated 2 years ago
- ☆76 · Updated 3 years ago
- ☆87 · Updated 3 years ago
- Viewer for the 🤗 datasets library. ☆84 · Updated 3 years ago
- Shared code for training sentence embeddings with Flax / JAX. ☆27 · Updated 3 years ago
- Code and data to support the paper "PAQ: 65 Million Probably-Asked Questions and What You Can Do With Them". ☆203 · Updated 3 years ago
- Pipeline for pulling and processing online language model pretraining data from the web.