huggingface / model_card
⭐30 · Updated 4 years ago
Alternatives and similar repositories for model_card
Users interested in model_card are comparing it to the libraries listed below
- Viewer for the 🤗 datasets library. ⭐85 · Updated 4 years ago
- The repository for the paper "When Do You Need Billions of Words of Pretraining Data?" ⭐21 · Updated 5 years ago
- Helper scripts and notes that were used while porting various NLP models ⭐48 · Updated 3 years ago
- Minimal code to train ELMo models in recent versions of TensorFlow ⭐14 · Updated 2 years ago
- A Streamlit app to add structured tags to a dataset card ⭐22 · Updated 3 years ago
- ⭐87 · Updated 3 years ago
- Generate BERT vocabularies and pretraining examples from Wikipedias ⭐17 · Updated 5 years ago
- 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. ⭐81 · Updated 3 years ago
- A generic library for crafting adversarial NLP examples - WIP ⭐41 · Updated 7 years ago
- ⭐75 · Updated 4 years ago
- A diff tool for language models ⭐44 · Updated last year
- A lightweight but powerful library to build token indices for NLP tasks, compatible with major Deep Learning frameworks like PyTorch and … ⭐51 · Updated 11 months ago
- Factorization of the neural parameter space for zero-shot multi-lingual and multi-task transfer ⭐39 · Updated 5 years ago
- Open source library for few shot NLP ⭐78 · Updated 2 years ago
- SMASHED is a toolkit designed to apply transformations to samples in datasets, such as fields extraction, tokenization, prompting, batchi… ⭐35 · Updated last year
- Implementation of the paper 'Sentence Bottleneck Autoencoders from Transformer Language Models' ⭐17 · Updated 3 years ago
- Agents that build knowledge graphs and explore textual worlds by asking questions ⭐79 · Updated 2 years ago
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP ⭐58 · Updated 3 years ago
- Code repo for "Transformer on a Diet" paper ⭐31 · Updated 5 years ago
- BERT models for many languages created from Wikipedia texts ⭐33 · Updated 5 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any Hugging Face text dataset. ⭐96 · Updated 2 years ago
- A package for fine-tuning Transformers with TPUs, written in TensorFlow 2.0+ ⭐37 · Updated 4 years ago
- On Generating Extended Summaries of Long Documents ⭐78 · Updated 4 years ago
- classy is a simple-to-use library for building high-performance Machine Learning models in NLP. ⭐87 · Updated last month
- The Python library with command line tools to interact with Dynabench (https://dynabench.org/), such as uploading models. ⭐55 · Updated 3 years ago
- As good as new. How to successfully recycle English GPT-2 to make models for other languages (ACL Findings 2021) ⭐48 · Updated 4 years ago
- A highly sophisticated sequence-to-sequence model for code generation ⭐40 · Updated 4 years ago
- ⭐22 · Updated 3 years ago
- Code for the Shortformer model, from the ACL 2021 paper by Ofir Press, Noah A. Smith and Mike Lewis. ⭐147 · Updated 4 years ago
- ⭐21 · Updated 4 years ago