Ankur3107 / dpr-tf
Dense Passage Retrieval using tensorflow-keras on TPU
☆16 · Updated 4 years ago
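At its core, Dense Passage Retrieval scores a question against candidate passages by the dot product of their encoder embeddings and retrieves the top-scoring passages. A minimal sketch of that scoring step, with toy hand-written vectors standing in for the BERT-based question and passage encoders (the vectors and function names here are illustrative, not from the repo):

```python
# DPR-style dual-encoder retrieval: rank passages by dot-product
# similarity between a question embedding and passage embeddings.
# Toy 3-d vectors stand in for real encoder outputs.

def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def rank_passages(q_vec, p_vecs):
    """Return passage indices sorted by similarity to the question, best first."""
    scores = [dot(q_vec, p) for p in p_vecs]
    return sorted(range(len(p_vecs)), key=lambda i: -scores[i])

question = [1.0, 0.0, 1.0]      # toy question embedding
passages = [
    [0.9, 0.1, 0.8],            # near the question vector -> high score
    [0.0, 1.0, 0.0],            # orthogonal -> score 0
    [0.2, 0.3, 0.1],            # weakly similar
]
print(rank_passages(question, passages))  # → [0, 2, 1]
```

In the actual model, two separate transformer encoders produce these vectors, and the dot product over a pre-computed passage index makes retrieval fast at scale (e.g. via FAISS).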
Alternatives and similar repositories for dpr-tf
Users interested in dpr-tf are comparing it to the repositories listed below.
- Code for the Shortformer model, from the ACL 2021 paper by Ofir Press, Noah A. Smith, and Mike Lewis. ☆147 · Updated 4 years ago
- Shared code for training sentence embeddings with Flax / JAX ☆28 · Updated 4 years ago
- Helper scripts and notes that were used while porting various NLP models ☆48 · Updated 3 years ago
- 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. ☆81 · Updated 3 years ago
- ☆21 · Updated 4 years ago
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP ☆58 · Updated 3 years ago
- Viewer for the 🤗 datasets library. ☆85 · Updated 4 years ago
- Embedding Recycling for Language Models ☆38 · Updated 2 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data; it should work with any Hugging Face text dataset. ☆96 · Updated 2 years ago
- A diff tool for language models ☆44 · Updated last year
- A Benchmark Dataset for Understanding Disfluencies in Question Answering ☆64 · Updated 4 years ago
- A 🤗-style implementation of BERT using lambda layers instead of self-attention ☆69 · Updated 5 years ago
- NLP examples using the 🤗 libraries ☆40 · Updated 4 years ago
- Distillation of a BERT model with the Catalyst framework ☆78 · Updated 2 years ago
- Open-source library for few-shot NLP ☆78 · Updated 2 years ago
- ☆30 · Updated 4 years ago
- Agents that build knowledge graphs and explore textual worlds by asking questions ☆79 · Updated 2 years ago
- XtremeDistil framework for distilling/compressing massive multilingual neural network models into tiny, efficient models for AI at scale ☆156 · Updated last year
- 🛠️ Tools for Transformers compression using PyTorch Lightning ⚡ ☆85 · Updated 3 weeks ago
- Our open-source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188) ☆61 · Updated 2 years ago
- A deep learning library based on PyTorch, focused on low-resource language research and robustness ☆70 · Updated 3 years ago
- ☆12 · Updated 3 years ago
- ☆87 · Updated 3 years ago
- LM Pretraining with PyTorch/TPU ☆136 · Updated 6 years ago
- Implementation of the paper "Sentence Bottleneck Autoencoders from Transformer Language Models" ☆17 · Updated 3 years ago
- ☆75 · Updated 4 years ago
- A highly sophisticated sequence-to-sequence model for code generation ☆40 · Updated 4 years ago
- Factorization of the neural parameter space for zero-shot multilingual and multi-task transfer ☆39 · Updated 5 years ago
- RATransformers 🐭 - Make your transformer (like BERT, RoBERTa, GPT-2, and T5) relation-aware! ☆41 · Updated 2 years ago
- The Python library with command-line tools to interact with Dynabench (https://dynabench.org/), such as uploading models. ☆55 · Updated 3 years ago