amazon-science / dq-bart
DQ-BART: Efficient Sequence-to-Sequence Model via Joint Distillation and Quantization (ACL 2022)
☆50 · Updated last year
Alternatives and similar repositories for dq-bart:
Users interested in dq-bart are comparing it to the repositories listed below
- ☆55 · Updated 2 years ago
- Repo for "On Learning to Summarize with Large Language Models as References" ☆44 · Updated last year
- 🦮 Code and pretrained models for Findings of ACL 2022 paper "LaPraDoR: Unsupervised Pretrained Dense Retriever for Zero-Shot Text Retrie… ☆49 · Updated 2 years ago
- The official repository for the Efficient Long-Text Understanding Using Short-Text Models (Ivgi et al., 2022) paper ☆68 · Updated last year
- Mr. TyDi is a multi-lingual benchmark dataset built on TyDi, covering eleven typologically diverse languages. ☆74 · Updated 3 years ago
- This repository contains the code for the paper "Prompting ELECTRA: Few-Shot Learning with Discriminative Pre-Trained Models".