ChenghaoMou / text-dedup
All-in-one text de-duplication
☆633 · Updated 6 months ago
Alternatives and similar repositories for text-dedup:
Users that are interested in text-dedup are comparing it to the libraries listed below
- Code used for sourcing and cleaning the BigScience ROOTS corpus ☆305 · Updated last year
- ☆1,143 · Updated 4 months ago
- A curated list of awesome instruction tuning datasets, models, papers and repositories. ☆316 · Updated last year
- Contriever: Unsupervised Dense Information Retrieval with Contrastive Learning ☆696 · Updated last year
- Pytorch implementation of DoReMi, a method for optimizing the data mixture weights in language modeling datasets ☆311 · Updated 11 months ago
- Code repository for supporting the paper "Atlas: Few-shot Learning with Retrieval Augmented Language Models" (https://arxiv.org/abs/2208.03…) ☆519 · Updated last year
- Tevatron - A flexible toolkit for neural retrieval research and development. ☆543 · Updated 3 weeks ago
- DSIR large-scale data selection framework for language model training ☆234 · Updated 8 months ago
- [EMNLP 2023] Enabling Large Language Models to Generate Text with Citations. Paper: https://arxiv.org/abs/2305.14627 ☆466 · Updated 2 months ago
- Reading list of instruction tuning; the trend started with Natural-Instructions (ACL 2022), FLAN (ICLR 2022), and T0 (ICLR 2022). ☆758 · Updated last year
- Collection of training data management explorations for large language models ☆297 · Updated 4 months ago
- Expanding natural instructions ☆963 · Updated last year
- This repository contains code to quantitatively evaluate instruction-tuned models such as Alpaca and Flan-T5 on held-out tasks. ☆535 · Updated 9 months ago
- Deita: Data-Efficient Instruction Tuning for Alignment [ICLR 2024] ☆506 · Updated last week
- Crosslingual Generalization through Multitask Finetuning ☆518 · Updated 2 months ago
- Tools to download and clean up Common Crawl data ☆975 · Updated last year
- Papers and Datasets on Instruction Tuning and Following. ✨✨✨ ☆469 · Updated 8 months ago
- [EMNLP 2023] Adapting Language Models to Compress Long Contexts ☆285 · Updated 3 months ago
- All available datasets for Instruction Tuning of Large Language Models ☆238 · Updated last year
- Official repository of NEFTune: Noisy Embeddings Improve Instruction Finetuning ☆388 · Updated 7 months ago
- Code for T-Few from "Few-Shot Parameter-Efficient Fine-Tuning is Better and Cheaper than In-Context Learning" ☆433 · Updated last year
- Fast Inference Solutions for BLOOM ☆561 · Updated 2 months ago
- Scalable training for dense retrieval models. ☆272 · Updated last year
- Fusion-in-Decoder ☆555 · Updated last year
- Reproduce results and replicate training of T0 (Multitask Prompted Training Enables Zero-Shot Task Generalization) ☆458 · Updated 2 years ago
- What's In My Big Data (WIMBD) - a toolkit for analyzing large text datasets ☆195 · Updated last month
- OpenICL is an open-source framework to facilitate research, development, and prototyping of in-context learning. ☆541 · Updated last year
- Implementation of the paper "Data Engineering for Scaling Language Models to 128K Context" ☆446 · Updated 8 months ago
- A curated list of awesome papers related to pre-trained models for information retrieval (a.k.a. pretraining for IR) ☆645 · Updated 11 months ago
- Generative Representational Instruction Tuning ☆573 · Updated 3 weeks ago
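For orientation, the core technique behind text-dedup and several of the corpus-cleaning tools listed above is MinHash-based near-duplicate detection. The sketch below is a minimal, self-contained illustration of that idea, not text-dedup's actual API; the shingle size, number of permutations, and similarity threshold are arbitrary values chosen for the example.

```python
# Minimal MinHash near-duplicate detection sketch (illustration only, not text-dedup's API).
import hashlib
from itertools import combinations

NUM_PERM = 64       # number of hash "permutations" (illustrative value)
SHINGLE_SIZE = 5    # character n-gram size (illustrative value)
THRESHOLD = 0.8     # estimated-Jaccard cutoff for flagging near-duplicates (illustrative value)


def shingles(text: str) -> set:
    """Character n-gram shingles of the whitespace-normalized, lowercased text."""
    text = " ".join(text.lower().split())
    return {text[i : i + SHINGLE_SIZE] for i in range(max(1, len(text) - SHINGLE_SIZE + 1))}


def minhash_signature(text: str) -> list:
    """For each seeded hash function, keep the minimum hash value over all shingles."""
    doc_shingles = shingles(text)
    return [
        min(
            int.from_bytes(hashlib.blake2b(f"{seed}:{s}".encode(), digest_size=8).digest(), "big")
            for s in doc_shingles
        )
        for seed in range(NUM_PERM)
    ]


def estimated_jaccard(sig_a: list, sig_b: list) -> float:
    """Fraction of matching signature slots approximates the Jaccard similarity of the shingle sets."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / NUM_PERM


if __name__ == "__main__":
    docs = {
        "a": "The quick brown fox jumps over the lazy dog.",
        "b": "The quick brown fox jumps over the lazy dog!",
        "c": "A completely unrelated sentence about language models.",
    }
    sigs = {key: minhash_signature(text) for key, text in docs.items()}
    for x, y in combinations(docs, 2):
        sim = estimated_jaccard(sigs[x], sigs[y])
        verdict = "near-duplicate" if sim >= THRESHOLD else "distinct"
        print(f"{x} vs {y}: estimated Jaccard {sim:.2f} -> {verdict}")
```

At corpus scale, tools like text-dedup pair signatures like these with locality-sensitive hashing so that candidate pairs can be found without comparing every document against every other.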