nlp-uoregon / famie
FAMIE: A Fast Active Learning Framework for Multilingual Information Extraction
☆24 · Updated 2 years ago
Alternatives and similar repositories for famie:
Users interested in famie are comparing it to the libraries listed below.
- ZS4IE: A Toolkit for Zero-Shot Information Extraction with Simple Verbalizations ☆26 · Updated 2 years ago
- The NewSHead dataset is a multi-doc headline dataset used in NHNet for training a headline summarization model. ☆37 · Updated 3 years ago
- SMASHED is a toolkit designed to apply transformations to samples in datasets, such as fields extraction, tokenization, prompting, batchi… ☆32 · Updated 8 months ago
- No Parameter Left Behind: How Distillation and Model Size Affect Zero-Shot Retrieval ☆28 · Updated 2 years ago
- SWIM-IR is a Synthetic Wikipedia-based Multilingual Information Retrieval training set with 28 million query-passage pairs spanning 33 la… ☆45 · Updated last year
- Helper scripts and notes that were used while porting various nlp models ☆45 · Updated 2 years ago
- Ranking of fine-tuned HF models as base models. ☆35 · Updated last year
- Participant Kit for the TextGraphs-15 Shared Task on Explanation Regeneration ☆19 · Updated 3 years ago
- Generate BERT vocabularies and pretraining examples from Wikipedias ☆18 · Updated 4 years ago
- Corresponding code repo for the paper at COLING 2020 - ARGMIN 2020: "DebateSum: A large-scale argument mining and summarization dataset" ☆53 · Updated 3 years ago
- Large-scale query-focused multi-document summarization dataset ☆10 · Updated 3 years ago
- Summary Explorer is a tool to visually explore the state-of-the-art in text summarization. ☆44 · Updated 9 months ago
- Code for the paper "Target-oriented Fine-tuning for Zero-Resource Named Entity Recognition" ☆21 · Updated 2 years ago
- Training T5 to perform numerical reasoning. ☆23 · Updated 3 years ago
- 🌾 Universal, customizable and deployable fine-grained evaluation for text generation. ☆22 · Updated last year
- SPRINT Toolkit helps you evaluate diverse neural sparse models easily using a single click on any IR dataset. ☆44 · Updated last year
- Code and pre-trained models for "ReasonBert: Pre-trained to Reason with Distant Supervision", EMNLP'2021