renatoviolin / Bart_T5-summarization
Summarization Task using Bart and T5 models.
☆171Updated 5 years ago
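The repository's task can be sketched with the Hugging Face `transformers` summarization pipeline. This is a minimal illustration, not the repository's exact code; the model checkpoints and generation parameters below are illustrative choices.

```python
# Minimal sketch: abstractive summarization with BART and T5 via the
# Hugging Face transformers pipeline. Model names and generation
# parameters are illustrative, not the repo's exact configuration.
from transformers import pipeline

text = (
    "The tower is 324 metres tall, about the same height as an 81-storey "
    "building, and is the tallest structure in Paris. Its base is square, "
    "measuring 125 metres on each side. It was the first structure in the "
    "world to reach a height of 300 metres."
)

# BART fine-tuned on CNN/DailyMail for summarization.
bart = pipeline("summarization", model="facebook/bart-large-cnn")
print(bart(text, max_length=60, min_length=10, do_sample=False)[0]["summary_text"])

# T5 frames summarization as text-to-text; the pipeline adds the
# "summarize: " task prefix automatically for T5 checkpoints.
t5 = pipeline("summarization", model="t5-small")
print(t5(text, max_length=60, min_length=10, do_sample=False)[0]["summary_text"])
```

Both calls return a list with one dict whose `summary_text` key holds the generated summary; swapping the `model` argument is all it takes to compare the two architectures on the same input.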
Alternatives and similar repositories for Bart_T5-summarization
Users interested in Bart_T5-summarization are comparing it to the repositories listed below.
- Paraphrase any question with T5 (Text-To-Text Transfer Transformer) - Pretrained model and training script provided☆185Updated 2 years ago
- Question Answering using Albert and Electra☆208Updated 2 years ago
- Question Generation using Google T5 and Text2Text☆153Updated 4 years ago
- Tutorial for first-time BERT users☆103Updated 2 years ago
- ☆66Updated 5 years ago
- Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive…☆439Updated 4 months ago
- Architectures and pre-trained models for long document classification☆155Updated 4 years ago
- Making BERT stretchy. Semantic Elasticsearch with Sentence Transformers☆160Updated 5 years ago
- A repo to explore different NLP tasks which can be solved using T5☆172Updated 4 years ago
- multi_task_NLP is a utility toolkit enabling NLP developers to easily train and infer a single model for multiple tasks.☆372Updated 2 years ago
- Neural Question Generation using the SQuAD and NewsQA datasets☆110Updated 2 years ago
- Enhancing the BERT training with Semi-supervised Generative Adversarial Networks☆228Updated 2 years ago
- SentAugment is a data augmentation technique for NLP that retrieves similar sentences from a large bank of sentences. It can be used in c…☆362Updated 3 years ago
- Named Entity Recognition with Pretrained XLM-RoBERTa☆91Updated 4 years ago
- Semantic search using Transformers and others☆110Updated 5 years ago
- BERT for question answering starting with HotpotQA☆168Updated 6 years ago
- Important paper implementations for Question Answering using PyTorch☆270Updated 4 years ago
- ⛔ [NOT MAINTAINED] A web-based annotator for closed-domain question answering datasets with SQuAD format.☆88Updated 2 years ago
- BERT, which stands for Bidirectional Encoder Representations from Transformers, is the SOTA in transfer learning for NLP.☆56Updated 5 years ago
- IEEE/ACM TASLP 2020: SBERT-WK: A Sentence Embedding Method By Dissecting BERT-based Word Models☆180Updated 4 years ago
- Pre-Trained Models for ToD-BERT☆294Updated 2 years ago
- ☆345Updated 4 years ago
- A simple approach to use GPT2-medium (345M) for generating high quality text summaries with minimal training.☆156Updated 2 years ago
- Lecture summarization with BERT☆152Updated 3 years ago
- Question-answer generation from text☆71Updated 2 years ago
- A Natural Language Inference (NLI) model based on Transformers (BERT and ALBERT)☆137Updated last year
- MobileBERT and DistilBERT for extractive summarization☆92Updated 2 years ago
- Interpretable Evaluation for (Almost) All NLP Tasks☆195Updated 2 weeks ago
- Do NLP tasks with some SOTA methods☆92Updated 4 years ago
- Deep Keyphrase Extraction using BERT☆261Updated 3 years ago