edumunozsala / RoBERTa_Encoder_Decoder_Product_Names
Defines Transformer, T5, and RoBERTa encoder-decoder models for product name generation
☆48 · Updated 4 years ago
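The headline repo pairs a RoBERTa encoder with a RoBERTa decoder for sequence-to-sequence product-name generation. A minimal sketch of that setup with Hugging Face's `EncoderDecoderModel` is below; it uses tiny randomly initialized configs (an assumption for illustration, so the snippet runs without downloading checkpoints), whereas a real run would start from pretrained weights via `from_encoder_decoder_pretrained`.

```python
import torch
from transformers import RobertaConfig, EncoderDecoderConfig, EncoderDecoderModel

# Tiny randomly initialized configs so this sketch runs without downloading
# pretrained weights; in practice you would start from
# EncoderDecoderModel.from_encoder_decoder_pretrained("roberta-base", "roberta-base")
enc_cfg = RobertaConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                        num_attention_heads=2, intermediate_size=64)
dec_cfg = RobertaConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                        num_attention_heads=2, intermediate_size=64,
                        is_decoder=True, add_cross_attention=True)

model = EncoderDecoderModel(
    config=EncoderDecoderConfig.from_encoder_decoder_configs(enc_cfg, dec_cfg)
)

# Toy token ids standing in for a product description (source) and name (target)
src = torch.tensor([[5, 6, 7, 8]])
tgt = torch.tensor([[5, 6]])
out = model(input_ids=src, decoder_input_ids=tgt)
print(out.logits.shape)  # logits over the vocabulary: (batch, target_len, vocab_size)
```

The decoder config must set `is_decoder=True` and `add_cross_attention=True` so the RoBERTa stack attends to the encoder's hidden states; without those flags it behaves as a plain encoder.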
Alternatives and similar repositories for RoBERTa_Encoder_Decoder_Product_Names
Users interested in RoBERTa_Encoder_Decoder_Product_Names are comparing it to the repositories listed below.
- A simple recipe for training and running inference with a Transformer architecture for multi-task learning on custom datasets. You can find two approa… ☆99 · Updated 3 years ago
- A repo exploring different NLP tasks that can be solved using T5 ☆173 · Updated 5 years ago
- PyTorch implementation of Recurrence over BERT (RoBERT) based on the paper https://arxiv.org/abs/1910.10781 and comparison with PyTorch … ☆82 · Updated 3 years ago
- This is where I put things I find useful that speed up my work with machine learning. Ever looked in your old projects to reuse those coo… ☆264 · Updated 3 years ago
- Some notebooks for NLP ☆207 · Updated 2 years ago
- Abstractive and extractive text summarization using Transformers ☆86 · Updated 2 years ago
- ViDeBERTa: A powerful pre-trained language model for Vietnamese (EACL 2023) ☆58 · Updated 2 years ago
- Tutorial for first-time BERT users ☆103 · Updated 3 years ago
- A multi-purpose toolkit for table-to-text generation: web interface, Python bindings, CLI commands ☆57 · Updated last year
- [NAACL 2021] Code for the paper "Fine-Tuning Pre-trained Language Model with Weak Supervision: A Contrastive-Regularized Self… ☆206 · Updated 3 years ago
- Neural information retrieval / semantic search / bi-encoders ☆175 · Updated 2 years ago
- [DEPRECATED] Adapt Transformer-based language models to new text domains ☆86 · Updated last year
- ☆60 · Updated 4 years ago
- Efficient attention for long-sequence processing ☆98 · Updated 2 years ago
- Repository for XLM-T, a framework for evaluating multilingual language models on Twitter data ☆160 · Updated 3 years ago
- Deep Learning for Natural Language Processing - Lectures 2023 ☆171 · Updated last year
- simpleT5, built on top of PyTorch Lightning⚡️ and Transformers🤗, lets you quickly train your T5 models ☆400 · Updated 2 years ago
- Load What You Need: Smaller Multilingual Transformers for PyTorch and TensorFlow 2.0 ☆105 · Updated 3 years ago
- ☆125 · Updated 4 years ago
- Fine-tune multiple pre-trained Transformer-based models to solve the Vietnamese fake news detection problem (ReINTEL) in the VLSP 2020 shared task ☆18 · Updated 5 years ago
- BERT-based joint intent detection and slot filling with an intent-slot attention mechanism (INTERSPEECH 2021) ☆87 · Updated last year
- Important paper implementations for question answering using PyTorch ☆270 · Updated 5 years ago
- Summarization task using BART and T5 models ☆171 · Updated 5 years ago
- Benchmarking various deep learning models such as BERT, ALBERT, and BiLSTMs on the task of sentence entailment using two datasets - MultiNLI … ☆28 · Updated 5 years ago
- Python-based implementation of the Translate-Align-Retrieve method to automatically translate the SQuAD dataset to Spanish ☆59 · Updated 3 years ago
- Long-context pretrained encoder-decoder models ☆96 · Updated 3 years ago
- This repository contains the code, data, and models of the paper titled "XL-Sum: Large-Scale Multilingual Abstractive Summarization for 4… ☆277 · Updated last year
- Natural language processing analysis ☆34 · Updated 2 years ago
- Introduction to the recently released T5 model from the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Tra… ☆35 · Updated 5 years ago
- Fine-tuned BERT on the SQuAD 2.0 dataset. Applied knowledge distillation (KD) and fine-tuned DistilBERT (student) using BERT as the teacher m… ☆26 · Updated 4 years ago