edumunozsala / RoBERTa_Encoder_Decoder_Product_Names
Defines Transformer, T5, and RoBERTa encoder-decoder models for product-name generation
☆48 · Updated 4 years ago
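For readers new to the approach, here is a minimal sketch of how a RoBERTa encoder-decoder can be assembled with the Hugging Face Transformers `EncoderDecoderModel` API. This is illustrative only, not code from the repository; the `roberta-base` checkpoints and the example input are assumptions.

```python
# Minimal sketch (not from the repository): wiring two pretrained RoBERTa
# checkpoints into a seq2seq model of the kind used for product-name generation.
from transformers import EncoderDecoderModel, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = EncoderDecoderModel.from_encoder_decoder_pretrained("roberta-base", "roberta-base")

# The cross-attention weights are randomly initialized, so the model must be
# fine-tuned before it generates anything useful; these config fields are
# required before calling generate() or training.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id
model.config.eos_token_id = tokenizer.sep_token_id

# Hypothetical input: a long product description to condense into a short name.
inputs = tokenizer("A lightweight waterproof hiking jacket with taped seams", return_tensors="pt")
name_ids = model.generate(inputs.input_ids, max_new_tokens=16)
print(tokenizer.decode(name_ids[0], skip_special_tokens=True))
```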
Alternatives and similar repositories for RoBERTa_Encoder_Decoder_Product_Names
Users interested in RoBERTa_Encoder_Decoder_Product_Names are comparing it to the libraries listed below.
- A simple recipe for training and inferencing Transformer architecture for Multi-Task Learning on custom datasets. You can find two approa… ☆96 · Updated 3 years ago
- A repo to explore different NLP tasks which can be solved using T5 (see the T5 usage sketch after this list) ☆172 · Updated 4 years ago
- PyTorch implementation of Recurrence over BERT (RoBERT) based on this paper https://arxiv.org/abs/1910.10781 and comparison with pyTorch … ☆82 · Updated 2 years ago
- ☆60 · Updated 4 years ago
- Fine-tuning GPT-2 Small for Question Answering ☆130 · Updated 2 years ago
- This is where I put things I find useful that speed up my work with Machine Learning. Ever looked in your old projects to reuse those coo… ☆261 · Updated 3 years ago
- Some notebooks for NLP ☆207 · Updated last year
- Tensorflow, Pytorch, Huggingface Transformer, Fastai, etc. tutorial Colab Notebooks. ☆76 · Updated 2 years ago
- Tutorial for first-time BERT users ☆103 · Updated 2 years ago
- [NAACL 2021] This is the code for our paper `Fine-Tuning Pre-trained Language Model with Weak Supervision: A Contrastive-Regularized Self… ☆204 · Updated 3 years ago
- This repository contains the code, data, and models of the paper titled "XL-Sum: Large-Scale Multilingual Abstractive Summarization for 4… ☆275 · Updated last year
- Introduction to the recently released T5 model from the paper - Exploring the Limits of Transfer Learning with a Unified Text-to-Text Tra… ☆35 · Updated 5 years ago
- Natural Language Processing Analysis ☆34 · Updated 2 years ago
- Code for the EMNLP 2022 paper "Zero-Shot Text Classification with Self-Training" ☆50 · Updated 3 weeks ago
- [DEPRECATED] Adapt Transformer-based language models to new text domains ☆87 · Updated last year
- Easy to use and understand multiple-choice question generation algorithm using T5 Transformers. ☆138 · Updated 3 years ago
- Load What You Need: Smaller Multilingual Transformers for Pytorch and TensorFlow 2.0. ☆105 · Updated 3 years ago
- ☆124 · Updated 4 years ago
- ☆42 · Updated 4 years ago
- Efficient Attention for Long Sequence Processing ☆97 · Updated last year
- In this implementation, using the Flan T5 large language model, we performed the Text Classification task on the IMDB dataset and obtaine… ☆21 · Updated 2 years ago
- Collection of NLP model explanations and accompanying analysis tools ☆144 · Updated 2 years ago
- ☆46 · Updated 3 years ago
- simpleT5 is built on top of PyTorch-lightning⚡️ and Transformers🤗, letting you quickly train your T5 models. ☆399 · Updated 2 years ago
- Abstractive and Extractive Text summarization using Transformers. ☆85 · Updated 2 years ago
- BERT-based joint intent detection and slot filling with intent-slot attention mechanism (INTERSPEECH 2021) ☆87 · Updated last year
- ICONIP2021 - A Vietnamese Medical Dataset for IC and NER ☆21 · Updated 2 years ago
- Repository for XLM-T, a framework for evaluating multilingual language models on Twitter data ☆158 · Updated 2 years ago
- Master thesis with code investigating methods for incorporating long-context reasoning in low-resource languages, without the need to pre… ☆34 · Updated 4 years ago
- Neural information retrieval / Semantic search / Bi-encoders ☆174 · Updated 2 years ago
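Several of the entries above (the T5 exploration repo, the T5 introduction, and simpleT5) revolve around T5's text-to-text framing. As a minimal, hedged usage sketch with the Hugging Face Transformers API — not code from any of the listed repositories; the checkpoint name and the task prefix are assumptions:

```python
# Minimal sketch of T5's text-to-text interface; "t5-small" and the
# "summarize:" prefix are illustrative choices, not taken from any listed repo.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# T5 casts every task as text-to-text; the prefix tells it which task to perform.
inputs = tokenizer(
    "summarize: studies have shown that owning a dog is good for you",
    return_tensors="pt",
)
output_ids = model.generate(inputs.input_ids, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```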