edumunozsala / RoBERTa_Encoder_Decoder_Product_Names
Defines Transformer, T5, and RoBERTa encoder-decoder models for product name generation
☆48 · Updated 3 years ago
Alternatives and similar repositories for RoBERTa_Encoder_Decoder_Product_Names
Users interested in RoBERTa_Encoder_Decoder_Product_Names are comparing it to the repositories listed below.
- PyTorch implementation of Recurrence over BERT (RoBERT) based on this paper https://arxiv.org/abs/1910.10781 and comparison with PyTorch … ☆82 · Updated 2 years ago
- A simple recipe for training and inference with the Transformer architecture for multi-task learning on custom datasets. You can find two approa… ☆96 · Updated 2 years ago
- A repo to explore different NLP tasks which can be solved using T5