ai-forever/mgpt
Multilingual Generative Pretrained Model
☆205 · Updated 9 months ago
Alternatives and similar repositories for mgpt:
Users interested in mgpt are comparing it to the repositories listed below.
- ☆57 · Updated last year
- Pipeline for easy fine-tuning of BERT architecture for sequence classification ☆22 · Updated last year
- NEREL: A Russian Dataset with Nested Named Entities, Relations and Events ☆27 · Updated last year
- Code and data of "Methods for Detoxification of Texts for the Russian Language" paper ☆46 · Updated last week
- SAGE: Spelling correction, corruption and evaluation for multiple languages ☆147 · Updated 2 months ago
- Russian paraphrasers. Generate paraphrases with mt5, gpt2, etc. ☆54 · Updated last year
- Russian Artificial Text Detection ☆17 · Updated 2 years ago
- BSNLP 2021 ☆33 · Updated 3 months ago
- Russian Corpus of Linguistic Acceptability ☆42 · Updated 4 months ago
- Accelerated NLP pipelines for fast inference on CPU and GPU. Built with Transformers, Optimum and ONNX Runtime. ☆125 · Updated 2 years ago
- Unofficial implementation of QaNER: Prompting Question Answering Models for Few-shot Named Entity Recognition. ☆66 · Updated 2 years ago
- ☆12 · Updated 2 years ago
- A small library with distillation, quantization and pruning pipelines ☆26 · Updated 3 years ago
- CLIP implementation for Russian language ☆142 · Updated last year
- Russian SuperGLUE benchmark ☆109 · Updated last year
- RuBLiMP: Russian Benchmark of Linguistic Minimal Pairs ☆17 · Updated last week
- Probing suite for evaluation of Russian embedding and language models ☆33 · Updated 4 months ago
- RuSimpleSentEval (RSSE) shared task repo ☆22 · Updated 3 years ago
- MMLU eval for RU/EN ☆15 · Updated last year
- A library for preparing data for machine translation research (monolingual preprocessing, bitext mining, etc.) built by the FAIR NLLB te… ☆267 · Updated last month
- ☆18 · Updated last year
- A Russian data set for question answering over Wikidata ☆47 · Updated 3 years ago
- Dual Encoders for State-of-the-art Natural Language Processing. ☆61 · Updated 2 years ago
- RuCLIP tiny (Russian Contrastive Language–Image Pretraining) is a neural network trained to work with different pairs (images, texts). ☆32 · Updated 2 years ago
- RuLeanALBERT is a pretrained masked language model for the Russian language that uses a memory-efficient architecture. ☆93 · Updated last year
- RuTransform: Python framework for adversarial attacks and text data augmentation for Russian ☆19 · Updated last year
- ☆24 · Updated 3 months ago
- ☆182 · Updated last year
- This repository contains the code for "Generating Datasets with Pretrained Language Models". ☆187 · Updated 3 years ago