stefan-it / italian-bertelectra
🇮🇹 Italian BERT and ELECTRA models (incl. evaluation)
☆18 · Updated 2 years ago
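For orientation, here is a minimal sketch of how one of the Italian BERT checkpoints from this project might be loaded with 🤗 Transformers. The model ID `dbmdz/bert-base-italian-cased` is an assumption; substitute the exact checkpoint you intend to evaluate.

```python
# Minimal sketch: load an Italian BERT checkpoint via Hugging Face Transformers.
# NOTE: the model ID below is assumed, not taken from this listing.
from transformers import AutoModel, AutoTokenizer

model_id = "dbmdz/bert-base-italian-cased"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode an Italian sentence and run a forward pass.
inputs = tokenizer("Buongiorno, come stai?", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```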
Alternatives and similar repositories for italian-bertelectra
Users interested in italian-bertelectra are comparing it to the libraries listed below.
- Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models. ☆84 · Updated last year
- Materials for "IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation" 🇮🇹 ☆30 · Updated last year
- A library to synthesize text datasets using Large Language Models (LLM) ☆151 · Updated 2 years ago
- Load What You Need: Smaller Multilingual Transformers for Pytorch and TensorFlow 2.0. ☆105 · Updated 3 years ago
- A Python package to run inference with HuggingFace language and vision-language checkpoints, wrapping many convenient features. ☆28 · Updated last year
- UmBERTo: an Italian Language Model trained with Whole Word Masking. ☆108 · Updated 2 years ago
- Code and models used in "MUSS: Multilingual Unsupervised Sentence Simplification by Mining Paraphrases". ☆99 · Updated 2 years ago
- Python-based implementation of the Translate-Align-Retrieve method to automatically translate the SQuAD Dataset to Spanish. ☆59 · Updated 2 years ago
- ☆15 · Updated 4 years ago
- This repository contains a demonstrative implementation for pooling-based models, e.g., DeepPyramidion, complementing our paper "Sparsifyi… ☆14 · Updated 3 years ago
- A large-scale dataset for Question Answering in Italian ☆27 · Updated 6 years ago
- Augmenty is an augmentation library based on spaCy for augmenting texts. ☆156 · Updated last year
- A collection of scripts to preprocess ASR datasets and finetune language-specific Wav2Vec2 XLSR models ☆31 · Updated 4 years ago
- A Python package for benchmarking interpretability techniques on Transformers. ☆213 · Updated last year
- ☆112 · Updated last year
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP ☆58 · Updated 3 years ago
- ☆35 · Updated 3 years ago
- Accelerated NLP pipelines for fast inference on CPU. Built with Transformers and ONNX Runtime. ☆127 · Updated 4 years ago
- [EMNLP-Findings 2020] Adapting BERT for Word Sense Disambiguation with Gloss Selection Objective and Example Sentences ☆63 · Updated last year
- German small and large versions of GPT-2. ☆20 · Updated 3 years ago
- [DEPRECATED] Adapt Transformer-based language models to new text domains ☆87 · Updated last year
- As good as new. How to successfully recycle English GPT-2 to make models for other languages (ACL Findings 2021) ☆48 · Updated 4 years ago
- A tiny BERT for low-resource monolingual models ☆31 · Updated last week
- Sentiment analysis and emotion classification for Italian using BERT (fine-tuning). Published at the WASSA workshop (EACL 2021). ☆27 · Updated last year
- TimeLMs: Diachronic Language Models from Twitter ☆111 · Updated last year
- Software for transferring pre-trained English models to foreign languages ☆19 · Updated 2 years ago
- Some notebooks for NLP ☆207 · Updated last year
- Powerful unsupervised domain adaptation method for dense retrieval. Requires only an unlabeled corpus and yields massive improvement: "GPL: … ☆337 · Updated 2 years ago
- [EMNLP'23] Official Code for "FOCUS: Effective Embedding Initialization for Monolingual Specialization of Multilingual Models" ☆34 · Updated 4 months ago
- XtremeDistil framework for distilling/compressing massive multilingual neural network models to tiny and efficient models for AI at scale ☆156 · Updated last year