philschmid / optimum-transformers-optimizations
★30 · Updated 2 years ago
Alternatives and similar repositories for optimum-transformers-optimizations:
Users interested in optimum-transformers-optimizations are comparing it to the libraries listed below.
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any Hugging Face text dataset. ★93 · Updated 2 years ago
- **ARCHIVED** Filesystem interface to 🤗 Hub ★58 · Updated last year
- A Python wrapper around HuggingFace's TGI (text-generation-inference) and TEI (text-embedding-inference) servers. ★34 · Updated 2 months ago
- SWIM-IR is a Synthetic Wikipedia-based Multilingual Information Retrieval training set with 28 million query-passage pairs spanning 33 languages. ★46 · Updated last year
- Using short models to classify long texts ★21 · Updated 2 years ago
- A framework that wisely initializes unseen subword embeddings in PLMs for efficient large-scale continued pretraining ★16 · Updated last year
- ★28 · Updated last year
- Anh - LAION's multilingual assistant datasets and models ★27 · Updated last year
- A collection of public Korean instruction datasets for training language models. ★19 · Updated last year
- DQ-BART: Efficient Sequence-to-Sequence Model via Joint Distillation and Quantization (ACL 2022) ★50 · Updated last year
- Mr. TyDi is a multi-lingual benchmark dataset built on TyDi, covering eleven typologically diverse languages. ★74 · Updated 3 years ago
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP ★58 · Updated 2 years ago
- A new metric that can be used to evaluate the faithfulness of text generated by LLMs. The work behind this repository can be found he… ★31 · Updated last year