nlp-uoregon / Okapi
Okapi: Instruction-tuned Large Language Models in Multiple Languages with Reinforcement Learning from Human Feedback
☆97Updated 2 years ago
Alternatives and similar repositories for Okapi
Users interested in Okapi are comparing it to the libraries listed below.
- A Multilingual Replicable Instruction-Following Model☆95Updated 2 years ago
- [Data + code] ExpertQA : Expert-Curated Questions and Attributed Answers☆136Updated last year
- Multilingual Large Language Models Evaluation Benchmark☆133Updated last year
- Code for Zero-Shot Tokenizer Transfer☆142Updated 11 months ago
- Glot500: Scaling Multilingual Corpora and Language Models to 500 Languages -- ACL 2023☆106Updated last year
- Code for Multilingual Eval of Generative AI paper published at EMNLP 2023☆71Updated last year
- BLOOM+1: Adapting BLOOM model to support a new unseen language☆74Updated last year
- MultilingualSIFT: Multilingual Supervised Instruction Fine-tuning☆96Updated 2 years ago
- [TMLR'23] Contrastive Search Is What You Need For Neural Text Generation☆121Updated 2 years ago
- GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embeddings☆44Updated last year
- Vocabulary Trimming (VT) is a model compression technique, which reduces a multilingual LM vocabulary to a target language by deleting ir…☆58Updated last year
- Code and model release for the paper "Task-aware Retrieval with Instructions" by Asai et al.☆165Updated 2 years ago
- ☆72Updated 2 years ago
- ☆65Updated 2 years ago
- Dataset from the paper "Mintaka: A Complex, Natural, and Multilingual Dataset for End-to-End Question Answering" (COLING 2022)☆117Updated 3 years ago
- Code and data for the paper "Turning English-centric LLMs Into Polyglots: How Much Multilinguality Is Needed?"☆25Updated 6 months ago
- GEMBA — GPT Estimation Metric Based Assessment☆136Updated 2 weeks ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any Hugging Face text dataset.☆96Updated 2 years ago
- [EMNLP'23] Official Code for "FOCUS: Effective Embedding Initialization for Monolingual Specialization of Multilingual Models"☆34Updated 6 months ago
- The official repository for Efficient Long-Text Understanding Using Short-Text Models (Ivgi et al., 2022) paper☆70Updated 2 years ago
- ☆101Updated 3 years ago
- Do Multilingual Language Models Think Better in English?☆43Updated 2 years ago
- Code and Data for "Evaluating Correctness and Faithfulness of Instruction-Following Models for Question Answering"☆87Updated last year
- Scalable training for dense retrieval models.☆298Updated 6 months ago
- The original implementation of Min et al. "Nonparametric Masked Language Modeling" (paper https://arxiv.org/abs/2212.01349)☆158Updated 2 years ago
- Pipeline for pulling and processing online language model pretraining data from the web