bjoernpl / GermanBenchmark
A repository containing the code for translating popular LLM benchmarks to German.
☆30 · Updated 2 years ago
Alternatives and similar repositories for GermanBenchmark
Users that are interested in GermanBenchmark are comparing it to the libraries listed below
- Code for Zero-Shot Tokenizer Transfer ☆140 · Updated 9 months ago
- A framework for few-shot evaluation of autoregressive language models. ☆13 · Updated last year
- Erasing concepts from neural representations with provable guarantees ☆239 · Updated 9 months ago
- Minimum Bayes Risk Decoding for Hugging Face Transformers ☆60 · Updated last year
- ☆65 · Updated 2 years ago
- Dataset collection and preprocessing framework for NLP extreme multitask learning ☆188 · Updated 3 months ago
- The Official Repository for "Bring Your Own Data! Self-Supervised Evaluation for Large Language Models" ☆107 · Updated 2 years ago
- ☆76 · Updated last year
- Manage scalable open LLM inference endpoints in Slurm clusters ☆273 · Updated last year
- BABILong is a benchmark for LLM evaluation using the needle-in-a-haystack approach. ☆215 · Updated 2 months ago
- A fast implementation of T5/UL2 in PyTorch using Flash Attention ☆109 · Updated this week
- A collection of datasets for language model pretraining, including scripts for downloading, preprocessing, and sampling. ☆62 · Updated last year
- Wrapper to easily generate the chat template for Llama2 ☆66 · Updated last year
- Prune transformer layers ☆69 · Updated last year
- [Data + code] ExpertQA: Expert-Curated Questions and Attributed Answers ☆135 · Updated last year
- Official implementation of "GPT or BERT: why not both?" ☆61 · Updated 3 months ago
- Code and data repo for the CoNLL paper "Future Lens: Anticipating Subsequent Tokens from a Single Hidden State" ☆20 · Updated last week
- Code repository for the c-BTM paper ☆107 · Updated 2 years ago
- Utilities for the HuggingFace transformers library ☆72 · Updated 2 years ago
- Experiments for efforts to train a new and improved T5 ☆75 · Updated last year
- NeurIPS Large Language Model Efficiency Challenge: 1 LLM + 1 GPU + 1 Day ☆256 · Updated 2 years ago
- ☆72 · Updated 2 years ago
- ☆52 · Updated 9 months ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data; it should work with any Hugging Face text dataset. ☆96 · Updated 2 years ago
- Official Code for M-RᴇᴡᴀʀᴅBᴇɴᴄʜ: Evaluating Reward Models in Multilingual Settings (ACL 2025 Main) ☆37 · Updated 5 months ago
- [EMNLP'23] Official Code for "FOCUS: Effective Embedding Initialization for Monolingual Specialization of Multilingual Models" ☆34 · Updated 4 months ago
- A toolkit implementing advanced methods to transfer models and model knowledge across tokenizers. ☆47 · Updated 3 months ago
- Multipack distributed sampler for fast padding-free training of LLMs ☆201 · Updated last year
- ☆39 · Updated last year
- Let's build better datasets, together! ☆263 · Updated 10 months ago