tlkh / t2t-tuner
Convenient Text-to-Text Training for Transformers
☆19 · Updated 2 years ago
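t2t-tuner streamlines text-to-text (seq2seq) fine-tuning on top of Hugging Face Transformers. As a rough illustration of the kind of training loop it wraps, here is a minimal sketch written against plain 🤗 Transformers rather than t2t-tuner's own API; the checkpoint name and toy dataset are assumptions for the example.

```python
# Minimal seq2seq fine-tuning sketch with plain Hugging Face Transformers.
# NOTE: this is illustrative, not t2t-tuner's actual API; the model name
# and toy data below are assumptions.
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)

model_name = "t5-small"  # assumption: any seq2seq checkpoint works here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Toy source/target pairs standing in for a real dataset.
raw = Dataset.from_dict({
    "source": ["translate English to German: Hello, world!"],
    "target": ["Hallo, Welt!"],
})

def preprocess(batch):
    # Tokenize inputs; tokenized targets become the decoder labels.
    model_inputs = tokenizer(batch["source"], truncation=True, max_length=128)
    labels = tokenizer(text_target=batch["target"], truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_ds = raw.map(preprocess, batched=True, remove_columns=["source", "target"])

args = Seq2SeqTrainingArguments(
    output_dir="t2t-out",
    per_device_train_batch_size=8,
    num_train_epochs=1,
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    tokenizer=tokenizer,
    # Pads inputs and labels per batch (labels padded with -100 so they
    # are ignored by the loss).
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```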
Related projects
Alternatives and complementary repositories for t2t-tuner
- Hate speech detection corpus in Korean, shared with an EMNLP 2023 paper ☆13 · Updated 6 months ago
- Difference-based Contrastive Learning for Korean Sentence Embeddings ☆24 · Updated last year
- Beyond LM: How can language models go forward in the future? ☆15 · Updated last year
- Implementation of a stop sequencer for Hugging Face Transformers ☆15 · Updated last year
- Adapts Google's official ROUGE implementation for use with Korean ☆13 · Updated 10 months ago
- [Findings of NAACL 2022] A Dog Is Passing Over The Jet? A Text-Generation Dataset for Korean Commonsense Reasoning and Evaluation ☆12 · Updated 2 years ago
- Don't Judge a Language Model by Its Last Layer: Contrastive Learning with Layer-Wise Attention Pooling ☆9 · Updated 2 years ago
- baikal.ai's pre-trained BERT models: descriptions and sample code ☆12 · Updated 3 years ago
- Megatron LM 11B on Hugging Face Transformers ☆27 · Updated 3 years ago
- A collection of public Korean instruction datasets for training language models ☆19 · Updated last year
- Korean Nested Named Entity Corpus ☆16 · Updated last year
- Data-related codebase for the Polyglot project ☆19 · Updated last year
- Korean Named Entity Corpus ☆24 · Updated last year
- MeCab model trained with OpenKorPos ☆22 · Updated 2 years ago
- Train 🤗 Transformers with DeepSpeed: ZeRO-2, ZeRO-3 ☆21 · Updated 3 years ago
- Utilities for converting Modu Corpus (모두의 말뭉치) data into a form convenient for analysis ☆11 · Updated 2 years ago
- A framework that wisely initializes unseen subword embeddings in PLMs for efficient large-scale continued pretraining ☆12 · Updated 11 months ago
- Grand prize for Korean dependency parsing at the 2019 National Korean Language Competition (Minister of Culture, Sports and Tourism Award) ☆16 · Updated 2 years ago
- Official code and dataset repository of KoBBQ (TACL 2024) ☆14 · Updated 5 months ago
- Calculates the expected time for training an LLM ☆38 · Updated last year
- exBERT on 🤗 Transformers ☆10 · Updated 3 years ago
- Abstractive summarization using the Bert2Bert framework ☆31 · Updated 3 years ago