kakaoenterprise / KorAdvMRSTestData
Adversarial Test Dataset for Korean Multi-turn Response Selection
☆35 · Updated 3 years ago
Alternatives and similar repositories for KorAdvMRSTestData
Users interested in KorAdvMRSTestData are comparing it to the libraries listed below.
- Character-level Korean ELECTRA model (syllable-unit Korean ELECTRA) ☆54 · Updated 2 years ago
- A utility for storing and reading files for Korean LM training 💾 ☆36 · Updated last year
- An example of fine-tuning KoGPT with OSLO. ☆23 · Updated 2 years ago
- Adds noise to Korean documents. ☆27 · Updated 2 years ago
- T5-base model for Korean ☆27 · Updated 4 years ago
- ☆18 · Updated 3 years ago
- ☆20 · Updated 3 years ago
- The code and models for "An Empirical Study of Tokenization Strategies for Various Korean NLP Tasks" (AACL-IJCNLP 2020) ☆119 · Updated 4 years ago
- #Paired Question ☆24 · Updated 5 years ago
- KoBART model on Hugging Face Transformers ☆64 · Updated 3 years ago
- ELECTRA-based Korean conversational language model ☆54 · Updated 3 years ago
- KoGPT2 on Hugging Face Transformers ☆33 · Updated 4 years ago
- Fine-tuning pipeline ☆90 · Updated 3 years ago
- Dataset of Korean Threatening Conversations ☆74 · Updated 2 years ago
- Korean-English Bilingual ELECTRA Models ☆110 · Updated 3 years ago
- Korean Math Word Problems ☆59 · Updated 3 years ago
- ☆32 · Updated last year
- KommonGen: a dataset for commonsense reasoning with Korean generative models ☆19 · Updated 3 years ago
- Yet another Python binding for mecab-ko ☆86 · Updated 2 years ago
- KoBART chatbot ☆47 · Updated 4 years ago
- Korean T5 model ☆54 · Updated 3 years ago
- Wikitext-format dataset of Namuwiki (the most popular Korean wiki) ☆51 · Updated 4 years ago
- Korean honorific correction ☆26 · Updated 2 years ago
- NamuwikiExtractor, for extracting clean text from Namuwiki dumps ☆18 · Updated 3 years ago
- A sentence-segmented Korean Wikipedia corpus. Download it from Releases or use it via tfds-korean. ☆24 · Updated last year
- Parallel dataset of Korean Questions and Commands ☆61 · Updated 2 years ago
- Korean Named Entity Corpus ☆25 · Updated 2 years ago
- 🤗 Sample code for training an LM with minimal setup ☆58 · Updated 2 years ago
- Bias and hate speech classification with KoELECTRA 👿 ☆27 · Updated 2 years ago
- A repository summarizing the discussions of the beyondBERT study group (cohort 11.5). ☆58 · Updated 5 years ago