seujung / KoBART-translation
☆26 · Updated 4 years ago
Alternatives and similar repositories for KoBART-translation
Users interested in KoBART-translation are comparing it to the libraries listed below
- Dataset of Korean Threatening Conversations☆73 · Updated 2 years ago
- KoBART model on Huggingface Transformers (see the sketch after this list)☆65 · Updated 3 years ago
- KoBART chatbot☆47 · Updated 3 years ago
- Model that varies chatbot responses by language style and emotion☆33 · Updated 3 years ago
- Korean Relation Extraction Gold Standard☆35 · Updated 3 years ago
- Training Huggingface Transformers with KoNLPy☆68 · Updated 4 years ago
- ELECTRA-based Korean conversational language model☆54 · Updated 3 years ago
- Data Augmentation Toolkit for Korean text☆51 · Updated 3 years ago
- Korean Lightweight Language Model☆30 · Updated last year
- #Paired Question☆23 · Updated 4 years ago
- Character-level (syllable-level) Korean ELECTRA Model☆54 · Updated last year
- Korean Math Word Problems☆58 · Updated 3 years ago
- KoGPT2 on Huggingface Transformers☆33 · Updated 4 years ago
- Parallel dataset of Korean Questions and Commands☆60 · Updated 2 years ago
- KommonGen dataset for commonsense reasoning in Korean generative models☆17 · Updated 3 years ago
- ☆26 · Updated 3 years ago
- Finetuning Pipeline☆90 · Updated 3 years ago
- Simple Contrastive Learning of Korean Sentence Embeddings☆50 · Updated 2 years ago
- First-place solution for the 모두의 말뭉치 (Modu Corpus) AI language ability evaluation☆49 · Updated 3 years ago
- Transformers Pipeline with KoELECTRA☆40 · Updated last year
- Performing downstream tasks with huggingface☆64 · Updated 3 years ago
- NamuwikiExtractor for extracting clean text from Namuwiki dumps☆18 · Updated 3 years ago
- BERTScore for Korean☆77 · Updated last year
- ☆29 · Updated 7 years ago
- T5-base model for Korean☆27 · Updated 4 years ago
- Adversarial Test Dataset for Korean Multi-turn Response Selection☆35 · Updated 3 years ago
- The code and models for "An Empirical Study of Tokenization Strategies for Various Korean NLP Tasks" (AACL-IJCNLP 2020)☆118 · Updated 4 years ago
- Korean Easy Data Augmentation☆94 · Updated 3 years ago
- Korean-English Bilingual ELECTRA Models☆110 · Updated 3 years ago
- Korean Wikipedia corpus segmented at the sentence level; download from Releases or use via tfds-korean☆24 · Updated last year
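Several of the repositories above (for example the KoBART and KoGPT2 items) expose Korean checkpoints through the Huggingface Transformers API. As a rough, non-authoritative sketch, loading a KoBART checkpoint for sequence-to-sequence generation could look like the following; the model id `gogamza/kobart-base-v2` is an assumption and should be replaced with the checkpoint published by whichever repository you use.

```python
# Minimal sketch: loading a KoBART checkpoint via Huggingface Transformers.
# The model id below is an assumption; substitute the checkpoint name
# published by the repository you actually use.
from transformers import BartForConditionalGeneration, PreTrainedTokenizerFast

model_id = "gogamza/kobart-base-v2"  # assumed Korean BART checkpoint
tokenizer = PreTrainedTokenizerFast.from_pretrained(model_id)
model = BartForConditionalGeneration.from_pretrained(model_id)

# Encode a Korean sentence and generate output with beam search.
inputs = tokenizer("한국어 문장을 입력합니다.", return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```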