KoBART chatbot
☆45 · Jun 22, 2021 · Updated 4 years ago
Alternatives and similar repositories for KoBART-chatbot
Users interested in KoBART-chatbot are comparing it to the libraries listed below.
- Korean BART ☆465 · Jun 14, 2025 · Updated 10 months ago
- Simple Chit-Chat based on KoGPT2 ☆183 · Jun 12, 2023 · Updated 2 years ago
- ☆27 · Mar 11, 2021 · Updated 5 years ago
- T5-base model for Korean ☆27 · May 20, 2021 · Updated 4 years ago
- KoGPT2 on Huggingface Transformers ☆33 · May 4, 2021 · Updated 4 years ago
- Korean wellness chatbot models: KoGPT2 + KoBERT/KoELECTRA (PyTorch, Transformers). ☆209 · Jan 12, 2026 · Updated 3 months ago
- GPT-2 pretrained on Korean datasets. ☆54 · Oct 12, 2021 · Updated 4 years ago
- KoRean based ELECTRA pre-trained models (KR-ELECTRA) for Tensorflow and PyTorch ☆15 · Feb 13, 2022 · Updated 4 years ago
- Korean LegalQA using SentenceKoBART ☆98 · Mar 25, 2023 · Updated 3 years ago
- Summarization module based on KoBART ☆202 · Sep 12, 2023 · Updated 2 years ago
- [Findings of NAACL 2022] A Dog Is Passing Over The Jet? A Text-Generation Dataset for Korean Commonsense Reasoning and Evaluation ☆11 · May 27, 2022 · Updated 3 years ago
- Adversarial Test Dataset for Korean Multi-turn Response Selection ☆34 · Dec 16, 2021 · Updated 4 years ago
- Model that varies chatbot responses according to language style and emotion ☆33 · Aug 17, 2021 · Updated 4 years ago
- ☆12 · Mar 8, 2020 · Updated 6 years ago
- TEMP ☆34 · Apr 2, 2020 · Updated 6 years ago
- Open-domain chatbot (Meena-style) with a vanilla Transformer seq2seq in PyTorch. ☆27 · Jan 12, 2026 · Updated 3 months ago
- Tutorial showing how to deploy the KoGPT2 model to Amazon SageMaker at scale, and how to fine-tune it for downstream NLP tasks ☆24 · Jun 6, 2020 · Updated 5 years ago
- ELECTRA-based Korean conversational language model ☆53 · Aug 4, 2021 · Updated 4 years ago
- Google's official ROUGE implementation adapted for use with Korean ☆18 · Jan 3, 2024 · Updated 2 years ago
- https://challenge.enliple.com/ ☆16 · Jun 10, 2020 · Updated 5 years ago
- Intonation-aided intention identification for Korean ☆83 · Nov 21, 2022 · Updated 3 years ago
- Sentence-segmented Korean Wikipedia corpus; download it from Releases or use it via tfds-korean. ☆24 · Sep 6, 2023 · Updated 2 years ago
- The code and models for "An Empirical Study of Tokenization Strategies for Various Korean NLP Tasks" (AACL-IJCNLP 2020) ☆119 · Oct 8, 2020 · Updated 5 years ago
- 🤗 Sample code for training a language model with minimal setup ☆59 · May 23, 2023 · Updated 2 years ago
- Korean initial-consonant (choseong) decoder based on ko-BART ☆29 · Mar 31, 2021 · Updated 5 years ago
- Training Huggingface Transformers with KoNLPy ☆68 · Aug 28, 2020 · Updated 5 years ago
- ☆25 · Oct 28, 2020 · Updated 5 years ago
- Pretrained ELECTRA Model for Korean ☆633 · Feb 19, 2024 · Updated 2 years ago
- Yet another Python binding for mecab-ko ☆88 · May 16, 2023 · Updated 2 years ago
- Example of fine-tuning KoGPT with OSLO ☆23 · Aug 26, 2022 · Updated 3 years ago
- Wikitext-format dataset of Namuwiki (a popular Korean wiki) ☆53 · Oct 25, 2020 · Updated 5 years ago
- KoBART model on Huggingface Transformers ☆64 · Feb 15, 2022 · Updated 4 years ago
- Sentence Embeddings using Siamese ETRI KoBERT ☆162 · Aug 16, 2025 · Updated 8 months ago
- ☆23 · Oct 30, 2023 · Updated 2 years ago
- ☆11 · Oct 3, 2021 · Updated 4 years ago
- Character-level Korean ELECTRA model (syllable-level Korean ELECTRA) ☆54 · Jun 12, 2023 · Updated 2 years ago
- https://ailabs.enliple.com/ ☆105 · Feb 25, 2021 · Updated 5 years ago
- ☆20 · Apr 1, 2022 · Updated 4 years ago
- Open Source + Multilingual MLLM + Fine-tuning + Distillation + More efficient models and learning + ? ☆18 · Jan 31, 2025 · Updated last year