BM-K / KoBART-summarization-pytorch
KoBART summarization using PyTorch
★13 · Updated last year
Related projects
Alternatives and complementary repositories for KoBART-summarization-pytorch
- ★17 · Updated 10 months ago
- Korean Lightweight Language Model ★30 · Updated last year
- Troll Detector ★14 · Updated last year
- KLUE Benchmark 1st place (2021.12) solutions (RE, MRC, NLI, STS, TC) ★25 · Updated 2 years ago
- State-of-the-art retrieval model implementing "Dense Passage Retrieval for Open-Domain Question Answering" with a Korean dataset ★9 · Updated 2 years ago
- ★26 · Updated 4 years ago
- Model that transforms chatbot responses according to language style and emotion ★33 · Updated 3 years ago
- final-project-level3-nlp-02 created by GitHub Classroom ★11 · Updated 2 years ago
- A project to analyze KorQuAD 2.0 ★24 · Updated 2 years ago
- Character-level (syllable-based) Korean ELECTRA model ★53 · Updated last year
- Korean T5 model ★46 · Updated 2 years ago
- Data augmentation toolkit for Korean text ★51 · Updated 3 years ago
- [Findings of NAACL 2022] A Dog Is Passing Over The Jet? A Text-Generation Dataset for Korean Commonsense Reasoning and Evaluation ★28 · Updated last year
- Simple Contrastive Learning of Korean Sentence Embeddings ★49 · Updated last year
- Example of fine-tuning kogpt with oslo ★23 · Updated 2 years ago
- BPE-based Korean T5 model for a unified text-to-text framework ★63 · Updated 7 months ago
- Utilizing the RBERT model structure for the KLUE Relation Extraction task ★15 · Updated 2 years ago
- KoBART model on Hugging Face Transformers ★63 · Updated 2 years ago
- BERTScore for Korean ★73 · Updated 9 months ago
- Performance improvement through changes to the KoSentenceBERT model architecture ★10 · Updated 4 years ago
- Adds noise to Korean text ★27 · Updated 2 years ago
- KcBERT/KcELECTRA fine-tuning benchmark code (forked from https://github.com/monologg/KoELECTRA/tree/master/finetune) ★40 · Updated 2 years ago
- #Paired Question ★23 · Updated 4 years ago
- Fine-tuning Korean natural language processing models ★14 · Updated 3 years ago
- T5-base model for Korean ★26 · Updated 3 years ago
- "μμ°μ΄μ²λ¦¬ μκ³ λ¦¬μ¦μ νμ©ν λλ¦°νμ΅μ κ΅μ‘ 컨ν μΈ μ μ" νλ‘μ νΈ "μ μκΈΈ" νμ λλ€. λ°μ΄ν° μμ§(ν¬λ‘€λ§)/EDA/Preprocessing, μ¬μ΄λ§ μμ±μμ½ AI λͺ¨λΈλ§(NLP - KoBERT, KoBART), νλ‘ν νμ μ μμ μ§ννμ΅λλ€β¦β13Updated 2 years ago
- ★29 · Updated 2 years ago
- RLHF training environment for Korean ★23 · Updated last year
- Text-to-Text Transformer for the Korean QA task ★7 · Updated 4 years ago
- Difference-based Contrastive Learning for Korean Sentence Embeddings ★24 · Updated last year