lonelywing / POSTECH_thesis_template_latex
★ 22 · Updated 2 years ago
Alternatives and similar repositories for POSTECH_thesis_template_latex:
Users interested in POSTECH_thesis_template_latex are comparing it to the repositories listed below.
- ★ 87 · Updated 2 years ago
- In search of the papers we've read, Cite.GG · ★ 89 · Updated 8 months ago
- ★ 91 · Updated 2 years ago
- [Technical research personnel] Data loading/visualization and blog write-ups · ★ 61 · Updated this week
- Archive of GNN-related resources · ★ 17 · Updated 2 years ago
- ★ 28 · Updated 3 years ago
- Learn DGL, the graph deep learning library, the easy way · ★ 66 · Updated 4 years ago
- [2020.01-2021.01] Repository of outstanding code from the 13th ToBigs cohort · ★ 29 · Updated 4 years ago
- Convex Optimization for All · ★ 167 · Updated 2 weeks ago
- "A Survey of Transformers" paper study, Korea Univ. DSBA Lab · ★ 186 · Updated 3 years ago
- Implementation of TextRank and related utilities · ★ 86 · Updated 3 years ago
- Deep Learning Paper Reading Meeting archive · ★ 245 · Updated last month
- First-place solution for the Modu Corpus AI language ability evaluation · ★ 49 · Updated 3 years ago
- Growth log from the NAVER CONNECT AI TECH 4th cohort · ★ 14 · Updated 2 years ago
- TeX template for Korea University master's and doctoral theses · ★ 63 · Updated 2 years ago
- Small but precious coding tips from graduate school (Linux commands, etc.) · ★ 383 · Updated 2 years ago
- Introduction to Deep Learning · ★ 81 · Updated last year
- My useful PyTorch Lightning training template · ★ 32 · Updated last year
- NC NLP Techblog: introducing the challenges and changes led by NC's NLP team · ★ 21 · Updated 3 weeks ago
- ★ 145 · Updated 2 years ago
- Official datasets and PyTorch implementation of SQuARe and KoSBi (ACL 2023) · ★ 241 · Updated last year
- List of tech startups in South Korea · ★ 215 · Updated 2 years ago
- ★ 58 · Updated this week
- ★ 41 · Updated 4 years ago
- A clean and structured implementation of the Transformer with wandb and pytorch-lightning · ★ 71 · Updated 2 years ago
- Benchmark in Korean context · ★ 126 · Updated last year
- ★ 47 · Updated last year
- Tools for natural language processing · ★ 78 · Updated last year
- Repository of the Magic School (마법학교) project · ★ 14 · Updated this week
- 42dot LLM consists of a pre-trained language model, 42dot LLM-PLM, and a fine-tuned model, 42dot LLM-SFT, which is trained to respond to … · ★ 126 · Updated 11 months ago