moreh-io / motif-llm
☆31 Updated 6 months ago
Alternatives and similar repositories for motif-llm
Users interested in motif-llm are comparing it to the libraries listed below.
- BPE-based Korean T5 model for a text-to-text unified framework ☆63 Updated last year
- Korean-language datasets available on Hugging Face ☆28 Updated 8 months ago
- CLIcK: A Benchmark Dataset of Cultural and Linguistic Intelligence in Korean ☆45 Updated 6 months ago
- Evaluate gpt-4o on CLIcK (Korean NLP Dataset) ☆20 Updated last year
- Official repository for KoMT-Bench built by LG AI Research ☆63 Updated 10 months ago
- Korean embedding model specialized for the financial domain ☆20 Updated 10 months ago
- Evaluating language model responses with a reward model ☆29 Updated last year
- High-performance vector search engine that uses GPUs and dynamic placement with no loss of accuracy ☆29 Updated last year
- 1-Click is all you need. ☆61 Updated last year
- KoCommonGEN v2: A Benchmark for Navigating Korean Commonsense Reasoning Challenges in Large Language Models ☆25 Updated 10 months ago
- Korean-OpenOrca: llama2 fine-tuned on the OpenOrca-KO dataset ☆19 Updated last year
- A collection of publicly available Korean instruction datasets for training language models. ☆19 Updated last year
- ☆20 Updated 11 months ago
- AI model designed to test effectiveness in handling external ethical attacks. ☆11 Updated 11 months ago
- BERT score for text generation ☆12 Updated 5 months ago
- Performs benchmarking on two Korean datasets with minimal time and effort. ☆40 Updated last month
- Korean LLM fine-tuned from KoAlpaca using the IA3 method ☆69 Updated last year
- Project for training Korean language models (Flax, Pytorch with Huggingface Accelerate) ☆32 Updated last year
- Difference-based Contrastive Learning for Korean Sentence Embeddings ☆24 Updated 2 years ago
- 🤗 Sample code for training an LM with minimal setup ☆58 Updated 2 years ago
- Translation of the StrategyQA dataset ☆22 Updated last year
- The most modern LLM evaluation toolkit ☆60 Updated this week
- ☆62 Updated last month
- ☆35 Updated last year
- Make running benchmarks simple yet maintainable, again. Currently supports only Korean-based cross-encoders. ☆18 Updated this week
- "Learning-based One-line intelligence Owner Network Connectivity Tool"☆15Updated 2 years ago
- 42dot LLM consists of a pre-trained language model, 42dot LLM-PLM, and a fine-tuned model, 42dot LLM-SFT, which is trained to respond to …☆131Updated last year
- ☆41Updated last year
- Google's Conceptual Captions Dataset translated into Korean☆22Updated 2 years ago
- KoTAN: Korean Translation and Augmentation with fine-tuned NLLB☆23Updated last year