hppRC / bert-classification-tutorial-2024
[2024 Edition] Text Classification with BERT (【2024年版】BERTによるテキスト分類)
☆29 · Updated 10 months ago
Alternatives and similar repositories for bert-classification-tutorial-2024
Users interested in bert-classification-tutorial-2024 are comparing it to the repositories listed below.
- Preferred Generation Benchmark ☆82 · Updated last week
- ☆17 · Updated last year
- ☆34 · Updated 5 years ago
- Exploring Japanese SimCSE ☆68 · Updated last year
- ☆50 · Updated last year
- NLP2024 Tutorial 3: Practicing how to build a Japanese large-scale language model - environment setup instructions and source code ☆112 · Updated last year
- Japanese Language Model Financial Evaluation Harness ☆75 · Updated 2 weeks ago
- ☆47 · Updated 5 months ago
- ☆26 · Updated 7 months ago
- ☆55 · Updated 3 months ago
- ☆83 · Updated last year
- Easily turn large English text datasets into Japanese text datasets using open LLMs. ☆20 · Updated 4 months ago
- Mecab + NEologd + Docker + Python3 ☆35 · Updated 3 years ago
- Text classification using LLMs and LoRA ☆97 · Updated last year
- ☆26 · Updated 5 months ago
- A Python tool for automatic evaluation of generated text ☆24 · Updated last month
- ☆51 · Updated 11 months ago
- DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE. ☆44 · Updated 2 years ago
- ☆28 · Updated 2 months ago
- ☆29 · Updated last year
- Training and evaluation scripts for JGLUE, a Japanese language understanding benchmark ☆17 · Updated this week
- Support site for the book 『深層ニューラルネットワークの高速化』 (Accelerating Deep Neural Networks) ☆57 · Updated last week
- NLP 100 Exercise 2025 ☆22 · Updated last month
- The evaluation scripts of JMTEB (Japanese Massive Text Embedding Benchmark) ☆60 · Updated last month
- LLaVA-JP, a Japanese VLM trained with the LLaVA method ☆62 · Updated 11 months ago
- JMultiWOZ: A Large-Scale Japanese Multi-Domain Task-Oriented Dialogue Dataset, LREC-COLING 2024 ☆24 · Updated last year
- ☆93 · Updated this week
- Japanese instruction data ☆24 · Updated last year
- A Japanese translation of the alpaca dataset ☆89 · Updated 2 years ago
- JQaRA: Japanese Question Answering with Retrieval Augmentation - a Japanese Q&A dataset for retrieval-augmented generation (RAG) evaluation ☆30 · Updated 4 months ago