Nikkei / semantic-shift-stability
Implementation of Semantic Shift Stability (AACL 2022, IC2S2 2023, JNLP)
☆17 · Updated 11 months ago
Alternatives and similar repositories for semantic-shift-stability
Users interested in semantic-shift-stability are comparing it to the libraries listed below.
- ☆17 · Updated 2 years ago
- Repository for JSICK ☆45 · Updated 2 years ago
- ☆13 · Updated 3 years ago
- Code for COLING 2020 Paper ☆13 · Updated last week
- ☆16 · Updated 4 years ago
- Japanese data from the Google UDT 2.0. ☆28 · Updated 2 years ago
- ☆11 · Updated 4 years ago
- Funer is a rule-based Named Entity Recognition tool. ☆22 · Updated 3 years ago
- Japanese name-matching (entity resolution) dataset created from Wikipedia ☆35 · Updated 5 years ago
- Latest version of MedEX/J (Japanese disease name extractor) ☆18 · Updated 3 years ago
- ☆36 · Updated 4 years ago
- Japanese sentence segmentation library for Python ☆73 · Updated 2 years ago
- Japanese Realistic Textual Entailment Corpus (NLP 2020, LREC 2020) ☆77 · Updated 2 years ago
- Use custom tokenizers in spacy-transformers ☆16 · Updated 3 years ago
- Evidence-based Explanation Dataset (AACL-IJCNLP 2020) ☆18 · Updated 4 years ago
- Utility scripts for preprocessing Wikipedia texts for NLP ☆78 · Updated last year
- Implementations of data augmentation methods for Japanese NLP ☆64 · Updated 2 years ago
- JGLUE: Japanese General Language Understanding Evaluation for huggingface datasets ☆12 · Updated 8 months ago
- ☆19 · Updated last year
- ☆29 · Updated 8 months ago
- ☆31 · Updated 7 years ago
- Japanese T5 model ☆116 · Updated 2 months ago
- DIRECT: Direct and Indirect REsponses in Conversational Text Corpus ☆17 · Updated 4 years ago
- Japanese tokenizer for Transformers ☆79 · Updated last year
- Japanese BERT Pretrained Model ☆23 · Updated 4 years ago
- Japanese synonym library ☆55 · Updated 3 years ago
- Kyoto University Web Document Leads Corpus ☆83 · Updated last year
- hottoSNS-BERT: sentence embedding model trained on a large-scale SNS corpus ☆62 · Updated last year
- Code to train a Sentence BERT Japanese model for the Hugging Face Model Hub ☆11 · Updated 4 years ago
- Pre-training Language Models for Japanese ☆50 · Updated 2 years ago