daac-tools / python-vaporetto
🛥 Vaporetto is a fast and lightweight pointwise-prediction-based tokenizer. This is a Python wrapper for Vaporetto.
☆20 · Updated 2 months ago
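A minimal usage sketch of the wrapper, based on the typical API described in the repository README. The names `vaporetto.Vaporetto`, `tokenize`, and `surface()` and the model path are assumptions here, so check the README for the exact constructor arguments and for where to obtain a trained model.

```python
# Minimal sketch: tokenize Japanese text with python-vaporetto.
# Assumptions (verify against the README): the package is importable as
# `vaporetto`, the tokenizer is constructed from raw model bytes, and
# tokens expose their text via `surface()`. The model path is a placeholder.
import vaporetto

with open('path/to/model', 'rb') as f:            # hypothetical model file
    model = f.read()

tokenizer = vaporetto.Vaporetto(model)            # build the tokenizer from model bytes
tokens = tokenizer.tokenize('まぁ社長は火星猫だ')    # pointwise prediction over the sentence

for i in range(len(tokens)):
    print(tokens[i].surface())                    # surface form of each token
```

Note that pre-trained Vaporetto models are generally distributed in compressed form (e.g. zstd), so the bytes may need decompressing before being passed to the constructor; the repository README covers this.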
Alternatives and similar repositories for python-vaporetto
Users interested in python-vaporetto are comparing it to the libraries listed below.
- Japanese BERT Pretrained Model ☆22 · Updated 3 years ago
- Japanese synonym library ☆53 · Updated 3 years ago
- Implementations of data augmentation methods for Japanese NLP ☆64 · Updated 2 years ago
- Funer is a rule-based Named Entity Recognition tool ☆22 · Updated 3 years ago
- Repository for JSICK ☆44 · Updated last year
- Code for a COLING 2020 paper ☆13 · Updated last week
- Japanese tokenizer for Transformers ☆80 · Updated last year
- Japanese Realistic Textual Entailment Corpus (NLP 2020, LREC 2020) ☆76 · Updated last year
- ☆26 · Updated 6 months ago
- AllenNLP integration for Shiba: Japanese CANINE model ☆12 · Updated 3 years ago
- Finding all pairs of similar documents time- and memory-efficiently ☆60 · Updated 2 months ago
- ☆50 · Updated last year
- ☆16 · Updated 3 years ago
- Dialogue corpus built by crawling おーぷん2ちゃんねる (open2ch) ☆96 · Updated 3 years ago
- Python implementation of SWEM (Simple Word-Embedding-based Methods) ☆29 · Updated 2 years ago
- ☆20 · Updated 4 years ago
- ☆19 · Updated 11 months ago
- ☆11 · Updated 2 years ago
- Japanese-BPEEncoder ☆41 · Updated 3 years ago
- Utility scripts for preprocessing Wikipedia texts for NLP ☆77 · Updated last year
- Sentence Embeddings with BERT & XLNet ☆32 · Updated last year
- Evaluation dataset for the honorific (keigo) conversion task ☆21 · Updated 2 years ago
- ☆34 · Updated 5 years ago
- docker for UTH-BERT: https://ai-health.m.u-tokyo.ac.jp/uth-bert ☆14 · Updated 2 years ago
- PyTorch tutorial for M1 students; includes code for building Encoder-Decoder and classification models ☆12 · Updated 2 years ago
- ☆36 · Updated 4 years ago
- Japanese named entity recognition dataset built from Wikipedia ☆138 · Updated last year
- Exploring Japanese SimCSE ☆68 · Updated last year
- 📝 A list of pre-trained BERT models for Japanese with word/subword tokenization + vocabulary construction algorithm information ☆130 · Updated 2 years ago
- Use custom tokenizers in spacy-transformers ☆16 · Updated 2 years ago