zbller / Mecari
☆28 · Updated last week
Alternatives and similar repositories for Mecari
Users interested in Mecari are comparing it to the libraries listed below.
- Speech corpus dataset of Aozora Bunko texts with furigana annotations ☆38 · Updated 6 months ago
- Preferred Generation Benchmark ☆85 · Updated last week
- J-Moshi: A Japanese Full-duplex Spoken Dialogue System ☆272 · Updated 3 months ago
- ☆36 · Updated 9 months ago
- Browser-based chat UI for TinySwallow-1.5B that runs without API calls. ☆122 · Updated 2 weeks ago
- [2024 edition] Text classification with BERT ☆29 · Updated last year
- Implementation of "BitNet: Scaling 1-bit Transformers for Large Language Models" in PyTorch ☆20 · Updated last year
- AI-based singing voice synthesis ☆37 · Updated last year
- Python-based chat demo for TinySwallow-1.5B that works completely offline ☆57 · Updated 7 months ago
- AivisSpeech Engine: AI Voice Imitation System - Text to Speech Engine ☆127 · Updated 3 weeks ago
- ☆32 · Updated last year
- The Remdis toolkit: Building advanced real-time multimodal dialogue systems with incremental processing and large language models ☆99 · Updated 3 months ago
- ☆150 · Updated last week
- A repository attempting to build a deep-learning attention model from scratch using PyTorch's low-level APIs. ☆64 · Updated last year
- Inference code for the deep-learning models used inside the VOICEVOX core ☆31 · Updated 5 months ago
- This is a repository for comparing voice changer results and searching datasets and trained models. ☆30 · Updated 2 years ago
- Aivis Voice Model File (.aivm/.aivmx) Utility Library ☆22 · Updated last month
- ☆86 · Updated 2 years ago
- ☆49 · Updated 8 months ago
- 0️⃣1️⃣🤗 BitNet-Transformers: Huggingface Transformers Implementation of "BitNet: Scaling 1-bit Transformers for Large Language Models" i… ☆96 · Updated last year
- Japanese LLaMa experiment ☆54 · Updated 9 months ago
- ☆12 · Updated 6 months ago
- ☆41 · Updated 3 weeks ago
- A command-line tool that uses Gemini API to generate summaries of academic papers. ☆45 · Updated 3 months ago
- ☆16 · Updated last year
- This project uses llama.cpp as an LLM server to perform inference and generate speech using Synthetic voice library ☆22 · Updated last year
- DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE. ☆45 · Updated 2 years ago
- ☆27 · Updated last year
- Mecab + NEologd + Docker + Python3 (see the tokenizer sketch after this list) ☆36 · Updated 3 years ago
- ☆29 · Updated last year
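
For context on the Mecab + NEologd entry above, here is a minimal sketch of what that stack looks like from Python 3, using the mecab-python3 bindings. The dictionary path is an assumption; it varies by installation, so locate yours with `mecab-config --dicdir` or the NEologd install script's output.

```python
# Minimal sketch: tokenizing Japanese text with MeCab plus the NEologd
# dictionary via the mecab-python3 package (pip install mecab-python3).
# NEOLOGD_DIR is an assumed, install-dependent path -- adjust it to
# wherever mecab-ipadic-neologd was installed on your system.
import MeCab

NEOLOGD_DIR = "/usr/lib/mecab/dic/mecab-ipadic-neologd"  # hypothetical path

tagger = MeCab.Tagger(f"-d {NEOLOGD_DIR}")

# parse() returns one line per token: "surface<TAB>pos,pos2,...,reading"
print(tagger.parse("新型コロナウイルスの感染が拡大している。"))
```

The benefit of NEologd over the default IPADIC dictionary is that recent coinages and named entities are kept as single tokens instead of being over-segmented.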