SqueezeBits / owlite
OwLite is a low-code compression toolkit for AI models.
☆45 · Updated 3 weeks ago
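For orientation, the kind of step a low-code compression toolkit automates can be sketched in plain PyTorch. The snippet below is not the OwLite API (none of its entry points appear on this page); it is a minimal, assumed illustration using standard `torch.ao.quantization` dynamic quantization to show what compressing a PyTorch model amounts to.

```python
# Minimal sketch (NOT the OwLite API): plain PyTorch dynamic quantization,
# standing in for the kind of compression step a low-code toolkit automates.
import torch
import torch.nn as nn

# Small stand-in model in place of a real network.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10)).eval()

# Convert Linear weights to int8; activations are quantized on the fly at runtime.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# The compressed model is called exactly like the original.
x = torch.randn(1, 128)
print(quantized(x).shape)  # torch.Size([1, 10])
```

Toolkits like OwLite aim to wrap quantization, calibration, and benchmarking workflows of this kind behind a few calls, which is why the repositories listed below are mostly compression, inference-acceleration, and evaluation projects.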
Alternatives and similar repositories for owlite
Users interested in owlite are comparing it to the libraries listed below.
- A performance library for machine learning applications. ☆184 · Updated last year
- ☆88 · Updated last year
- ☆56 · Updated 2 years ago
- OwLite Examples repository offers illustrative example codes to help users seamlessly compress PyTorch deep learning models and transform… ☆10 · Updated 8 months ago
- ☆53 · Updated 6 months ago
- PyTorch CoreSIG ☆55 · Updated 5 months ago
- ☆100 · Updated last year
- Reproduction of Vision Transformer in Tensorflow2. Train from scratch and Finetune. ☆48 · Updated 3 years ago
- Official Github repository for the SIGCOMM '24 paper "Accelerating Model Training in Multi-cluster Environments with Consumer-grade GPUs" ☆70 · Updated 10 months ago
- FriendliAI Model Hub ☆91 · Updated 2 years ago
- 42dot LLM consists of a pre-trained language model, 42dot LLM-PLM, and a fine-tuned model, 42dot LLM-SFT, which is trained to respond to … ☆131 · Updated last year
- Study parallel programming - CUDA, OpenMP, MPI, Pthread ☆57 · Updated 2 years ago
- ☆43 · Updated last year
- Official repository for EXAONE built by LG AI Research ☆183 · Updated 10 months ago
- A library for training, compressing and deploying computer vision models (including ViT) with edge devices ☆68 · Updated 2 weeks ago
- A multi-domain reasoning benchmark for Korean language models ☆190 · Updated 7 months ago
- Benchmark in Korean Context ☆131 · Updated last year
- ☆47 · Updated last year
- ☆68 · Updated last year
- Review papers of NLP, mainly LLM. ☆32 · Updated last year
- The most modern LLM evaluation toolkit ☆59 · Updated last week
- Official repository for KoMT-Bench built by LG AI Research ☆63 · Updated 10 months ago
- Ditto is an open-source framework that enables direct conversion of HuggingFace PreTrainedModels into TensorRT-LLM engines. ☆41 · Updated this week
- ☆15 · Updated this week
- ☆106 · Updated 2 years ago
- ☆25 · Updated 4 months ago
- QUICK: Quantization-aware Interleaving and Conflict-free Kernel for efficient LLM inference ☆118 · Updated last year
- "A survey of Transformer" paper study 👩🏻💻🧑🏻💻 KoreaUniv. DSBA Lab☆188Updated 3 years ago
- 🔍 우리가 읽을 논문을 찾아서, Cite.GG☆89Updated 11 months ago
- [KO-Platy🥮] Korean-Open-platypus를 활용하여 llama-2-ko를 fine-tuning한 KO-platypus model☆75Updated last year