rh-cmput275 / w25-cmput275
W25 CMPUT275 Repository
☆19 · Updated 3 months ago
Alternatives and similar repositories for w25-cmput275
Users interested in w25-cmput275 are comparing it to the repositories listed below.
- ☆241 · Updated 3 months ago
- Fine-tuning LLMs to resist hallucination in Retrieval-Augmented Generation by training on mixed factual and fictitious contexts. ☆44 · Updated last month
- llama.cpp fork with additional SOTA quants and improved performance ☆800 · Updated this week
- A C++ implementation of a Multilayer Perceptron (MLP) neural network using Eigen, supporting multiple activation and loss functions, mini… ☆137 · Updated 2 months ago
- OpenAlpha_Evolve is an open-source Python framework inspired by the groundbreaking research on autonomous coding agents like DeepMind's A… ☆842 · Updated last month
- Code and explanation for all the models/technologies used in the ComputerPhile MikeBot video ☆211 · Updated last month
- Model swapping for llama.cpp (or any local OpenAI compatible server) ☆1,048 · Updated last week
- LM Studio Python SDK ☆562 · Updated this week
- Configurable response server for Project J.A.I.son ☆285 · Updated last month
- Fully local program to make your own AI waifu! Vtuber model, voice, etc. Emphasis on personal use and companionship. ☆326 · Updated last week
- Kyutai's Speech-To-Text and Text-To-Speech models based on the Delayed Streams Modeling framework. ☆1,960 · Updated this week
- Eidos – A Self-Growing AI Agent with Long-Term Memory and Environmental Awareness ☆19 · Updated 2 weeks ago
- A minimalistic framework for transparently training language models and storing comprehensive checkpoints for in-depth learning dynamics … ☆286 · Updated last month
- Python client library for the Sesame AI API, enabling voice conversations with AI characters like Miles and Maya. ☆83 · Updated 4 months ago
- The AI toolkit for the AI developer ☆815 · Updated this week
- An optimized quantization and inference library for running LLMs locally on modern consumer-class GPUs ☆441 · Updated this week
- The official API server for Exllama. OAI compatible, lightweight, and fast. (See the client sketch after this list.) ☆1,008 · Updated this week
- A wireless mod for the TI-84 (not CE). ☆1,166 · Updated 7 months ago
- A PowerShell script anti-virus evasion tool ☆1,145 · Updated 2 years ago
- A game built with Pygame ☆9 · Updated 2 years ago
- ☆151 · Updated 3 months ago
- ☆10 · Updated 2 years ago
- The Pain and Agony Archive ☆24 · Updated last week
- Run Orpheus 3B Locally With LM Studio ☆438 · Updated 4 months ago
- Run LLM Agents on Ryzen AI PCs in Minutes ☆462 · Updated 3 weeks ago
- Understand the nature of malicious software with practical examples in Python. ☆2,084 · Updated last year
- Sesame CSM 1B Voice Cloning ☆316 · Updated 4 months ago
- Implements harmful/harmless refusal removal using pure HF Transformers ☆960 · Updated last year
- Documentation on setting up a local LLM server on Debian from scratch, using Ollama/llama.cpp/vLLM, Open WebUI, Kokoro FastAPI, and Comfy… ☆497 · Updated 2 months ago
- Learn electrical engineering alongside me ☆11 · Updated 2 years ago
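Several entries above (llama-swap, the Exllama API server, and the Debian local-LLM setup guide) serve models behind an OpenAI-compatible HTTP endpoint, so a single client snippet works against any of them. The sketch below is a minimal, hedged example using the official `openai` Python package; the base URL, port, API key, and model name are placeholders for whatever your local server actually exposes, not values taken from any of these projects.

```python
# Minimal sketch: querying a local OpenAI-compatible server (llama-swap,
# the Exllama API server, llama.cpp's built-in server, etc.).
# The endpoint and model name below are assumptions -- adjust them to your setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # hypothetical local endpoint
    api_key="not-needed-locally",         # most local servers ignore the key
)

response = client.chat.completions.create(
    model="my-local-model",               # placeholder model identifier
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "What does 'OpenAI-compatible server' mean?"},
    ],
    max_tokens=128,
)

print(response.choices[0].message.content)
```

Because the wire format is the standard chat-completions API, switching between the servers listed above usually only means changing `base_url` and `model`.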