YuanTony / chemistry-assistant ☆13 · Updated last year
Alternatives and similar repositories for chemistry-assistant
Users interested in chemistry-assistant are comparing it to the repositories listed below.
- maker100-robotics-iot-machine-learning-curriculum ☆10 · Updated 3 weeks ago
- Code for fine-tuning LLMs with GRPO specifically for Rust programming, using cargo as feedback ☆104 · Updated 6 months ago
- ☆86 · Updated 3 weeks ago
- Rust port of llm.c by @karpathy ☆43 · Updated last year
- RoBERTa question answering using MLX. ☆24 · Updated last year
- xet client tech, used in huggingface_hub ☆236 · Updated this week
- Educational framework exploring ergonomic, lightweight multi-agent orchestration. Modified to use a local Ollama endpoint ☆49 · Updated 11 months ago
- ☆12 · Updated last year
- Implementation of Nougat that focuses on processing PDFs locally. ☆82 · Updated 8 months ago
- The easiest & fastest way to run customized and fine-tuned LLMs locally or on the edge ☆20 · Updated 6 months ago
- First token cutoff sampling inference example ☆31 · Updated last year
- A model manager for the Transformers library, implementing S3 and IPFS downloads ☆19 · Updated 5 months ago
- Prompt Declaration Language (PDL) is a declarative prompt programming language. ☆232 · Updated this week
- Distributed inference for MLX LLMs ☆95 · Updated last year
- Public repository containing METR's DVC pipeline for eval data analysis ☆110 · Updated 5 months ago
- Contains supporting materials for developer relations blog posts, videos, and workshops ☆45 · Updated this week
- LLM training in simple, raw C/CUDA, migrated into Rust ☆51 · Updated 6 months ago
- LLM as Interpreter for Natural Language Programming, Pseudo-code Programming and Flow Programming of AI Agents ☆41 · Updated last year
- Route LLM requests to the best model for the task at hand. ☆107 · Updated this week
- Inference Llama 2 in one file of zero-dependency, zero-unsafe Rust ☆39 · Updated 2 years ago
- Inference examples ☆58 · Updated last week
- MLX support for the Open Neural Network Exchange (ONNX) ☆59 · Updated last year
- A tree-based prefix cache library that allows rapid creation of looms: hierarchical branching pathways of LLM generations. ☆74 · Updated 7 months ago
- A lightweight LLaMA.cpp HTTP server Docker image based on Alpine Linux. ☆29 · Updated last month
- DSPy in action with open-source LLMs. ☆94 · Updated last year
- Ollama-like CLI tool for MLX models on Hugging Face (pull, rm, list, show, serve, etc.) ☆103 · Updated last week
- A RAG API server written in Rust following OpenAI specs ☆56 · Updated 5 months ago
- ☆119 · Updated last week
- ☆11 · Updated 4 months ago
- TiDB Vector SDK for Python, including code examples. Join our Discord: https://discord.gg/XzSW23Jg9p ☆58 · Updated 2 months ago