Figura-Labs-Inc / telegraf_nv_export
Ultra-low-overhead NVIDIA GPU telemetry plugin for Telegraf with memory temperature readings.
☆63 · Updated 9 months ago
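For context, a minimal sketch of the general idea behind such a plugin: query NVML for per-GPU metrics (including memory temperature where the hardware exposes it) and emit InfluxDB line protocol that a Telegraf `exec`/`execd` input can consume. This sketch uses `pynvml` purely for illustration; it does not reproduce the actual telegraf_nv_export implementation, and the metric names, the `NVML_FI_DEV_MEMORY_TEMP` field lookup, and the output format are assumptions about how a collector like this might be built.

```python
# Hypothetical sketch only: the real plugin's language, metric names, and
# collection path are not shown here; pynvml and these fields are assumptions.
import pynvml


def collect_line_protocol() -> list[str]:
    """Read per-GPU metrics from NVML and format them as InfluxDB line protocol."""
    lines = []
    pynvml.nvmlInit()
    try:
        for index in range(pynvml.nvmlDeviceGetCount()):
            handle = pynvml.nvmlDeviceGetHandleByIndex(index)
            name = pynvml.nvmlDeviceGetName(handle)
            if isinstance(name, bytes):  # older pynvml versions return bytes
                name = name.decode()
            util = pynvml.nvmlDeviceGetUtilizationRates(handle)
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
            fields = {
                "temperature_gpu": pynvml.nvmlDeviceGetTemperature(
                    handle, pynvml.NVML_TEMPERATURE_GPU),
                "utilization_gpu": util.gpu,
                "utilization_memory": util.memory,
                "memory_used": mem.used,
                "memory_total": mem.total,
            }
            # Memory (HBM) temperature is only exposed through NVML's
            # field-values API, and only on GPUs/drivers that support it.
            try:
                fv = pynvml.nvmlDeviceGetFieldValues(
                    handle, [pynvml.NVML_FI_DEV_MEMORY_TEMP])[0]
                if fv.nvmlReturn == pynvml.NVML_SUCCESS:
                    fields["temperature_memory"] = fv.value.uiVal
            except pynvml.NVMLError:
                pass  # unsupported on this GPU/driver
            tags = f"index={index},name={name.replace(' ', '_')}"
            values = ",".join(f"{k}={v}i" for k, v in fields.items())
            lines.append(f"nvidia_gpu,{tags} {values}")
    finally:
        pynvml.nvmlShutdown()
    return lines


if __name__ == "__main__":
    print("\n".join(collect_line_protocol()))
```

Pointing a Telegraf `[[inputs.exec]]` block (with `data_format = "influx"`) at a script like this is one common way to hook a custom collector into Telegraf without writing a native plugin.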
Alternatives and similar repositories for telegraf_nv_export:
Users interested in telegraf_nv_export are comparing it to the libraries listed below.
- Simple Transformer in Jax ☆136 · Updated 10 months ago
- look how they massacred my boy ☆63 · Updated 6 months ago
- smolLM with Entropix sampler on pytorch ☆151 · Updated 5 months ago
- NanoGPT-speedrunning for the poor T4 enjoyers ☆62 · Updated this week
- MLX port for xjdr's entropix sampler (mimics jax implementation) ☆64 · Updated 5 months ago
- ☆23 · Updated 8 months ago
- Modify Entropy Based Sampling to work with Mac Silicon via MLX ☆50 · Updated 5 months ago
- an open source reproduction of NVIDIA's nGPT (Normalized Transformer with Representation Learning on the Hypersphere) ☆96 · Updated last month
- Just large language models. Hackable, with as little abstraction as possible. Done for my own purposes, feel free to rip. ☆44 · Updated last year
- ☆55 · Updated last month
- smol models are fun too ☆92 · Updated 5 months ago
- ☆38 · Updated 9 months ago
- an implementation of Self-Extend, to expand the context window via grouped attention ☆119 · Updated last year
- compute, storage, and networking infra at home ☆65 · Updated last year
- could we make an ml stack in 100,000 lines of code? ☆42 · Updated 9 months ago
- Fast parallel LLM inference for MLX ☆184 · Updated 9 months ago
- inference code for mixtral-8x7b-32kseqlen ☆99 · Updated last year
- A tree-based prefix cache library that allows rapid creation of looms: hierarchical branching pathways of LLM generations. ☆68 · Updated 2 months ago
- Compiling useful links, papers, benchmarks, ideas, etc. ☆42 · Updated last month
- moondream in zig. ☆63 · Updated 2 weeks ago
- Learning about CUDA by writing PTX code. ☆128 · Updated last year
- Full finetuning of large language models without large memory requirements ☆94 · Updated last year
- Helpers and such for working with Lambda Cloud ☆51 · Updated last year
- Synthetic data derived by templating, few shot prompting, transformations on public domain corpora, and monte carlo tree search. ☆32 · Updated last month
- PTX-Tutorial Written Purely By AIs (Deep Research of OpenAI and Claude 3.7) ☆65 · Updated last month
- Cerule - A Tiny Mighty Vision Model ☆67 · Updated 7 months ago
- ☆97 · Updated 6 months ago
- Turing machines, Rule 110, and A::B reversal using Claude 3 Opus. ☆59 · Updated 11 months ago
- DeMo: Decoupled Momentum Optimization ☆186 · Updated 4 months ago
- Lego for GRPO ☆27 · Updated 3 weeks ago