MilesCranmer / pysr_scaling_laws
You should use PySR to find scaling laws. Here's an example.
☆33 · Updated last year
Alternatives and similar repositories for pysr_scaling_laws:
Users interested in pysr_scaling_laws are comparing it to the libraries listed below.
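To illustrate the repository's premise: a "scaling law" is typically a power-law relationship such as L(N) = a · N^b, and PySR's symbolic regression can recover that functional form directly from data. The sketch below (a hypothetical example using synthetic data, not code from this repository) shows the target relationship being recovered with a plain log-log least-squares fit; PySR's `PySRRegressor().fit(X, y)` would search for the same form symbolically without assuming it in advance.

```python
import numpy as np

# Hypothetical example: recover a power-law scaling law L(N) = a * N**b
# from noisy samples. PySR would discover this form symbolically; here we
# only illustrate the relationship with a log-log least-squares fit.
rng = np.random.default_rng(0)
N = np.logspace(2, 6, 20)                                   # e.g. model sizes
L = 3.1 * N**-0.42 * np.exp(rng.normal(0, 0.01, N.shape))   # noisy loss values

# In log space the power law is linear: log L = log a + b * log N
b, log_a = np.polyfit(np.log(N), np.log(L), 1)
print(f"L(N) ~ {np.exp(log_a):.2f} * N^{b:.2f}")
```

The log-log trick only works when the power-law form is assumed up front; the point of using PySR instead is that the symbolic search can also surface non-power-law candidates if the data warrants them.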
- Fine-grained, dynamic control of neural network topology in JAX. ☆21 · Updated last year
- Supplementary code for the paper "Stationary Kernels and Gaussian Processes on Lie Groups and their Homogeneous Spaces" ☆42 · Updated last year
- ☆41 · Updated 3 months ago
- ☆60 · Updated 3 years ago
- PyTorch implementation of the SuperPolyak subgradient method. ☆43 · Updated 2 years ago
- Turning SymPy expressions into JAX functions ☆44 · Updated 4 years ago
- Visualize neural networks using TikZ in Julia ☆13 · Updated last month
- ☆28 · Updated 3 years ago
- Automated SMC with Probabilistic Program Proposals, for the Gen PPL. ☆22 · Updated last year
- ☆24 · Updated last year
- ☆18 · Updated 11 months ago
- Turn jitted JAX functions back into Python source code ☆22 · Updated 3 months ago
- A simple statistical distribution library in JAX ☆16 · Updated 11 months ago
- Visualize, create, and operate on pytrees in the most intuitive way possible. ☆44 · Updated 2 months ago
- Code for the paper https://arxiv.org/abs/2306.07961 ☆53 · Updated 9 months ago
- Stochastic trace estimation using JAX ☆14 · Updated last week
- Exponential families for JAX ☆63 · Updated this week
- ☆33 · Updated 4 years ago
- Stencil computations in JAX ☆70 · Updated last year
- ☆15 · Updated 4 years ago
- Dive into JAX, Flax, XLA, and C++ ☆31 · Updated 4 years ago
- A high-performance I/O library for deep learning in Julia, based on the PyTorch WebDataset library ☆12 · Updated 3 months ago
- BibTeX entries for various Python science and machine learning software ☆32 · Updated 2 years ago
- Proof of concept for global switching between NumPy/JAX/PyTorch in a library. ☆18 · Updated 9 months ago
- Website for the book "The Elements of Differentiable Programming". ☆13 · Updated 6 months ago
- Engineering the state of RNN language models (Mamba, RWKV, etc.) ☆32 · Updated 9 months ago
- "Parameter origami": folding and unfolding collections of parameters for optimization and sensitivity analysis. ☆14 · Updated last year
- Why multiple dispatch lets you write composable code ☆40 · Updated 4 years ago
- Source-to-source debuggable derivatives in pure Python ☆15 · Updated last year
- ☆21 · Updated last year