amirabbasasadi / RockyML
⛰️ RockyML - A High-Performance Scientific Computing Framework for Non-smooth Machine Learning Problems
☆20 · Updated 2 years ago
Alternatives and similar repositories for RockyML
Users interested in RockyML are comparing it to the libraries listed below:
- A Gentle Principled Introduction to Deep Reinforcement Learning ☆19 · Updated 9 months ago
- This repository hosts code for converting the original MLP Mixer models (JAX) to TensorFlow. ☆15 · Updated 4 years ago
- This repository provides a Colab Notebook that shows how to use Spatial Transformer Networks inside CNNs in Keras. ☆37 · Updated 3 years ago
- Companion code for a tutorial on using Hydra. ☆32 · Updated 4 years ago
- A tutorial on JAX (https://github.com/google/jax/) ☆47 · Updated 7 years ago
- List of awesome JAX resources ☆13 · Updated 3 years ago
- Simplified implementation of a UMAP-like dimensionality reduction algorithm ☆53 · Updated last year
- A neural network hyperparameter tuner ☆30 · Updated 2 years ago
- Code for our ICLR Trustworthy ML 2020 workshop paper "Improved Image Wasserstein Attacks and Defenses" ☆14 · Updated 5 years ago
- Keras-like APIs for the JAX framework ☆50 · Updated 2 years ago
- Source-to-Source Debuggable Derivatives in Pure Python ☆15 · Updated 2 years ago
- This repository hosts the code to port NumPy model weights of BiT-ResNets to TensorFlow SavedModel format. ☆14 · Updated 4 years ago
- Numerically Solving Parametric Families of High-Dimensional Kolmogorov Partial Differential Equations via Deep Learning (NeurIPS 2020) ☆22 · Updated 3 years ago
- ☆36 · Updated 3 years ago
- You should use PySR to find scaling laws. Here's an example. ☆33 · Updated 2 years ago
- This repository hosts code for converting the original Vision Transformer models (JAX) to TensorFlow. ☆33 · Updated 3 years ago
- Automation tools for Python benchmarking ☆19 · Updated 6 years ago
- Jax SSM Library ☆48 · Updated 3 years ago
- A lightweight transformer library for PyTorch ☆72 · Updated 4 years ago
- Machine learning model performance metrics & charts with confidence intervals, optimized with numba to be fast ☆16 · Updated 4 years ago
- A collection of optimizers, some arcane, others well known, for Flax. ☆29 · Updated 4 years ago
- PyHopper is a hyperparameter optimizer, made specifically for high-dimensional problems arising in machine learning research. ☆85 · Updated 2 years ago
- Nvidia-contributed CUDA tutorial for Numba ☆265 · Updated 3 years ago
- Massively Parallel and Asynchronous Architecture for Logic-based AI ☆43 · Updated 3 years ago
- ☆37 · Updated 3 years ago
- Implementation of "Analysing Mathematical Reasoning Abilities of Neural Models" ☆30 · Updated 2 years ago
- ☆68 · Updated 10 months ago
- ML/DL Math and Method notes ☆66 · Updated 2 years ago
- Neural Networks for JAX ☆84 · Updated last year
- Fourth place solution to the "OpenVaccine: COVID-19 mRNA Vaccine Degradation Prediction" competition organized by Stanford University and Kaggle ☆21 · Updated 5 years ago