dcmocanu / sparse-evolutionary-artificial-neural-networks
Always sparse. Never dense. But never say never. A sparse-training repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, Sparse Evolutionary Training (SET), which boosts Deep Learning scalability in several respects (e.g., memory and computational efficiency, representational and generalization power).
☆242 · Updated 3 years ago
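The core idea described above, Sparse Evolutionary Training, alternates ordinary training epochs with a prune-and-regrow step: the smallest-magnitude active connections are removed, and an equal number of new connections are grown at random inactive positions, so the total sparsity stays constant. A minimal NumPy sketch of one such step, assuming a single dense-stored weight matrix with an explicit boolean mask (the function name, signature, and regrowth-initialization scale are illustrative assumptions, not the repository's API):

```python
import numpy as np

def set_evolve(weights, mask, zeta=0.3, rng=None):
    """One hypothetical SET prune-and-regrow step (in place).

    Removes the fraction `zeta` of smallest-magnitude active weights,
    then regrows the same number of connections at random inactive
    positions, so the number of active connections is unchanged.
    """
    rng = np.random.default_rng() if rng is None else rng
    active = np.flatnonzero(mask)
    n_prune = int(zeta * active.size)
    # Prune: drop the active connections with the smallest |w|.
    order = np.argsort(np.abs(weights.ravel()[active]))
    pruned = active[order[:n_prune]]
    mask.flat[pruned] = False
    weights.flat[pruned] = 0.0
    # Regrow: activate an equal number of randomly chosen inactive slots.
    inactive = np.flatnonzero(~mask)
    grown = rng.choice(inactive, size=n_prune, replace=False)
    mask.flat[grown] = True
    # Small random re-initialization for new connections (scale assumed).
    weights.flat[grown] = rng.normal(0.0, 0.1, size=n_prune)
    return weights, mask
```

In the actual algorithm this step runs after every training epoch on each sparse layer; the sketch above only illustrates the topology update, not the surrounding training loop.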
Related projects
Alternatives and complementary repositories for sparse-evolutionary-artificial-neural-networks
- ☆70 · Updated 4 years ago
- Naszilla is a Python library for neural architecture search (NAS) ☆304 · Updated last year
- End-to-end training of sparse deep neural networks with little-to-no performance loss ☆317 · Updated last year
- ☆143 · Updated last year
- ☆182 · Updated 3 months ago
- Gradient-based hyperparameter optimization & meta-learning package for TensorFlow ☆187 · Updated 4 years ago
- ☆219 · Updated 3 months ago
- Sparse learning library and sparse momentum resources ☆379 · Updated 2 years ago
- Experiments for the paper "Exponential expressivity in deep neural networks through transient chaos" ☆70 · Updated 8 years ago
- Neural Architecture Search with Bayesian Optimisation and Optimal Transport ☆133 · Updated 5 years ago
- Starter kit for the black-box optimization challenge at NeurIPS 2020 ☆113 · Updated 4 years ago
- Keras implementation of Legendre Memory Units ☆210 · Updated 4 months ago
- Guided Evolutionary Strategies ☆265 · Updated last year
- Implementing Bayes by Backprop ☆182 · Updated 5 years ago
- A collection of Deep Neuroevolution resources and evolutionary algorithms applied in Deep Learning (constantly updated) ☆221 · Updated 3 years ago
- Code for Neural Architecture Search without Training (ICML 2021) ☆460 · Updated 3 years ago
- Train self-modifying neural networks with neuromodulated plasticity ☆76 · Updated 5 years ago
- Neural Turing Machines in PyTorch ☆47 · Updated 2 years ago
- ☆132 · Updated 7 years ago
- Example of "biological" learning for MNIST ☆290 · Updated 2 years ago
- ☆159 · Updated 3 months ago
- Discovering Neural Wirings (https://arxiv.org/abs/1906.00586) ☆138 · Updated 4 years ago
- Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly ☆333 · Updated 4 years ago
- A small library implementing magnitude-based pruning in PyTorch ☆28 · Updated 5 years ago
- Code for experiments in my blog post on the Neural Tangent Kernel: https://eigentales.com/NTK ☆166 · Updated 5 years ago
- pyhessian is a TensorFlow module that can be used to estimate Hessian matrices ☆23 · Updated 3 years ago
- ☆67 · Updated 5 years ago
- Bayesian neural network package ☆138 · Updated 3 years ago
- Code for the ICML 2018 paper "Fast and Scalable Bayesian Deep Learning by Weight-Perturbation in Adam" by Khan, Nielsen, Tangkaratt, Lin, … ☆110 · Updated 5 years ago
- Minimal TensorFlow implementation of the paper "Neural Architecture Search with Reinforcement Learning" presented at ICLR 2017 ☆41 · Updated 6 years ago