arsedler9 / nlb-lightning
PyTorch Lightning utilities that make it easier to train and evaluate deep models for the Neural Latents Benchmark.
☆8 · Updated 2 years ago
Alternatives and similar repositories for nlb-lightning
Users interested in nlb-lightning are comparing it to the libraries listed below.
- PyTorch-based library for various kinds of representational-similarity analysis ☆24 · Updated 11 months ago
- NeuroTask: A Benchmark Dataset for Multi-Task Neural Analysis ☆12 · Updated last month
- ☆77 · Updated 3 years ago
- A TensorFlow 2.0 implementation of Latent Factor Analysis via Dynamical Systems (LFADS) and AutoLFADS ☆21 · Updated last year
- Gaussian Process Factor Analysis with Dynamical Structure ☆16 · Updated 4 years ago
- Bayesian learning and inference for state space models (SSMs) using Google Research's JAX as a backend ☆59 · Updated 11 months ago
- PyTorch implementation of LFADS and a hierarchical extension ☆26 · Updated 3 years ago
- Code accompanying "Inferring stochastic low-rank RNNs from neural data" (@matthijspals) ☆20 · Updated last month
- ☆48 · Updated 3 years ago
- ☆23 · Updated last year
- Exercises and examples for the latent dynamics workshop ☆17 · Updated last year
- ☆12 · Updated last year
- ☆72 · Updated 2 years ago
- ☆20 · Updated 3 months ago
- PyTorch implementation of LFADS for a demo at the CAN workshop ☆19 · Updated 5 years ago
- Backend for Real-time Asynchronous Neural Decoding (BRAND) ☆32 · Updated 2 months ago
- IBL foundation model ☆22 · Updated 6 months ago
- Code for Galgali et al., 2023 ☆13 · Updated 2 years ago
- ☆20 · Updated last year
- ☆13 · Updated 2 years ago
- Notebooks from the workshop tutorial implementing and discussing a range of generative models commonly used in neuroscience ☆38 · Updated 2 years ago
- The official re-implementation of the NeurIPS 2021 paper "Targeted Neural Dynamical Modeling" ☆9 · Updated 3 years ago
- Modeling cortical visual topography with interactive topographic network (ITN) models ☆15 · Updated 3 years ago
- Tutorial on how to use BRAND ☆12 · Updated last year
- Code package for the Tensor-Maximum-Entropy (TME) method, which generates random surrogate data that preserves a specified … ☆18 · Updated 7 years ago
- Some methods for comparing network representations in deep learning and neuroscience ☆138 · Updated 10 months ago
- ☆23 · Updated this week
- Fitting low-rank RNNs to neural trajectories (LINT method) ☆16 · Updated 2 months ago
- YASS: Yet Another Spike Sorter ☆66 · Updated 2 years ago
- Neyman-Scott point process model to identify sequential firing patterns in high-dimensional spike trains ☆66 · Updated last year