KamitaniLab / EmotionVideoNeuralRepresentation
Data and code for reproducing results of Horikawa, Cowen, Keltner, and Kamitani (2020), "The neural representation of visually evoked emotion is high-dimensional, categorical, and distributed across transmodal brain regions," iScience (https://www.cell.com/iscience/fulltext/S2589-0042(20)30245-5).
☆22 · Updated last year
Alternatives and similar repositories for EmotionVideoNeuralRepresentation
Users interested in EmotionVideoNeuralRepresentation are comparing it to the libraries listed below.
- Use DNNs to study the brain and use brain measurements to evaluate DNNs ☆45 · Updated last year
- Voxelwise Encoding Model tutorials from the Gallant lab. ☆91 · Updated 3 weeks ago
- Starting code for the Algonauts 2021 challenge ☆44 · Updated 4 years ago
- Starter code for the BOLDMoments fMRI video dataset ☆18 · Updated last year
- Representational Similarity Analysis on MEG and EEG data (a minimal RSA sketch follows this list) ☆77 · Updated this week
- A Python package for predicting group-level fMRI responses to visual stimuli using deep neural networks ☆13 · Updated 9 months ago
- Code related to analyzing the Natural Scenes Dataset ☆47 · Updated last year
- Python package to conduct feature-reweighted representational similarity analysis. ☆29 · Updated 2 years ago
- A toolbox for accurate single-trial estimates in fMRI time-series data
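Several of the repositories above center on representational similarity analysis (RSA). For orientation only, below is a minimal, generic RSA sketch in Python, assuming NumPy and SciPy are available; the function and variable names are illustrative and are not the API of any package listed here.

```python
# Minimal sketch of classic representational similarity analysis (RSA).
# Names here (rdm, rsa_score) are illustrative, not taken from any listed library.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(patterns):
    """Representational dissimilarity matrix in condensed form.

    patterns: (n_conditions, n_features) array, e.g. voxel responses per stimulus.
    Dissimilarity between conditions is 1 - Pearson correlation.
    """
    return pdist(patterns, metric="correlation")

def rsa_score(brain_patterns, model_patterns):
    """Spearman correlation between brain-derived and model-derived RDMs."""
    rho, _ = spearmanr(rdm(brain_patterns), rdm(model_patterns))
    return rho

# Toy usage: 10 stimuli, 50 voxels vs. 20 model features (random data).
rng = np.random.default_rng(0)
brain = rng.standard_normal((10, 50))
model = rng.standard_normal((10, 20))
print(f"RSA (Spearman rho): {rsa_score(brain, model):.3f}")
```

Real analyses (e.g. feature-reweighted RSA) add cross-validation and fitted weights on the model features; see the packages listed above for full implementations.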