RichardYang40148 / mgeval
A toolbox for objective evaluation in symbolic music generation.
☆91 · Updated 3 years ago
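For context, below is a minimal sketch of how such an evaluation toolbox is typically driven. The module layout and call names (core.extract_feature, core.metrics, total_used_pitch) follow mgeval's demo notebook as I understand it and should be treated as assumptions rather than a documented API; the folder path is hypothetical.

```python
# Minimal sketch: compute one absolute metric (distinct pitch count) over a
# folder of generated MIDI files with mgeval. Call names are assumptions based
# on the repo's demo notebook; check the repository for the exact interface.
import glob

import numpy as np
from mgeval import core  # assumed module layout

midi_paths = glob.glob("generated/*.mid")   # hypothetical output folder
metric_set = core.metrics()                 # assumed metrics container

used_pitch = []
for path in midi_paths:
    feature = core.extract_feature(path)    # assumed: parses the file (via pretty_midi)
    used_pitch.append(metric_set.total_used_pitch(feature))

print("mean distinct pitches per piece:", np.mean(used_pitch))
```

The accompanying paper goes a step further and compares per-metric distributions of a generated set against a reference set (via measures such as KL divergence and overlap area); the snippet above only covers the per-set feature extraction step.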
Related projects
Alternatives and complementary repositories for mgeval
- Algorithm and data for the paper "Automatic Detection of Hierarchical Structure and Influence of Structure on Melody, Harmony and Rhythm in P…" ☆85 · Updated 2 years ago
- Code accompanying the ISMIR 2020 paper "Music FaderNets: Controllable Music Generation Based On High-Level Features via Low-Level Feature M…" ☆51 · Updated 4 years ago
- Chord-Conditioned Melody Transformer ☆36 · Updated 3 years ago
- Repository for the paper: Wang et al., "Learning Interpretable Representation for Controllable Polyphonic Music Generation", ISMIR 2020. ☆41 · Updated 7 months ago
- Controlling an LSTM to generate music with a given sentiment (positive or negative). ☆37 · Updated 2 years ago
- A chord identifier and harmonizer for MIDI files ☆91 · Updated 3 years ago
- Repository for the paper: Wang et al., "PianoTree VAE: Structured Representation Learning for Polyphonic Music", ISMIR 2020. ☆41 · Updated 4 years ago
- Python MIDI track classifier and tonal tension calculation based on spiral array theory ☆101 · Updated 5 months ago
- MIDI demos from the paper "Deep Music Analogy via Latent Representation Disentanglement" ☆41 · Updated 5 months ago
- Symbolic musical datasets ☆128 · Updated 4 years ago
- Lead sheet datasets in various formats ☆112 · Updated 3 years ago
- The official implementation of Theme Transformer, a theme-based music generation model (IEEE TMM) ☆120 · Updated last year
- Z. Wang & G. Xia, "MuseBERT: Pre-training of Music Representation for Music Understanding and Controllable Generation", ISMIR 2021 ☆44 · Updated 3 years ago
- Performance MIDI to Score (PM2S) ☆57 · Updated last month
- Code for the ISMIR 2021 tutorial "Programming MIR Baselines from Scratch: Three Case Studies" ☆29 · Updated 3 years ago
- Code for BebopNet: Deep Neural Models for Personalized Jazz Improvisations ☆68 · Updated 4 years ago
- Start-to-finish tutorial for interactive music co-creation in PyTorch and TensorFlow.js ☆104 · Updated 3 years ago
- Generates multi-instrument symbolic music (MIDI), based on user-provided emotions from the valence-arousal plane. ☆59 · Updated 6 months ago
- Repository for paper templates in ISMIR Proceedings ☆61 · Updated 4 months ago
- A toolkit for generating datasets of MIDI files that have been degraded to be 'un-musical'. ☆38 · Updated 3 years ago
- ReconVAT: a semi-supervised automatic music transcription (AMT) model ☆33 · Updated 10 months ago
- A toolkit for working with piano rolls ☆136 · Updated last year
- Companion code for the ISMIR 2017 paper "Deep Salience Representations for F0 Estimation in Polyphonic Music" ☆84 · Updated 4 years ago
- Experiments for "Invariances and Data Augmentation for Supervised Music Transcription" ☆31 · Updated 4 years ago
- A minimal JukeMIR branch for feature extraction. ☆32 · Updated 2 years ago
- A project to synthesize massive amounts of multitrack audio data from MIDI. ☆56 · Updated 4 years ago
- Music Transformer sequence generation in PyTorch ☆102 · Updated 4 years ago