# GRUMIDI

A GRU[^1]-based RNN[^2] for rhythmic pattern generation. The RNN model is a char-rnn that is trained on an input MIDI[^3] file encoded as a sequence of unit vectors.
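For illustration only, a minimal GRU-based next-step predictor in the Wolfram Language might look like the sketch below; the hidden size (128) and the encoding dimension `n` are placeholder assumptions, not the repository's actual configuration.

```wl
(* Hypothetical sketch of a GRU-based next-vector predictor,
   where n is the number of distinct notes (dimension of the unit vectors). *)
n = 12;
net = NetChain[{
    GatedRecurrentLayer[128],  (* GRU layer scanning the input sequence *)
    SequenceLastLayer[],       (* keep only the final state *)
    LinearLayer[n],            (* one score per note class *)
    SoftmaxLayer[]             (* probabilities for the next vector *)
   },
   "Input" -> {"Varying", n}   (* variable-length sequence of n-dimensional vectors *)
  ];
(* Training would pair windows of the encoded MIDI with the vector that
   follows each window, e.g. trained = NetTrain[net, examples]. *)
```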

## Prerequisites

- WolframKernel
- WolframScript

Run `$ wolframscript -configure` and set the variable `WOLFRAMSCRIPT_KERNELPATH` to your local WolframKernel address[^4].

## Usage

1. Run `$ wolframscript -f encodeAndTrain.wl`

   Type the input filename (`*.mid`) when prompted.

   The trained net and decoding parameters are saved in `data/`.

2. Run `$ wolframscript -f generateAndDecode.wl`

   The generated `*.mid` is saved in `data/`.

## Discussion

In general, a MIDI file is not defined on a time grid; event times may be specified with machine-precision values. The first script takes care of time quantization by fitting every MIDI event onto a time grid whose resolution equals the minimum distance between two consecutive events found in the input MIDI file. The generated MIDI inherits this time quantization.
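As a rough sketch of that quantization step (assuming `times` holds the raw event times extracted from the input MIDI; the script's actual implementation may differ):

```wl
(* Hypothetical sketch: snap event times to a grid whose resolution is the
   minimum distance between two consecutive events. *)
times = {0., 0.2501, 0.4999, 0.7502, 1.0003};   (* raw, machine-precision event times *)
resolution = Min[Differences[Sort[times]]];     (* smallest consecutive gap *)
quantized = Round[times, resolution]            (* every event now sits on the grid *)
```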

The dimension of the unit vectors is equal to the number of different "notes" found in the input MIDI, e.g. the chromatic scale would be encoded with 12-dimensional unit vectors. Polyphony is encoded by vector addition of simultaneous events.
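As a concrete sketch of that encoding (the note names and the helper `encode` are illustrative, not the script's internal representation):

```wl
(* Hypothetical sketch: one-hot ("unit vector") encoding of notes, with
   simultaneous notes combined by vector addition. *)
notes = {"C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"};
n = Length[notes];   (* 12 for the chromatic scale *)
encode[note_] := UnitVector[n, First[FirstPosition[notes, note]]];
(* A C major triad at a single time step is the sum of three unit vectors: *)
encode["C"] + encode["E"] + encode["G"]
(* -> {1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0} *)
```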

Similarly to LSTMetallica, the encoded input MIDI is riffled with a "BAR" marker every 16 unit vectors to segment measures. These "BAR" markers are deleted once the neural net output is decoded back to MIDI format.
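A sketch of that segmentation step (assuming `vectors` is the encoded sequence; the repository may insert and remove the marker differently):

```wl
(* Hypothetical sketch: insert a "BAR" marker every 16 vectors to segment
   measures, then strip the markers when decoding the network output. *)
vectors = Table[UnitVector[12, RandomInteger[{1, 12}]], 64];    (* dummy encoded MIDI *)
withBars = Flatten[Riffle[Partition[vectors, 16], "BAR"], 1];   (* ..., v16, "BAR", v17, ... *)
decoded = DeleteCases[withBars, "BAR"];                         (* markers removed before MIDI export *)
```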


[^1]: Gated Recurrent Unit
[^2]: Recurrent Neural Network
[^3]: Musical Instrument Digital Interface
[^4]: Full address or local address.