Well, the idea is to transform MIDI into NuPIC input. How we do that is the question. I've been thinking about this a lot lately, and I think the first step would be to take one MIDI track (one instrument in a MIDI song) and transform it into scalar values that we could feed into a NuPIC model. Ideally, we'd create a proper MIDI encoder, though I'm not sure whether MIDI files are set up for streaming (I assume MIDI streams well).
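To make the "one track into scalar values" step concrete, here's a rough sketch in Python. It is not NuPIC's actual encoder; it assumes the track has already been parsed into `(tick, note_number)` pairs (real parsing would use a MIDI library), and the fixed sampling interval and 0-for-silence convention are my assumptions:

```python
# Sketch: turn one parsed MIDI track into a scalar stream for a NuPIC model.
# Assumes events are already extracted as (tick, note_number) pairs, sorted
# by tick, with note_number in the MIDI range 0-127.

def track_to_scalars(events, ticks_per_step=480):
    """Sample the most recently struck note at a fixed tick interval.

    events: list of (tick, note_number) pairs, sorted by tick.
    Returns one scalar per step: the current note, or 0 for silence.
    """
    if not events:
        return []
    end = events[-1][0]
    scalars = []
    current = 0  # 0 stands in for "no note yet"
    i = 0
    for tick in range(0, end + 1, ticks_per_step):
        # Advance past every event that has occurred by this tick.
        while i < len(events) and events[i][0] <= tick:
            current = events[i][1]
            i += 1
        scalars.append(current)
    return scalars


# Example: a C-major arpeggio, one note per beat (480 ticks per beat).
print(track_to_scalars([(0, 60), (480, 64), (960, 67)]))  # [60, 64, 67]
```

The resulting list could be fed, one value per time step, into a model using a standard scalar encoder.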
For instance, a MIDI song might have a bunch of different tracks, one per instrument, and we could train a model on each track. If I could get each track input into a different NuPIC model, I might be able to identify when the song moves from verse to chorus and back, when a new refrain is introduced, when there is a key change or time change, etc.

---------
Matt Taylor
OS Community Flag-Bearer
Numenta

On Fri, Mar 20, 2015 at 5:53 AM, Chris Albertson <[email protected]> wrote:
> Use the language you know best, whatever that is.
>
> No, those two things are different. The first does MIDI and the second
> does audio. MIDI does NOT contain any audio information. If they were to
> interact, then you'd need something to produce sound from MIDI, which we
> call a "virtual instrument".
>
> Sounds like you might be re-inventing wheels. What is the "big picture"?
>
> On Fri, Mar 20, 2015 at 4:55 AM, Richard Crowder <[email protected]> wrote:
>>
>> I have MIDI equipment and SW, but I haven't dealt with MIDI parsing for
>> a long time. Which language, assuming Python?
>>
>> Of interest, I had wondered whether
>> https://github.com/abudaan/MIDIBridge could interact with
>> https://github.com/bbcrd/peaks.js and JHTM ?
>>
>> On Thu, Mar 12, 2015 at 3:11 PM, Matthew Taylor <[email protected]> wrote:
>>>
>>> Has anyone worked with MIDI before?
>>>
>>> http://www.midi.org/techspecs/midispec.php
>>>
>>> ---------
>>> Matt Taylor
>>> OS Community Flag-Bearer
>>> Numenta
>>>
>>
>
>
> --
>
> Chris Albertson
> Redondo Beach, California
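The one-model-per-track idea above would first need the song's events routed into separate per-track streams. A minimal sketch, assuming the song has already been parsed into `(track_name, tick, note)` tuples (the tuple shape and track names here are illustrative, not from any particular MIDI library):

```python
from collections import defaultdict

# Sketch: route events from a parsed multi-track MIDI song into one ordered
# stream per track, so each stream can feed its own NuPIC model. The
# (track_name, tick, note) shape is an assumption about the parsing step.

def split_by_track(events):
    streams = defaultdict(list)
    for track, tick, note in events:
        streams[track].append((tick, note))
    return dict(streams)


# Hypothetical two-track song fragment: piano melody over a bass line.
song = [
    ("piano", 0, 60), ("bass", 0, 36),
    ("piano", 480, 64), ("bass", 480, 43),
]
print(split_by_track(song))
# {'piano': [(0, 60), (480, 64)], 'bass': [(0, 36), (480, 43)]}
```

Each resulting stream could then be scalar-encoded and fed to its own model, and anomaly scores across the models compared to spot structural transitions like verse-to-chorus.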
