I have written a lot of MIDI programs, but that was a looong, looong
time ago. I wrote those programs around 1990 in C++, on an Atari ST. But
I have started writing C++ again, on an Ubuntu machine (after 15 years
of programming in Pascal on Windows), so I think I can help you with your
problem. My C++ may be a bit rusty, but I assume you are not in that much of a hurry.
I have some questions and remarks about it. First of all, what kind of
signals do you want to feed NuPIC with? I can imagine three levels of signals:
1) low-level MIDI events
MIDI events are messages, like note on (press a key with a certain speed),
note off (release the key), program change (switch to another instrument on
that MIDI channel), control change (like a joystick), pitch-wheel movements,
etc. Many messages have no musical meaning; they are just there to
control the synthesizer.
The disadvantage is that this level does not have much to do with music; it is not what
people hear. The advantage is that you are more flexible. You could hire
live musicians, let them play on MIDI instruments, and feed that to NuPIC.
If you want this level, you could restrict yourself to note on, note off
and program change messages. These are the most important ones.
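To make this level concrete, here is a minimal sketch (not tested against real hardware) of decoding those three message types from raw MIDI bytes. It assumes complete messages without "running status", which a real parser would also have to handle:

```python
# Minimal decoder for the three most important MIDI channel messages.
# Assumes each message arrives as a complete bytes object (no running status).
NOTE_OFF, NOTE_ON, PROGRAM_CHANGE = 0x80, 0x90, 0xC0

def decode_event(data):
    """Decode one MIDI channel message.
    Returns (type, channel, params) or None for unsupported messages."""
    status = data[0]
    kind = status & 0xF0          # high nibble: message type
    channel = status & 0x0F       # low nibble: MIDI channel (0-15)
    if kind == NOTE_ON:
        note, velocity = data[1], data[2]
        # Note-on with velocity 0 conventionally means note-off
        if velocity == 0:
            return ("note_off", channel, (note, 0))
        return ("note_on", channel, (note, velocity))
    if kind == NOTE_OFF:
        return ("note_off", channel, (data[1], data[2]))
    if kind == PROGRAM_CHANGE:
        return ("program_change", channel, (data[1],))
    return None

# 0x90 = note-on on channel 0; note 60 is middle C, velocity 100
print(decode_event(bytes([0x90, 60, 100])))
```

Note the velocity-0 quirk: many instruments send note-on with velocity 0 instead of a real note-off message, so any parser at this level should treat the two as equivalent.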
2) instruments that play music
There is a MIDI standard (General MIDI), and there are many midifiles
following it. If you let Windows play a midifile, it also follows that
standard. The standard says that if you change to program number 1, it
starts to play a piano. If you go to number 69, it plays an oboe. If you go
to number 37, it is a "slap bass 1", etc. Whatever plays on MIDI channel
10 is always a drum, and each note is linked to some drum, like
hi-hat, snare drum, etc. The instruments are fixed.
In that case, a standard midifile could be converted to a list of
records like this: time, MIDI channel, instrument, note on or note off,
velocity (= how loud it is played).
The advantage is that it is closest to what people hear when they listen to
music: just several instruments or drums that play some notes.
The disadvantage is that not all midifiles follow the standard and that you
cannot feed live music to NuPIC (well, maybe with some adaptations it is
possible).
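The conversion to that record list could look something like this sketch. The input format is hypothetical (some parser producing delta-time event tuples); the note number is carried along as well, since without it the records would not say which key was played:

```python
# Sketch: turn a parsed event stream into the record list described above:
# (time, channel, instrument, on/off, note, velocity).
# Input format is assumed: (delta_ticks, kind, channel, data) tuples.

def events_to_records(events):
    program = [0] * 16            # current instrument per channel (GM default: piano)
    time = 0
    records = []
    for delta, kind, channel, data in events:
        time += delta             # MIDI stores delta times; accumulate absolute time
        if kind == "program_change":
            program[channel] = data[0]
        elif kind in ("note_on", "note_off"):
            note, velocity = data
            records.append((time, channel, program[channel], kind, note, velocity))
    return records

events = [
    (0,   "program_change", 0, (68,)),    # oboe: GM program 69, 0-based on the wire
    (0,   "note_on",        0, (60, 90)), # middle C
    (480, "note_off",       0, (60, 0)),
]
print(events_to_records(events))
```

The key point is that "instrument" is not stored on each note in MIDI; you have to remember the last program change per channel, which is what the `program` array does.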
3) sheet music
You could go a step further. You can "translate" a lot of midifiles to
sheet music. That means that the notes have fixed lengths (whole, half,
quarter notes, etc.), and you have to know the tempo and rhythm. That is mainly
the case with classical music (it would be nice to feed all of Bach's
cantatas to NuPIC). A lot of midifiles can be handled that way, but what do
you do with drums?
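The core of that "translation" is quantization: snapping the raw tick durations to the nearest standard note length. A tiny sketch, assuming the common (but not universal) resolution of 480 ticks per quarter note:

```python
# Snap raw tick durations to the nearest standard note length.
# Assumes 480 ticks per quarter note; real midifiles declare their own resolution.
TICKS_PER_QUARTER = 480
NOTE_LENGTHS = {                  # name -> length in ticks
    "whole": 4 * TICKS_PER_QUARTER,
    "half": 2 * TICKS_PER_QUARTER,
    "quarter": TICKS_PER_QUARTER,
    "eighth": TICKS_PER_QUARTER // 2,
    "sixteenth": TICKS_PER_QUARTER // 4,
}

def quantize(duration_ticks):
    """Return the note-length name closest to the played duration."""
    return min(NOTE_LENGTHS, key=lambda name: abs(NOTE_LENGTHS[name] - duration_ticks))

print(quantize(455))   # a slightly short quarter note
```

A real version would also need dotted notes and triplets, and would have to read the tempo and time-signature meta events from the file, which is exactly why this level is harder than the other two.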
Another problem is input and output. I assume that the input will be a
midifile, but what output do you want? It depends on the level. I
could just make a text file with records. Or let the program encode the
midifile. Or (like a MIDI player) the program could send each message at the
right time. But to what? Another program? Should I make a file for
each channel? Etc., etc. What are your ideas about that?
greetings: Jos Theelen
On 2015-03-20 14:52, Matthew Taylor wrote:
Well, the idea is to transform MIDI into NuPIC input. How we do that
is the question. I've been thinking about this a lot lately, and I
think the first step would be to try to take one MIDI track (one
instrument in a MIDI song) and transform it into scalar values that we
could feed into a NuPIC model. Ideally, we'd create a MIDI encoder,
but I'm not sure if MIDI files are set up for streaming or not (I
assume MIDI streams well).
For instance, a MIDI song might have a bunch of different tracks for
each instrument, and we could train models on each track. If I could
get each track input into different NuPIC models, I might be able to
identify when the song moves from verse to chorus and back, when any
new refrain is introduced, when there is a key change or time change,
etc.
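Matt's idea above (one track reduced to scalar values per time step) could be sketched roughly like this. Everything here is an assumption: the note format, the sampling step, and the choice of 0 for silence and the top voice for chords:

```python
# Hypothetical first cut: reduce one track to a stream of scalar values
# (the sounding pitch sampled at fixed tick steps, 0 when silent) that a
# NuPIC scalar encoder could consume.

def track_to_scalars(notes, step=120, end=None):
    """notes: list of (start_tick, end_tick, pitch). Returns one pitch per step."""
    if end is None:
        end = max(e for _, e, _ in notes)
    scalars = []
    for t in range(0, end, step):
        sounding = [p for s, e, p in notes if s <= t < e]
        scalars.append(max(sounding) if sounding else 0)  # top voice, or rest
    return scalars

notes = [(0, 480, 60), (480, 960, 64)]       # C4 then E4, a quarter note each
print(track_to_scalars(notes))
```

Whether "highest sounding pitch" is the right reduction is an open question; it throws away chords and velocity, but it does give exactly one scalar per time step, which is what a scalar encoder expects.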
---------
Matt Taylor
OS Community Flag-Bearer
Numenta
On Fri, Mar 20, 2015 at 5:53 AM, Chris Albertson
<[email protected]> wrote:
Use the language you know best, whatever that is.
No, those two things are different. The first does MIDI and the second does
audio. MIDI does NOT contain any audio information. If they were to
interact then you'd need something to produce sound from MIDI, which we call
a "virtual instrument".
Sounds like you might be re-inventing wheels. What is the "big picture"?
On Fri, Mar 20, 2015 at 4:55 AM, Richard Crowder <[email protected]> wrote:
I have MIDI equipment and SW, but I have not dealt with MIDI parsing for a long
time. Which language? Assuming Python?
Of interest, I had wondered whether
https://github.com/abudaan/MIDIBridge could interact with
https://github.com/bbcrd/peaks.js and JHTM?
On Thu, Mar 12, 2015 at 3:11 PM, Matthew Taylor <[email protected]> wrote:
Has anyone worked with MIDI before?
http://www.midi.org/techspecs/midispec.php
---------
Matt Taylor
OS Community Flag-Bearer
Numenta
--
Chris Albertson
Redondo Beach, California