Yep, the goal is to have the HTM write new MIDI files using the patterns it
learned from the ones it was fed.
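One way to close that loop is a small writer that turns predicted notes back into a standard MIDI file. Below is a minimal sketch using only the Python standard library; it writes a format-0 file with one track. The `notes` input shape (pitch, velocity, duration in ticks) and the 480-ticks-per-quarter resolution are assumptions for illustration, not anything NuPIC produces by itself.

```python
import struct

def vlq(n):
    """Encode n as a MIDI variable-length quantity (7 bits per byte)."""
    out = [n & 0x7F]
    n >>= 7
    while n:
        out.append((n & 0x7F) | 0x80)
        n >>= 7
    return bytes(reversed(out))

def write_midi(notes, path, ticks_per_beat=480):
    """Write (pitch, velocity, duration_ticks) triples as a format-0 MIDI file."""
    track = b""
    for pitch, velocity, duration in notes:
        track += vlq(0) + bytes([0x90, pitch, velocity])   # note on, channel 1
        track += vlq(duration) + bytes([0x80, pitch, 0])   # note off after `duration` ticks
    track += vlq(0) + b"\xff\x2f\x00"                      # end-of-track meta event
    header = b"MThd" + struct.pack(">IHHH", 6, 0, 1, ticks_per_beat)
    with open(path, "wb") as f:
        f.write(header + b"MTrk" + struct.pack(">I", len(track)) + track)

# e.g. write_midi([(60, 64, 480), (62, 64, 480), (64, 64, 960)], "predicted.mid")
```

Anything that plays standard MIDI files should accept the result, so the HTM side only has to emit a note sequence.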

-------- Original message --------
From: Matthew Taylor <[email protected]>
Date: 03/25/2015 14:57 (GMT+01:00)
To: Pascal Weinberger <[email protected]>
Subject: Re: MIDI

I don't know. I think it depends on how the MIDI data gets encoded,
and how NuPIC deals with file streams. The best thing I can think
would be for NuPIC to produce a file stream as its prediction,
generating MIDI output.
---------
Matt Taylor
OS Community Flag-Bearer
Numenta


On Wed, Mar 25, 2015 at 2:18 AM, Richard Crowder <[email protected]> wrote:
> Is there an indication of what inferred output is required?
>
> For example:
> Voice separation - http://www.cs.ubc.ca/~hoos/Publ/KilHoo02.pdf
> Harmonic analysis - http://www.ismir2003.ismir.net/papers/Raphael.pdf
>
> or a step further to, e.g.
> Signal representation -
> https://files.nyu.edu/jb2843/public/Publications_files/Bello-ISMIR-2005.pdf
>
> On Wed, Mar 25, 2015 at 5:37 AM, Pascal Weinberger
> <[email protected]> wrote:
>>
>> Hey,
>>
>> I am currently trying to get something like your 2) to work by feeding the
>> recorded MIDIs in time step by time step, using a variation of the hotgym
>> OPF example and trying to predict what comes next in the music. I'm just
>> working with piano music, but it still works horribly. Any ideas on how to
>> encode monophonic piano in the best way, maybe even so that it can be
>> expanded? I am really interested in that, but I'm still learning to deal
>> with NuPIC and programming in general, so give me some time :P
>> Thanks,
>> Pascal
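For monophonic piano, one simple option is to treat each time step as a single pitch (or a rest) and map it onto a sparse bit array in the spirit of NuPIC's ScalarEncoder, so neighbouring pitches share bits. This is only an illustrative sketch, not NuPIC API: the array width, the active-bit count, and treating `None` as a rest are all assumptions.

```python
def encode_pitch(pitch, n=128, w=9, lo=21, hi=108):
    """Map a piano pitch (MIDI 21..108) to a binary list of length n
    with w consecutive active bits, so that nearby pitches overlap --
    the same idea as NuPIC's ScalarEncoder."""
    if pitch is None:                      # assumption: None means a rest
        return [0] * n
    pitch = max(lo, min(hi, pitch))        # clamp to the piano range
    start = int((pitch - lo) / (hi - lo) * (n - w))
    return [1 if start <= i < start + w else 0 for i in range(n)]
```

Feeding one such array per time step keeps the semantics close to what the temporal memory wants: similar inputs get overlapping representations, distant pitches share nothing.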
>>
>>
>> -------- Original message --------
>> From: Matthew Taylor
>> Date:03/25/2015 04:25 (GMT+01:00)
>> To: Pascal Weinberger
>> Subject: Re: MIDI
>>
>> Jos, that's really helpful. When I get time to start working on this,
>> I would love to ask you some questions.
>>
>> Regards,
>> ---------
>> Matt Taylor
>> OS Community Flag-Bearer
>> Numenta
>>
>>
>> On Sun, Mar 22, 2015 at 7:14 AM, Jos Theelen <[email protected]> wrote:
>> > I have written a lot of MIDI programs, but that was a looong, looong time
>> > ago. I wrote those programs around 1990 in C++, on an Atari ST. But I
>> > started writing C++ again, on an Ubuntu machine (after programming 15
>> > years in Pascal on Windows), so I think I can help you with your problem.
>> > It may be a bit rusty, but I assume you are not in such a hurry.
>> >
>> > I have some questions and remarks about it. First of all, what kind of
>> > signals do you want to feed NuPIC with? I can imagine 3 levels of signals:
>> >
>> > 1) low-level MIDI events
>> > MIDI events are messages, like note on (press a key with a certain speed),
>> > note off (release the key), program change (go to another instrument on
>> > that MIDI channel), change control (like a joystick), move a pitchwheel,
>> > etc. Many messages have no musical meaning; they are just there to control
>> > the synthesizer.
>> > The disadvantage is that this level has not much to do with music; it is
>> > not what people hear. The advantage is that you are more flexible. You
>> > could hire live musicians, let them play on MIDI instruments and feed that
>> > to NuPIC.
>> > If you want this level, you could restrict yourself to note on, note off
>> > and program change messages. These are the most important ones.
>> >
>> > 2) instruments that play music
>> > There is a MIDI standard, and there are many midifiles following that
>> > standard. If you let Windows play a midifile, it also follows that
>> > standard. That standard says that if you change to program number 1, it
>> > starts to play a piano. If you go to number 69, it plays an oboe. If you
>> > go to number 37, it is a "slap bass 1", etc. If it plays on MIDI channel
>> > 10, it is always a drum, and each note is linked to some drum, like hihat,
>> > snare drum, etc. The instruments are fixed.
>> > In that case, a standard midifile could be converted to a list of records,
>> > like this: time, MIDI channel, instrument, note on or note off, velocity
>> > (= how loud it is played).
>> > The advantage is that this is closest to what people hear when they listen
>> > to music: just several instruments or drums that play some notes. The
>> > disadvantage is that not all midifiles follow the standard and that you
>> > cannot feed live music to NuPIC (well, maybe with some adaptations it is
>> > possible).
>> >
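The record list Jos describes at level 2 is easy to sketch. Assuming the raw events have already been parsed into `(time, channel, type, data1, data2)` tuples (the parsing itself is left out, and that tuple shape is an assumption of this sketch), a converter tracking the running program change per channel might look like:

```python
from collections import namedtuple

Record = namedtuple("Record", "time channel instrument event note velocity")

def events_to_records(events):
    """Flatten parsed MIDI events into (time, channel, instrument,
    note on/off, note, velocity) records, as described above."""
    program = {ch: 0 for ch in range(16)}   # running instrument per channel
    records = []
    for time, ch, kind, d1, d2 in events:
        if kind == "program_change":
            program[ch] = d1                # remember the new instrument
        elif kind in ("note_on", "note_off"):
            # note on with velocity 0 means note off by MIDI convention
            if kind == "note_on" and d2 == 0:
                kind = "note_off"
            # channel 10 (0-based: 9) is always drums per the standard
            instrument = "drum" if ch == 9 else program[ch]
            records.append(Record(time, ch, instrument, kind, d1, d2))
    return records
```

Each record is then a flat row, which is exactly the shape that is easiest to feed into an encoder or dump into a text file.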
>> > 3) sheet music
>> > You could go a step further. You can "translate" a lot of midifiles to
>> > sheet music. That means that the notes have fixed lengths (whole, half,
>> > quarter notes, etc.), and you should know the speed and rhythm. That is
>> > mainly the case with classical music (it would be nice to feed all of
>> > Bach's cantatas to NuPIC).
>> > A lot of midifiles can be handled that way, but what do you do with drums?
>> >
>> > Another problem is input and output. I assume that the input will be a
>> > midifile, but what output do you want? It depends on the level, and I
>> > could just make a textfile with records. Or let the program encode a
>> > midifile. Or (like a midiplayer) the program could send a message at the
>> > right time. But to what? Another program? Should I make a file for each
>> > channel? Etc., etc. What are your ideas about that?
>> >
>> > greetings: Jos Theelen
>> >
>> >
>> > On 2015-03-20 14:52, Matthew Taylor wrote:
>> >>
>> >> Well, the idea is to transform MIDI into NuPIC input. How we do that
>> >> is the question. I've been thinking about this a lot lately, and I
>> >> think the first step would be to try to take one MIDI track (one
>> >> instrument in a MIDI song) and transform it into scalar values that we
>> >> could feed into a NuPIC model. Ideally, we'd create a MIDI encoder,
>> >> but I'm not sure if MIDI files are set up for streaming or not (I
>> >> assume MIDI streams well).
>> >>
>> >> For instance, a MIDI song might have a bunch of different tracks for
>> >> each instrument, and we could train models on each track. If I could
>> >> get each track input into different NuPIC models, I might be able to
>> >> identify when the song moves from verse to chorus and back, when any
>> >> new refrain is introduced, when there is a key change or time change,
>> >> etc.
>> >> ---------
>> >> Matt Taylor
>> >> OS Community Flag-Bearer
>> >> Numenta
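One concrete way to get a single track into a model is to flatten it into a CSV stream of scalar rows, roughly in the shape the OPF hotgym example consumes: a field-names row, a field-types row, a special-flags row, then data. The exact OPF header conventions here are from memory and should be checked against the OPF docs; the `(tick, pitch)` row format is an assumption for this sketch.

```python
import csv
import io

def track_to_opf_csv(notes):
    """Write (tick, pitch) pairs as an OPF-style CSV stream:
    three header rows (names, types, flags), then one row per event."""
    buf = io.StringIO()
    w = csv.writer(buf)
    w.writerow(["tick", "pitch"])
    w.writerow(["int", "int"])
    w.writerow(["", ""])          # no timestamp/reset flags in this sketch
    for tick, pitch in notes:
        w.writerow([tick, pitch])
    return buf.getvalue()
```

With one such file per track, a separate model per instrument (as suggested above) is just a loop over files.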
>> >>
>> >>
>> >> On Fri, Mar 20, 2015 at 5:53 AM, Chris Albertson
>> >> <[email protected]> wrote:
>> >>>
>> >>> Use the language you know best, whatever that is.
>> >>>
>> >>> No, those two things are different.  The first does MIDI and the second
>> >>> does audio.  MIDI does NOT contain any audio information.  If they were
>> >>> to interact, then you'd need something to produce sound from MIDI, which
>> >>> we call a "virtual instrument".
>> >>>
>> >>> Sounds like you might be re-inventing wheels.  What is the "big
>> >>> picture"?
>> >>>
>> >>> On Fri, Mar 20, 2015 at 4:55 AM, Richard Crowder <[email protected]>
>> >>> wrote:
>> >>>>
>> >>>>
>> >>>> I have MIDI equipment and SW, but I haven't dealt with MIDI parsing
>> >>>> for a long time. Which language, assuming Python?
>> >>>>
>> >>>> Of interest, I had wondered whether the
>> >>>> https://github.com/abudaan/MIDIBridge could interact with
>> >>>> https://github.com/bbcrd/peaks.js and JHTM ?
>> >>>>
>> >>>> On Thu, Mar 12, 2015 at 3:11 PM, Matthew Taylor <[email protected]>
>> >>>> wrote:
>> >>>>>
>> >>>>>
>> >>>>> Has anyone worked with MIDI before?
>> >>>>>
>> >>>>> http://www.midi.org/techspecs/midispec.php
>> >>>>>
>> >>>>> ---------
>> >>>>> Matt Taylor
>> >>>>> OS Community Flag-Bearer
>> >>>>> Numenta
>> >>>>>
>> >>>>
>> >>>
>> >>>
>> >>>
>> >>> --
>> >>>
>> >>> Chris Albertson
>> >>> Redondo Beach, California
>> >>
>> >>
>> >>
>> >>
>> >>
>> >>
>> >
>> >
>>
>
