Hi David and Carl and others,

It appears that the internal MIDI code takes some set of objects and maps 
them to MIDI events. No great insight here.

I have a source code question, and a related design question:

Source Code Question:

Where are the musical objects that are being formatted to MIDI: are they Guile 
objects or C++ objects? What’s the best way to see and understand these 
objects? From what I can see, most objects in LilyPond are format-related 
(clefs, stems…), not purely musical objects (this pitch, for this duration, at 
this time). I tried the graphviz output, and all those objects appear to be 
format-related (though I see one can tune this output).

Design Question:

It appears that the problem with the non-articulate MIDI output is that it 
takes these events and maps them one-to-one. What articulate does is generate 
a different set of objects from the originals. These new objects are like a 
‘performance’ of the original objects. I suspect that was Jan’s original 
intention (hence the name ‘performer’ in the MIDI source files), but the step 
of actually making a performance from the original events was never done.
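To make the distinction concrete, here is a minimal sketch of the two approaches. It’s in Python rather than Guile/Scheme purely for illustration, and every name in it (the functions, the event tuples, the staccato-halving rule) is invented by me, not taken from the LilyPond sources:

```python
# Illustrative sketch only: the difference between mapping notated events
# one-to-one to MIDI, and first deriving a 'performance' (a rewritten
# event list), the way articulate.ly rewrites events before MIDI output.
# All names and the staccato rule are hypothetical.

def direct_midi(events):
    # One-to-one: each notated event becomes one MIDI note, unchanged,
    # so articulations are simply dropped.
    return [(pitch, dur) for (pitch, dur, artic) in events]

def perform(events):
    # Derive a new event list first: a crude 'performance' pass that,
    # for example, shortens staccato notes to half their notated duration.
    performed = []
    for (pitch, dur, artic) in events:
        if artic == "staccato":
            dur = dur / 2
        performed.append((pitch, dur))
    return performed

events = [(60, 1.0, None), (62, 1.0, "staccato")]
print(direct_midi(events))  # [(60, 1.0), (62, 1.0)]
print(perform(events))      # [(60, 1.0), (62, 0.5)]
```

The point is only that the second function produces a *different* set of events from the notated ones, which is (as I read it) what a ‘performance’ ought to be.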

My assertion: articulate.ly is awkward and less than successful because it 
doesn’t have access to all the information that the C++ code has. But perhaps 
I’m wrong: perhaps 100% of what the C++ code knows is available to Guile code?

-d


_______________________________________________
lilypond-devel mailing list
[email protected]
https://lists.gnu.org/mailman/listinfo/lilypond-devel
