OK, Alex, you asked for it... here's the message I just sent to Maurizio. I have not attached the version of pysco I sent him; it's 32K, which seemed a bit long for the list, and this is a long message already. If anyone wants to see it, ask me and I'll mail it to you.
----- Forwarded message from Paul Winkler <[EMAIL PROTECTED]> -----

Subject: Re: Python and music [was Re: New work, new tools]
Date: Mon, 1 Oct 2001 01:57:18 -0400
From: Paul Winkler <[EMAIL PROTECTED]>
To: Maurizio Umberto Puxeddu <[EMAIL PROTECTED]>

On Sun, Sep 30, 2001 at 04:22:19PM +0200, Maurizio Umberto Puxeddu wrote:
> If I understand correctly what you want, using OMDE/pmask you'd write
> this way (more or less):
>
> from omde.score import Aggregate, I
> import pmask, pmask.bpf
>
> begin, end = 0.0, 10.0
> density = pmask.bpf.LinearSegment((begin, 1.0), (end, 0.5))
> duration = 0.1
>
> r1 = pmask.cloud(begin, end, 1, density, duration)
> r2 = pmask.cloud(begin, end, 2, density * 0.5, duration)
> r3 = pmask.cloud(begin, end, 3, density * (2.0 / 3.0), duration)
>
> polyrythm = Aggregate(r1, r2, r3)

I *think* you have understood me correctly. If everything above does what
it looks like it should, this will produce the kind of output I'm talking
about.

But that's a fairly trivial example. What if r3 has a more complex tempo
map that's relative not to r1 but to r2? And what if I then introduce r4,
whose tempo is relative to r3? And so on. More about this topic below.

So far I've only looked at pmask 0.12, which doesn't seem to include OMDE.
Also, I didn't see OMDE on your SourceForge page. Can you say a little
more about what OMDE is intended for? I see that Aggregate comes from
omde, so I'm guessing it gives you ways to combine musical objects into
more complex musical objects... so maybe omde and pysco overlap somewhat.

General questions: Can an Aggregate contain other Aggregates? Can there be
time alterations at the Aggregate level as well as at the pmask object
level?

> This is the simplest case of course.
> Here the event has no parameters but onset and duration.
> You could as well make a constructor for this complex object
> (parametrizing the reference density and other parameters), use
> generic_cloud to build a polyrhythm of more complex objects, the "out of
> time" version of the generator (à la Csound; just syntactic sugar, it is
> instanced and mapped into a defined time range inside cloud()), more
> complex relationships between densities, omde.rythm to indicate rhythm
> with traditional duration symbols (not in seconds but similar to
> LilyPond, I mean), and more.

This all sounds very nice, and gives me more motivation to make this all
work together. You've done a lot of work that I'd like to take advantage
of!

> The omde.rythm notation for rhythm (yes, there is a typo in the module
> name...) will probably adopt the Score11 rhythm statement notation. This
> is because the latter is more complete, and because I'll probably have a
> Score11 implementation in Python/OMDE/pmask (~60% done now). It
> translates a Score11 file into OMDE/pmask objects, then renders them as
> OMDE objects and optionally prints them as Csound scores.

Do OMDE objects know about csound, or are they more abstract? One of my
main goals is not to be tied to one output format. Looking at your pmask
example scripts, there's not much besides the FStatement that appears to
be inherently csound-related. I think that's good!

Can OMDE objects contain other OMDE objects? Or is there only ever one
"layer" of organization? I think providing deep layers is very, very
important, as I will describe below.

Pysco overview
==============

Maybe it will be useful if I say more about how I think about pysco.

It's all built around the Event class. The important thing about an Event
is that it has a play() method. This can do literally anything. I have
some example Event subclasses that write csound score statements.

Then there is the Blob class, which is my term for a generic aggregate
Event.
(I've gone through a number of names; I thought Aggregate was a bit too
long and hard to spell; Agg was nice and short but I just never felt
comfortable with it. Blob has a nice whimsical sound and reminds me of the
old sci-fi movie about the Blob that just kept eating things and getting
bigger.)

A Blob is an Event, but it also contains Events. Blob.play() simply goes
through the Blob's sub-events and calls play() on each one. I've been
reading about design patterns lately; in DP terms, if I understand them
correctly, an Event is a Component, and a Blob is both a Component and a
Composite.

Time is the primary concern of pysco. The way I handle time is as follows:

An Event thinks of its start time as zero. It knows its own duration. It
has no notion of absolute time.

A Blob knows the start times of all its sub-events, relative to its own
start time (which, again, it thinks of as zero). It knows its own total
duration, which it calculates from the durations and start times of its
sub-events.

The implications of this are pretty cool. You can stick blobs inside blobs
inside blobs inside blobs ... and everything just happens at the "right
time". You can say "section B follows section A", and it will continue to
work no matter how you change the length of section A. You can take an
entire composition and wrap it inside another blob if you decide you want
to use it as a movement of a larger work. Or you can take a small excerpt
of a large work, call its play() method, and hear only that excerpt. At
any time during the compositional process, it's very easy to change your
mind about how things are organized. This is the raison d'etre of pysco.

You'll notice that I have not said anything about any attributes of
individual musical events except start and duration. That's because, as
far as I can see, those are the ONLY two quantities which are guaranteed
to be meaningful for absolutely all musical events.
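Since I keep throwing these class names around, here's a bare-bones sketch of the idea. This is how I think about it, not the actual pysco code; names and details are simplified:

```python
# Bare-bones sketch of the Event/Blob composite (not the real pysco code).
# An Event knows only its own duration and how to play() itself; a Blob is
# an Event containing (start, Event) pairs, with starts relative to the
# Blob's own start, and derives its total duration from its sub-events.

class Event:
    def __init__(self, duration=0.0):
        self.duration = duration

    def play(self):
        pass  # subclasses do something useful: write a score line, etc.


class Blob(Event):
    def __init__(self):
        Event.__init__(self)
        self.subevents = []  # list of (relative start, Event)

    def add(self, start, event):
        self.subevents.append((start, event))
        # total duration is calculated from the sub-events
        self.duration = max(self.duration, start + event.duration)

    def play(self):
        # a Blob just plays each of its sub-events in turn
        for start, event in self.subevents:
            event.play()


# Blobs nest: "section B follows section A" stays true no matter how
# long section A eventually becomes.
section_a = Blob()
section_a.add(0.0, Event(4.0))
section_b = Blob()
section_b.add(0.0, Event(2.0))

piece = Blob()
piece.add(0.0, section_a)
piece.add(section_a.duration, section_b)  # B starts when A ends
print(piece.duration)  # 6.0
```

If A later grows to 10 beats, the only thing that changes is the value passed to piece.add() for B; nothing else needs editing.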
Pmask with Pysco
----------------

I still have a very limited understanding of pmask, but already I can see
one way to work with both pmask and pysco: make subclasses of pysco.Event
that wrap pmask classes. Calling .play() on such an Event would do
something useful with pysco's idea of the current time and this Event's
duration, eventually calling ScoreSection().writeTo(some_file). In this
scenario, pysco just provides a layer of organization and modularity on
top of pmask objects.

What about going the other direction: wrapping a pysco Event in some pmask
object? Can you think of any reason to do so? I can. Imagine a class like
pmask's ScoreSection, with the same kind of interface, except that instead
of an instrument number you'd feed it a pysco Event subclass. In other
words, instead of calling writeFile() to generate a bunch of csound i
statements, you'd call ... um ... some method that would generate a Blob
of pysco Event objects, taking advantage of all the weird and wonderful
Masks and things from pmask. This is not very clear in my mind, but I
think something like that should be possible.

I had always had the vague idea that I would want to be able to take some
piece of data, maybe a simple list of values, and feed it to various
parameters of various groups of notes. I don't really understand all the
pmask examples yet, but I can already tell they are doing very interesting
things of that nature. So I suspect that I won't want to re-invent
anything here; I will spend some time playing with pmask and see if I can
use elements of pmask for this job in pysco as well, even without the
ScoreSection interface (which might not always be what I want to use).

I had planned to eventually work on some nice ways to work with groups of
pitches. I can see that you've done a lot of work in this area. I've just
been looking at pitch.py, and scaleList alone is an amazing resource!
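To make the first direction (pysco wrapping pmask) a bit more concrete, the wrapper I have in mind might look something like this. PmaskEvent is a made-up name, and I haven't run anything like this against pmask itself; the only assumption about the wrapped object is that it has a writeTo(file) method, like pmask's ScoreSection appears to. A stand-in section is included so the sketch runs on its own:

```python
import io

# Hypothetical sketch: a pysco-style Event that delegates its output to a
# wrapped pmask-style object. Names here are illustrative, not real pysco
# or pmask code.

class Event:
    def __init__(self, duration=0.0):
        self.duration = duration

    def play(self):
        pass


class PmaskEvent(Event):
    def __init__(self, section, duration, out):
        Event.__init__(self, duration)
        self.section = section  # anything with a writeTo(file) method
        self.out = out

    def play(self):
        # delegate the actual score generation to the wrapped object
        self.section.writeTo(self.out)


# Stand-in for a pmask ScoreSection, so this sketch runs without pmask:
class FakeSection:
    def writeTo(self, f):
        f.write("i1 0 0.1 ...\n")


out = io.StringIO()
PmaskEvent(FakeSection(), duration=10.0, out=out).play()
print(out.getvalue())
```

The point is just that pysco never needs to know what the section does internally; it only schedules the moment at which writeTo() fires.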
So even if nothing else comes of our effort to work together, I expect
I'll be using pmask in my own work. I hope that we will find a way to put
these projects together, and that the result will be as good as chocolate
with peanut butter.

More Notes on Tempo
-------------------

Pysco gets very interesting when tempo comes into the picture. We all know
that sometimes it's very useful to think in units other than seconds.
Therefore, my idea is that in pysco, all times are relative.

The plan is that a Blob can have a tempo, or an evolving tempo map, which
governs the interpretation of the start and end times of all sub-events.
This must propagate to all sub-events of sub-events, and interact
correctly with any tempos specified in any of the sub-events. I don't yet
know how I'm going to implement this, though I suspect the answer will
involve Events being able to ask their parent Blobs for the current tempo.

I think the easiest way to express tempo is as BPM at the top level, and
as a ratio at sub-levels. I find it easy to conceive of "two times three
times one-half times 90 BPM", and hard to conceive of "100 relative to 67
relative to 32 relative to 70 BPM".

Simple example: Blob A contains blob B, which contains blob C. If I say
the tempo of A is 44 bpm, the tempo of B is twice that of its parent, and
the tempo of C is half that of its parent, then we have this:

Blob    Ratio     Tempo (bpm)
----    -----     -----------
A       0.7333    44
B       2         88
C       0.5       44

At each level, at any given moment, bpm = ratio * (bpm of parent). If
there is no parent (i.e. we are at the top level), the implied parent
tempo is seconds ... 60 bpm.

Now let's say we change the tempo of B. I've decided it should be three
times that of its parent:

Blob    Ratio     Tempo (bpm)
----    -----     -----------
A       0.7333    44
B       3         132
C       0.5       66

Notice that since C is defined relative to B, its tempo changes as well.
Now I change my mind about the overall tempo, and speed up A to 50 BPM. B
and C will automatically track the change.
Blob    Ratio     Tempo (bpm)
----    -----     -----------
A       0.8333    50
B       3         150
C       0.5       75

Output issues
-------------

I've said I want to support different kinds of output. How will this work?
The most obvious way would be to make subclasses of Event that create the
right kind of output:

CsoundEvent(Event)
MidiEvent(Event)
TaonEvent(Event)
etc...

This is an expedient solution, but it makes it hard to do crazy things
like:

1) Create a big Blob of events
2) Write some csound output from the Blob
3) Then write some MIDI output from the same Blob

That may sound like an odd feature to want. But I'm very wary of knowingly
placing *any* limits on pysco, particularly on its multi-output
capabilities. Maybe I'm too ambitious, but I don't ever want to write "You
just can't do that" in the manual!

I have not decided who has responsibility for knowing *where* the output
should go. If I have a script that creates both a csound score and a MIDI
file (and maybe use another script to render them into .wav files and mix
the result), how will all the events know where to write their output when
.play() is called? Options:

1) Pass the output as an argument to .play(). I don't like this. It means
that, to put heterogeneous Events in one Blob, the Blob would have to know
which sub-events are which, and pass each one the correct output handle.
That violates encapsulation: the Blob should treat all sub-events
identically. And I don't want to restrict the kinds of Events any Blob can
contain.

2) Pass the output to each Event at instantiation time. This preserves
encapsulation: only an Event knows where it's going to put its output.
Disadvantage: it's still hard to change the output for a bunch of Events
at once.

3) I can't think of any other options yet, but I wonder if there's a
Bridge pattern lurking in here somewhere.

So far I've just used stdout for everything.

--------------------

Enough for now...
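P.S. One more bit before I go: the bpm = ratio * (bpm of parent) rule is tiny in code. Here's a throwaway sketch of just the arithmetic (TempoNode is a made-up name for illustration; this isn't anything in pysco yet):

```python
# Throwaway sketch of the tempo rule: bpm = ratio * (bpm of parent),
# where no parent means the implied parent tempo is seconds, i.e. 60 bpm.

class TempoNode:
    def __init__(self, ratio, parent=None):
        self.ratio = ratio
        self.parent = parent

    def bpm(self):
        if self.parent is None:
            return self.ratio * 60.0      # top level: relative to seconds
        return self.ratio * self.parent.bpm()


a = TempoNode(50.0 / 60.0)     # top level, ratio 0.8333 -> 50 bpm
b = TempoNode(3.0, parent=a)   # three times its parent  -> 150 bpm
c = TempoNode(0.5, parent=b)   # half of its parent      -> 75 bpm
print(a.bpm(), b.bpm(), c.bpm())

# Change my mind about A, and B and C track it automatically:
a.ratio = 44.0 / 60.0          # back to 44 bpm; c.bpm() is now 66
```

Asking up the parent chain like this is roughly what I imagined when I said Events will probably ask their parent Blobs for the current tempo.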
I've attached my current draft of pysco 0.0.2, and pysco_sample_classes.py,
which is a VERY rough sketch of some subclasses that implement csound
output. I hope you find them interesting.

--
................ paul winkler ................
custom calendars: http://www.calendargalaxy.com
A member of ARMS: http://www.reacharms.com
    home page: http://www.slinkp.com

----- End forwarded message -----
