Re: [Alsa-devel] Re: [linux-audio-dev] midi events in jack callback / ALSA Sequencer

2002-08-21 Thread Frank van de Pol


On Wed, Aug 21, 2002 at 02:10:35AM +0100, Martijn Sipkema wrote:
> [...]
> > Within ALSA we have two priority queues, one for tick (bar,beat) scheduled
> > events, and one for clock (ns) scheduled events.
> 
> As MIDI uses MIDI tick messages for time based sync and MIDI clock messages
> for tempo based sync I kind of feel the ALSA sequencer naming is a little
> confusing :)

I understand. OTOH the problem might also have been introduced back in the 80's
when the MIDI 1.0 spec was written. The "MIDI Clock" (0xf8) message was
introduced with only beat sync in mind. I believe the term 'tick' for
specifying song position came from the standard MIDI file (SMF 1.0)
specification.

In fact the definitions in seq_event.h are less confusing:

/** Real-time data record */
typedef struct snd_seq_real_time {
    unsigned int tv_sec;            /**< seconds */
    unsigned int tv_nsec;           /**< nanoseconds */
} snd_seq_real_time_t;

/** (MIDI) Tick-time data record */
typedef unsigned int snd_seq_tick_time_t;

/** unioned time stamp */
typedef union snd_seq_timestamp {
    snd_seq_tick_time_t tick;           /**< tick-time */
    struct snd_seq_real_time time;      /**< real-time */
} snd_seq_timestamp_t;
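
For illustration, here is a minimal sketch (not from the original mail) of how
an event gets stamped for either queue using the alsa-lib scheduling macros;
the queue id, port and note values are placeholders:

#include <alsa/asoundlib.h>

/* Stamp and send one note-on, either on the tick (bar/beat) queue or on
 * the clock (ns) queue of the ALSA sequencer. */
void schedule_note(snd_seq_t *seq, int queue, int port, int use_ticks)
{
    snd_seq_event_t ev;

    snd_seq_ev_clear(&ev);
    snd_seq_ev_set_source(&ev, port);
    snd_seq_ev_set_subs(&ev);                  /* deliver to subscribers */
    snd_seq_ev_set_noteon(&ev, 0, 60, 100);    /* channel 0, middle C */

    if (use_ticks) {
        /* tick queue: schedule one beat ahead (96 ticks at 96 ppq), relative */
        snd_seq_ev_schedule_tick(&ev, queue, 1, 96);
    } else {
        /* clock queue: schedule 10 ms ahead, relative */
        snd_seq_real_time_t rt = { 0, 10 * 1000000 };
        snd_seq_ev_schedule_real(&ev, queue, 1, &rt);
    }

    snd_seq_event_output(seq, &ev);
    snd_seq_drain_output(seq);
}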


> 
> I don't want to support tempo (MIDI clock) scheduling in my MIDI API. This
> could be better handled in the application itself. Also, when slaved to MIDI
> clock it is no longer possible to send messages ahead of time, and not
> supporting this in the API makes that clear to the application programmer.

The concept of the publicly available ALSA timing queues allows outboard
applications to schedule events in a tempo-locked fashion. Think of
applications like drum machines, arpeggiators, tempo delays etc. Of course you
might be using one big monolithic (Cubase?) sequencer application which does
everything you need

Compare it to an external (outboard) drum machine you're using because
programming a pattern on it is so user friendly (hence you end up being more
creative), synced from your favorite sequencer which has the tempo map
for your song.

Of course all of this could be done without a queue which supports tempo
scheduling, but then you'll need to emit MIDI Clock events and rely on
immediate processing. In case a soft synth (or other potentially high
latency device) is triggered from the MIDI Clock you lose the ability to
correct for this.

[...] 
> > Since especially for soft synths (but also for some -undocumented!- USB midi
> > interfaces, like Emagic AMT)
> 
> Yes, I've repeatedly asked Emagic for documentation on their AMT protocol
> without success. :(

let them burn in peace 

[...]
> > When using events instead of midi bytes the merging is a no brainer
> 
> I was planning on doing that, but even then there are issues with, for
> example, (N)RPNs.
> 
> > leaves room for events not defined in midi spec).
> 
> ...I'm not sure that is a good idea. What kind of events?

e.g.

- system events like announcements of topology changes
- (N)RPNs as a 14 bit value instead of 2x 7bit
- SMF like meta data
- controls for current sequencer queue: tempo, position, etc.

to name a few
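
As an illustration, several of these already exist as ALSA sequencer event
types; a receiving client might distinguish them roughly like this (the
SND_SEQ_EVENT_* names are real ALSA constants, the handler itself is only a
sketch):

#include <alsa/asoundlib.h>

/* Sketch of an event consumer handling a few non-wire-MIDI event types. */
void handle_event(const snd_seq_event_t *ev)
{
    switch (ev->type) {
    case SND_SEQ_EVENT_PORT_SUBSCRIBED:     /* topology change announcement */
    case SND_SEQ_EVENT_PORT_UNSUBSCRIBED:
        /* rescan the connection graph */
        break;
    case SND_SEQ_EVENT_NONREGPARAM:         /* (N)RPN as a single 14-bit value */
    case SND_SEQ_EVENT_REGPARAM:
        /* ev->data.control.param / ev->data.control.value */
        break;
    case SND_SEQ_EVENT_TEMPO:               /* queue control: tempo change */
        break;
    default:
        break;
    }
}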

Cheers,
Frank.

-- 
+ --- -- -  -   -- 
| Frank van de Pol  -o)A-L-S-A
| [EMAIL PROTECTED]/\\  Sounds good!
| http://www.alsa-project.org  _\_v
| Linux - Why use Windows if we have doors available?



[linux-audio-dev] (not so low) low latency - acpi, dri, proc

2002-08-21 Thread Fernando Pablo Lopez-Lezcano

Hi... anybody out there testing latency on an ACPI patched
kernel? I have two kernels I'm testing:

2.4.19-x:
  2.4.19
  2.4.20-pre4
  low latency for 2.4.19 (w/a couple of tweaks to patch pre4)
  acpi-20020815 for 2.4.20-pre4

2.4.18-x:
  2.4.18
  low latency for 2.4.18-pre10
  acpi-20020726 for 2.4.18

On both 2.4.18-x and 2.4.19-x I get quite high periodic
latency peaks that hover around 170 ms.

I suspect acpi might be the culprit but I cannot try to
disable it because on this particular laptop you need acpi to
make sound work (probably something to do with routing of
interrupts; I also tried pci=biosirq on a non-acpi kernel
without success). Any suggestions?

DRI: I was about to write something about this and then
checked, and yes, the latest lowlat patch (for 2.4.19) is
missing the dri reschedules (from Jussi Laako's patch set), so
DRI is unusable if you don't add them (at least on a
Radeon based laptop). I'm recompiling with them to do more
tests.

Proc: something has changed from the 2.4.18 to the 2.4.19
kernels (maybe the acpi proc interface is to blame?). Big
mess of latency spikes in the proc latency test.

I'll post latency graphs later... this is all using
latencytest 0.42

-- Fernando



[linux-audio-dev] Re: [Csnd] RME boards for Linux/Csound

2002-08-21 Thread jpff

I have a Hammerfall Lite on one of my Linux machines, with external
DACs in the speakers.  I regret this decision almost every day.  It
took MONTHS to get an alsa driver which occasionally worked; it would
run out of memory, refuse to load, etc and the alsa mailing list was
close to useless.  I have arrived at the stage where I can play MP3
via mpg123, MIDI via Timidity, and mono files with aplay, but stereo
files are rendered as silence, albeit of the same length.  The OSS
emulation player does play stereo usually.  Mind you it breaks up
badly.
  I have an M-Audio DiO 2496 on my main home machine and that is
simpler, and I am very happy with it (except that reading digital from the
DAT took 6 months of experimentation -- I think I cracked it last
July; before that I was rebooting into W98 to read DATs).  The lab machine has
an Audiophile, which has not been so extensively used yet, but so far
seems OK.

Of course you may manage it better than I did; YMMV etc etc.

==John ffitch



Re: [linux-audio-dev] saol question: adjusting delay line time?

2002-08-21 Thread John Lazzaro

> Here's my basic architecture (criticism welcome):

One comment: remember that effects instruments can't have
their state updated at the k-rate via labelled control
statements, because there is no label on the effects
instr! Instead, for maximum portability, what you want
to do is to have a ksig global variable, have your control
driver write to it, and have your effects instrs import
it. Also, if you're using MIDI, the MIDI Master Channel's
standard name appears in the effects instr. And you might
consider using the params[] standard name too:

imports exports ksig params[128]

that your control driver can update each kcycle, as described:

http://www.cs.berkeley.edu/~lazzaro/sa/sfman/devel/cdriver/data/index.html#bifs

although this has portability issues. 

> Right, I guess my question is, if I have an effects instrument "dly"
> that is in route and send statements, will I lose that routing if I
> release and then re-instantiate it, or is there a way to ensure that a
> new "dly" takes the place of the old one on the bus?  I thought that
> the only way to instantiate effects instruments was in send statements.

No, effects instruments get instantiated once, at startup, and the
bussing topology is fixed. As I mentioned in the earlier reply, for
your specific problem the solution is a tapped delay line: it is
instantiated once at its maximum length, and then tapped at the
appropriate place to get the dynamic delay you want at the moment.

In general, though, you might find yourself in a situation where you
have a series of send/route statements that set up an audio signal
processing chain from input_bus to output_bus, but you want to tap
off that audio into an arbitrary instrument in the system. In this
case, the best thing to do is to use either a global wavetable or
a global ksig array as a buffer region; the basic architecture would
look something like this:

globals {
ksig lastkcyc[100]; // 100 audio samples per control period
krate 441;
srate 44100;
send(dispatch; ; input_bus);
sequence(dispatch, receiver);
}

instr dispatch ()

{
  ksig holding[100];
  ksig kidx;
  asig aidx;
  imports exports lastkcyc[100];

  // push last cycle into globals

  kidx = 0;
  while (kidx < 100)
   {
 lastkcyc[kidx] = holding[kidx];
 kidx = kidx + 1;
   }

  // at a-rate, buffer next cycle up

  holding[aidx] = input[0];  // mono input_bus assumed here
  aidx = (aidx == 99) ? 0 : aidx + 1;

}

instr receiver ()

{
   ksig imports lastkcyc[100];

   // at kcycle, read in lastcyc and prepare for
   // the arate, no need to make a local copy

   // note sequence() statement ensures ordering
   // is correct for this to work
}

This was all written on the fly and not tested or optimized.
But it's rare that you'll need to use a technique like this,
only in situations where you want your SAOL code to behave
like a patch bay for effects that you can plug and unplug
on the fly, and even then you might be better off just using
a single effects instr and doing the patch bay inside of
it, using user-defined aopcodes to do the different
effects.

Also, note the subtle way the code above is written, with
respect to the sequence order and such. The SAOL execution
ordering is quite explicit about what happens when and
how when it comes to ksig imports and exports; this was
all purposely done so that compilers can do block coding
on the a-rate section without worrying about the effects
of inter-instrument communication. See:

http://www.cs.berkeley.edu/~lazzaro/sa/book/append/rules/index.html

and follow the links under "Decoder Execution Order" to see
the logic behind this.

-
John Lazzaro -- Research Specialist -- CS Division -- EECS -- UC Berkeley
lazzaro [at] cs [dot] berkeley [dot] edu www.cs.berkeley.edu/~lazzaro
-



Re: [linux-audio-dev] saol question: adjusting delay line time?

2002-08-21 Thread Will Benton

On Wed, Aug 21, 2002 at 10:31:20AM -0700, Paul Winkler wrote:

> > In any case, here's my question: I have an effects instrument that
> > implements delay, but there is no way to change the delay time from
> > the control layer, since the delay instrument is only instantiated
> > once and the delay time parameter is an ivar.
> 
> There's the problem - you really want the delay time to be changing
> while the instrument runs, so use a k-rate or a-rate variable.

Right, but the "delay" opcode uses an ivar for the delay time.  Is
there any sensible way to have the delay time specified as a kvar?  It
seems that if the delay buffer size could change every kpass, that
could be a huge performance loss.

> What's your control driver, and how does it feed values to sfront?

I read the section of the sfront manual on writing control drivers.
It is remarkably easy for the coolness factor.

Here's my basic architecture (criticism welcome):

   GUI generates SASL note and control events for every beat (in
   real-time) and sends them over a named pipe.

   In the sfront-generated code, csys_newdata() polls on the pipe and
   reconstitutes SASL events to be instantiated by csys_saslevents().

Obviously, this has the potential for inconsistent latencies due to
scheduling and I/O delays, but IIRC named pipes are pretty cheap (<
30us for a 1-byte message, and scaling well from there).  I guess
another approach would be to move the timing logic to csys_newdata and
have the GUI offload pattern data and control events (i.e. pattern
change, etc).
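
For what it's worth, a minimal sketch of the polling end of that named-pipe
scheme (hypothetical FIFO path and message handling, not the actual sfront
csys_* driver code):

#include <fcntl.h>
#include <poll.h>
#include <stddef.h>
#include <unistd.h>

static int ctrl_fd = -1;

/* Open the FIFO non-blocking so the synthesis loop never stalls on the GUI. */
int ctrl_open(const char *path)          /* e.g. "/tmp/saol_ctrl" (made up) */
{
    ctrl_fd = open(path, O_RDONLY | O_NONBLOCK);
    return ctrl_fd < 0 ? -1 : 0;
}

/* Called once per kcycle: returns bytes read into buf, 0 if nothing is
 * pending, -1 on error.  The caller parses buf into SASL events. */
int ctrl_poll(char *buf, size_t len)
{
    struct pollfd pfd;
    int r;

    pfd.fd = ctrl_fd;
    pfd.events = POLLIN;
    r = poll(&pfd, 1, 0);                /* timeout 0: never block */
    if (r <= 0 || !(pfd.revents & POLLIN))
        return r < 0 ? -1 : 0;
    return (int)read(ctrl_fd, buf, len);
}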

> >  (More problematic is
> > that the delay time is set before the "0 tempo 120" command in the
> > SASL file, but that's another story.)  How can I get around this?  It
> > seems that I need to release the instrument so that it will get
> > re-instantiated -- but will that mess with my routing?
> 
> Probably. It would be like turning the delay off and turning
> it back on; there would be an audible interruption in the
> stream of echoes.

Right, I guess my question is, if I have an effects instrument "dly"
that is in route and send statements, will I lose that routing if I
release and then re-instantiate it, or is there a way to ensure that a
new "dly" takes the place of the old one on the bus?  I thought that
the only way to instantiate effects instruments was in send statements.




wb

-- 
Will Benton  | "Die richtige Methode der Philosophie wäre eigentlich 
[EMAIL PROTECTED]|  die: Nichts zu sagen, als was sich sagen läßt"



Re: [linux-audio-dev] saol question: adjusting delay line time?

2002-08-21 Thread John Lazzaro


> Will Benton writes
> How can I get around this? 

You want to use an interpolated delay line structure to do
adjustable delays: you create the delay line in the ipass,
make it large enough to cover the reasonable range of delays,
and then pick the tap on the delay line that matches the
current delay you want (or, for the smoothest changes in
delay, interpolate between taps).

One way to do this is the fracdelay core opcode:

http://www.cs.berkeley.edu/~lazzaro/sa/book/opcodes/filter/index.html#frac

Honestly, though, I wouldn't use it; instead I would
build my own custom delay line opcode, and get a nicer
API than the pseudo-OO fracdelay API ... it will also be
easier to benchmark and tune the delay line structure to
run efficiently, since you'll be able to make micro changes
in the SAOL and see the result in the C code in the sa.c
file.
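
For reference, here is the "allocate the line once, move the tap" idea
sketched in plain C (an illustration of the general technique only; not SAOL
and not sfront's generated code):

#define DLY_MAX 65536                     /* power of two, ~1.5 s at 44.1 kHz */

typedef struct {
    float buf[DLY_MAX];                   /* assumed zero-initialised */
    unsigned w;                           /* write index, grows monotonically */
} tapdelay_t;

/* Push one input sample and read a tap 'delay' samples back (fractional).
 * The buffer is allocated once at its maximum length; only the tap moves. */
static float tapdelay_tick(tapdelay_t *d, float in, float delay)
{
    unsigned di;
    float frac, a, b;

    d->buf[d->w & (DLY_MAX - 1)] = in;

    di   = (unsigned)delay;               /* integer part of the delay */
    frac = delay - (float)di;
    a = d->buf[(d->w - di)     & (DLY_MAX - 1)];
    b = d->buf[(d->w - di - 1) & (DLY_MAX - 1)];

    d->w++;
    return a + frac * (b - a);            /* linear interpolation between taps */
}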

-
John Lazzaro -- Research Specialist -- CS Division -- EECS -- UC Berkeley
lazzaro [at] cs [dot] berkeley [dot] edu www.cs.berkeley.edu/~lazzaro
-



Re: [linux-audio-dev] saol question: adjusting delay line time?

2002-08-21 Thread Paul Winkler

On Wed, Aug 21, 2002 at 11:21:57AM -0500, Will Benton wrote:
> Howdy, all.
> 
> I'm writing a softsynth/toy that uses sfront (via a custom control
> driver) to produce trendy, 303-inspired sounds.  I realize that this
> isn't that interesting a project (how many 303 toys does the world
> need?), but I'm using it as a proof-of-concept/stepping stone to a
> cooler GUI-controlled environment/softsynth with an SAOL engine. 

woohoo! this is exactly the kind of thing I want to see.

> In any case, here's my question: I have an effects instrument that
> implements delay, but there is no way to change the delay time from
> the control layer, since the delay instrument is only instantiated
> once and the delay time parameter is an ivar.

There's the problem - you really want the delay time to be changing
while the instrument runs, so use a k-rate or a-rate variable.

What's your control driver, and how does it feed values to sfront?

>  (More problematic is
> that the delay time is set before the "0 tempo 120" command in the
> SASL file, but that's another story.)  How can I get around this?  It
> seems that I need to release the instrument so that it will get
> re-instantiated -- but will that mess with my routing?

Probably. It would be like turning the delay off and turning
it back on; there would be an audible interruption in the
stream of echoes.

-- 
--

Paul Winkler
"Welcome to Muppet Labs, where the future is made - today!"



[linux-audio-dev] saol question: adjusting delay line time?

2002-08-21 Thread Will Benton

Howdy, all.

I'm writing a softsynth/toy that uses sfront (via a custom control
driver) to produce trendy, 303-inspired sounds.  I realize that this
isn't that interesting a project (how many 303 toys does the world
need?), but I'm using it as a proof-of-concept/stepping stone to a
cooler GUI-controlled environment/softsynth with an SAOL engine. 

In any case, here's my question: I have an effects instrument that
implements delay, but there is no way to change the delay time from
the control layer, since the delay instrument is only instantiated
once and the delay time parameter is an ivar.  (More problematic is
that the delay time is set before the "0 tempo 120" command in the
SASL file, but that's another story.)  How can I get around this?  It
seems that I need to release the instrument so that it will get
re-instantiated -- but will that mess with my routing?



wb

--
Will Benton  | "Die richtige Methode der Philosophie wäre eigentlich 
[EMAIL PROTECTED]|  die: Nichts zu sagen, als was sich sagen läßt"



RE: [linux-audio-dev] midi events in jack callback / ALSA Sequencer

2002-08-21 Thread mikko.a.helin

Is it possible to lock the ALSA sequencer to the audio clock? Most controllers, like the
Envy24, have a register called "Playback DMA Current/Base Count Register" (i.e. where the
'play position' inside the DMA buffer currently is). Could the ALSA sequencer also send
MIDI clock based on this counter to the MIDI out? Would it be too heavy for the system to
read the position inside a timer, decide which MIDI data should be sent out, and update
the time stamp (global clock)?
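
For illustration, the arithmetic for turning such a frame counter into a MIDI
clock count would be roughly as below (not from the original mail; assumes the
tempo is known):

/* MIDI clock (0xF8) runs at 24 pulses per quarter note.  Given a play
 * position in frames, a sample rate and a tempo, this yields how many
 * clocks should have been sent so far; the sender emits one 0xF8 each
 * time the value increments. */
unsigned frames_to_midi_clocks(unsigned frames, unsigned rate, double bpm)
{
    double beats = (double)frames / (double)rate * bpm / 60.0;
    return (unsigned)(beats * 24.0);
}
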
-Mikko (sorry for the ignorance..)

> -Original Message-
> From: ext Frank van de Pol [mailto:[EMAIL PROTECTED]]
> Sent: 20. August 2002 23:50
> To: [EMAIL PROTECTED]; Martijn Sipkema
> Cc: [EMAIL PROTECTED]
> Subject: Re: [linux-audio-dev] midi events in jack callback / ALSA
> Sequencer
> 
> 
> 
> Martijn, 
> 
> the ALSA sequencer is exactly designed to serve a mechanism 
> to route (midi)
> _events_ from a source to a sink (either driver, application 
> or whatever). 
> 
> On Mon, Aug 19, 2002 at 07:12:47PM +0100, Martijn Sipkema wrote:
> > > >MIDI through and any other 'immediate' type MIDI messages do
> > > >not need to be scheduled, they can be written to the interface
> > immediately.
> > >
> > > Yes, they could. It would however necessitate different 
> input routes
> > > for 'immediate' and 'queued' events to the MIDI output handler.
> > 
> > The MIDI I/O API I am working on has 'scheduled' and 
> 'immediate' queues. I
> > don't think there is a way around this unless 'immediate' 
> messages are not
> > used
> > at all and that is clearly not an option.
> 
> Within ALSA we have two priority queues, one for tick 
> (bar,beat) scheduled
> events, and one for clock (ns) scheduled events. In case of immediate
> scheduling the priority queue is bypassed and the event 
> submitted in the
> receiver's fifo (which would be your immediate queue).
> 
> Due to potential blocking at the receivers you'll need a fifo 
> for every
> destination. 
> 
> Reason for having 2 priority queues with different reference 
> is to cope with
> tempo/signature changes while remaining in sync. The clock 
> and tick priority
> queues are in fact parallel in ALSA. 
> 
> Since especially for soft synths (but also for some 
> -undocumented!- USB midi
> interfaces, like Emagic AMT) the events need to be scheduled 
> ahead (sort of
> a pre-delay, say 10ms or more) to let the device/softsynth 
> handle the micro
> scheduling, it would seem a good idea to handle this at the 
> clock based
> queue. Since variable predelay in ms would be not quite 
> friendly to the tick
> based queue (different units), it might make sense to have 
> the tick based
> queue send events into the clock based queue instead of 
> immediate delivery).
> 
> Your links about UST sound very good in this perspective, since the
> _monotonic_ clock ensures compatibility between the event (sequencer)
> subsystem, and other audio/video subsystems. 
> 
> The current clock based ALSA prioq could then simply be a 
> reference against
> the UST instead of keeping track of time itself. High level 
> applications can
> have the freedom to take whatever route they want, use the 
> ALSA sequencer
> API or go directly to the UST based scheduler. All these 
> applications can
> still cooperate in this framework :-) (try that in MS Windows.)
> 
> A good reason for applications to use (UST/ALSA) scheduling instead of
> taking care of micro scheduling itself and using rawmidi interfaces is
> better support for softsynths to trigger at the right spot in 
> the buffer,
> and for the upcoming smart (eg. USB) midi devices.
> 
> > 
> > > This
> > > would not help make things simpler. It would also mean 
> that a Sysex
> > > or pitch/CC burst routed through can delay MIDI clocks 
> because of the
> > > limited bandwidth on the MIDI wire.
> > 
> > Sysex can hurt timing for other events, but that's MIDI. 
> MIDI clock (any
> > MIDI realtime message) can interleave other messages. And 
> yes, merging
> > MIDI streams is not easy.
> 
> You can't overcome the limits of the MIDI physical line if that's your
> target transport. However when sending events to soft- or 
> onboard synths
> these limits are different (typically less of an issue). 
> 
> When using events instead of midi bytes the merging is a no 
> brainer (and
> leaves room for events not defined in midi spec).
> 
> > 
> > > Thinking about it -- it's hypothetical because we don't 
> have them in
> > > Linux yet -- I believe a decent MIDI out handler using a 
> firm timer
> > > would be an order of magnitude more complicated than one 
> based on the
> > > RTC. Have you coded one yet?
> > 
> > Yes, and it is not that complex I think. Note that this 
> would only have to
> > be done
> > in a driver process or a user-space sequencer application 
> and not for every
> > client
> > application.
> > 
> > I'll try to get a version of my MIDI I/O API/framework 
> ready, but it will
> > probably still
> > take me some time to get finished.
> 
> Perhaps you might want to lo

Re: [linux-audio-dev] Reborn

2002-08-21 Thread Kjetil S. Matheussen



On Tue, 20 Aug 2002, Ingo Oeser wrote:

> > /* Consumer */
> >
> > int jackprocess (nframes_t nframes, void *arg){
> >   int ch,i;
> >   struct Jackplay *jackplay=(struct Jackplay *)arg;
> >   int numch=jackplay->fftsound->samps_per_frame;
> >   sample_t *out[numch];
> >
> > for(ch=0;ch<numch;ch++){
> >   out[ch]= (sample_t *) jack_port_get_buffer (jackplay->jpc[ch].output_port, nframes);
> >   memset(out[ch],0.0f,nframes*sizeof(sample_t));
>
> This doesn't work and isn't needed. The second argument will be
> interpreted as a byte. You're also filling the buffer all the time,
> so zero initialisation is not necessary.
>
The reason for writing 0.0f instead of 0.0 is just to make it clear that
the array out[ch] contains floats.

The reason for nulling out the buffer each time is the following test:
"if(jackplay->unread==0) break;" in the for-loop that follows. The remaining
part of the procedure doesn't always fill the buffer.

Now, what I could have done was, of course, just to fill out the remaining
part of the buffer that wasn't written in that for-loop. However, the higher
performance gained by doing that would probably not be noticeable. The
code that sends data to the JackWritePlay() routine uses xxx times more
CPU than just the jack parts.

And most importantly: the code would be harder to read.


-- 




Re: [linux-audio-dev] Reborn

2002-08-21 Thread Ingo Oeser

Hi Kjetil,

On Mon, Aug 19, 2002 at 02:42:54PM +0200, Kjetil S. Matheussen wrote:
> Here are some code from ceres to play:
 
Which is wrong in some places.

> 
> struct JackPlayChannel{
>   jack_port_t *output_port;
>   sample_t buffer[BUFFERSIZE];
> };
> 
> struct Jackplay{
>   struct JackPlayChannel jpc[4];
>   jack_client_t *client;
>   struct FFTSound *fftsound;
These four ints must be some kind of atomic variables to make it work.
>   int writeplace;
>   int readplace;
>   int buffersize;
>   int unread;
> };


> /* Consumer */
> 
> int jackprocess (nframes_t nframes, void *arg){
>   int ch,i;
>   struct Jackplay *jackplay=(struct Jackplay *)arg;
>   int numch=jackplay->fftsound->samps_per_frame;
>   sample_t *out[numch];
> 
>   for(ch=0;ch<numch;ch++){
>     out[ch]= (sample_t *) jack_port_get_buffer (jackplay->jpc[ch].output_port, nframes);
>     memset(out[ch],0.0f,nframes*sizeof(sample_t));

This doesn't work and isn't needed. The second argument will be
interpreted as a byte. You're also filling the buffer all the time,
so zero initialisation is not necessary.

>   }
> 
>   for(i=0;i<nframes;i++){
>     if(jackplay->unread==0) break;
> 
>     for(ch=0;ch<numch;ch++){
>       out[ch][i]=jackplay->jpc[ch].buffer[jackplay->readplace];
> }
> jackplay->unread--;
> 
> jackplay->readplace++;
> if(jackplay->readplace==jackplay->buffersize){
>   jackplay->readplace=0;
> }
>   }
> 
>   return 0;
> }
> 
> 
> /* Provider */
> 
> void JackWritePlay(
>struct FFTSound *fftsound,
>void *port,double **samples,int num_samples
>)
> {
>   struct Jackplay *jackplay=(struct Jackplay *)port;
> 
>   int i,ch;
> 
>   for (i=0; i<num_samples; i++){
>     while(jackplay->unread==jackplay->buffersize){
>   usleep(128);
> }
>     for (ch=0; ch<fftsound->samps_per_frame; ch++){
>   jackplay->jpc[ch].buffer[jackplay->writeplace]=(sample_t)samples[ch][i];
> }
> jackplay->unread++;
> jackplay->writeplace++;
> if(jackplay->writeplace==jackplay->buffersize){
>   jackplay->writeplace=0;
> }
>   }
> }

But for pseudocode all of this is ok ;-)
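
On the atomicity point above: one common way out is to let each side own
exactly one index and derive the fill level, instead of sharing an "unread"
counter that both threads modify. A minimal single-producer/single-consumer
sketch (mine, not from the original code; C11 atomics are anachronistic for
this thread but make the intent explicit):

#include <stdatomic.h>
#include <stddef.h>

#define RB_SIZE 4096                       /* power of two so wrap is a mask */

typedef struct {
    float data[RB_SIZE];
    atomic_size_t wpos;                    /* advanced only by the producer */
    atomic_size_t rpos;                    /* advanced only by the consumer */
} ringbuf_t;

/* Producer side (e.g. JackWritePlay): returns 0 if there is not enough room. */
int rb_write(ringbuf_t *rb, const float *src, size_t n)
{
    size_t i;
    size_t w = atomic_load_explicit(&rb->wpos, memory_order_relaxed);
    size_t r = atomic_load_explicit(&rb->rpos, memory_order_acquire);

    if (RB_SIZE - (w - r) < n)
        return 0;                          /* caller waits and retries */
    for (i = 0; i < n; i++)
        rb->data[(w + i) & (RB_SIZE - 1)] = src[i];
    atomic_store_explicit(&rb->wpos, w + n, memory_order_release);
    return 1;
}

/* Consumer side (e.g. the jackprocess callback): returns frames actually read;
 * the caller zero-fills the rest of the JACK buffer on underrun. */
size_t rb_read(ringbuf_t *rb, float *dst, size_t n)
{
    size_t i;
    size_t r = atomic_load_explicit(&rb->rpos, memory_order_relaxed);
    size_t w = atomic_load_explicit(&rb->wpos, memory_order_acquire);
    size_t avail = w - r;

    if (n > avail)
        n = avail;
    for (i = 0; i < n; i++)
        dst[i] = rb->data[(r + i) & (RB_SIZE - 1)];
    atomic_store_explicit(&rb->rpos, r + n, memory_order_release);
    return n;
}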

Regards

Ingo Oeser
-- 
Science is what we can tell a computer. Art is everything else. --- D.E.Knuth