Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread n++k

/wrote Paul Davis <[EMAIL PROTECTED]> [Tue, 23 Jul 2002 19:58:16 -0400]

|the most fundamental problem with SGI's approach to audio+video is
|that its not based on the desirability of achieving low latency for
|realtime processing and/or monitoring. its centered on the playback of
|existing, edited, ready-to-view material. the whole section at the end
|of that page about "pre-queuing" the data makes this point very clear.
|
|SGI tried to solve the problem of the Unix read/write API mismatch
|with realtime streaming media by adding timestamps to data. CoreAudio
|solves it by getting rid of read/write, and acknowledging the inherent
|time-basis of the whole thing, but for some reason keeps timestamps
|around without using them in many cases. JACK follows CoreAudio's
|lead, but gets rid of the timestamps.

My concern is audio+video synchronization:
  Currently, i'm using the audio clock, snd_pcm_status_get_tstamp or
  ioctl(fd, SNDCTL_DSP_GETOPTR, &current_ptr), to provide a timestamp
  at the start of the video processing.
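
  A minimal sketch of what i mean (error handling trimmed; "pcm" is an
  already-open ALSA handle with timestamp mode enabled on it):

    #include <alsa/asoundlib.h>

    /* query the audio clock just before processing a video frame */
    int audio_tstamp(snd_pcm_t *pcm, snd_timestamp_t *ts)
    {
        snd_pcm_status_t *status;
        int err;

        snd_pcm_status_alloca(&status);
        if ((err = snd_pcm_status(pcm, status)) < 0)
            return err;
        snd_pcm_status_get_tstamp(status, ts); /* time of last PCM update */
        return 0;
    }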

  With JACK's api not providing a timestamp, I cannot know
  whether there's any extra buffering/latency
  added once the callback buffer is processed
  (which would be the case with rather braindead hardware or drivers).

  ...  It also requires me to keep a local clock that i would
  empirically correct by one buffer_size, as set by
  the jack_set_buffer_size callback, in the hope that it corresponds
  to the actual delay between processing and the time the
  sound will be heard. This is critical for audio/video
  synchronization, whatever latency the system is aiming for.

  am I at least right in my assumption?





Re: [linux-audio-dev] UST

2002-07-23 Thread Paul Davis

>> i read this a year or two ago. it was a major impetus in the design of JACK,
>> because i believe that almost all of it is based on a complete misconception
>> of how to do this stuff, and some of it is just plain wrong. Its worth
>> noting that SGI's "DM" API has never really taken off, and there are lots of
>> reasons why, some technical, some
>> political.
>
>I was under the impression that SGI had a pretty good grasp on how to do RT
>systems... (http://www.sgi.com/software/react/)

their stuff has never been widely (if at all) used for low latency
real time processing of audio. its not because their OS can't do it
(it can - thats all the REACT stuff sets out to prove). their FRS
scheduler in particular is very cool and probably very very useful for
some applications.

no, despite a fine fairly-firm-RT OS, it doesn't get used this way
because (1) their hardware costs too much (2) their API for audio
doesn't encourage it in any way (3) their API for digital media in
general is confused and confusing. i or someone else posted a link
here a few months ago from someone who worked in the DM group at SGI
which described the problems that have beset their Video API. a
similar set of problems exists for the Audio side of DM, IMHO.

SGI's definition of "realtime" is great when discussing their OS, but
it more or less fades into the background when using the DM API and
UST. 

>> SGI tried to solve the problem of the Unix read/write API mismatch with
>> realtime streaming media by adding timestamps to data. CoreAudio solves it
>> by getting rid of read/write, and acknowledging the inherent time-basis of
>> the whole thing, but for some reason keeps timestamps around without using
>> them in many cases. JACK follows CoreAudio's
>> lead, but gets rid of the timestamps.
>
>Timestamps are needed in hard-realtime and parallel-computing systems,

timestamps are not needed to deal with streaming media when it is
handled without pre-queueing. Pre-queuing destroys latency, and so is
best left to a higher level API and/or the application itself. SGI's
own FRS scheduler is proof of that, and is an interesting vindication
of JACK in some respects in that its essentially the scheduling part
of JACK executed in kernel space.

--p




Re: [linux-audio-dev] UST

2002-07-23 Thread Andy W. Schm

> i read this a year or two ago. it was a major impetus in the design of JACK,
> because i believe that almost all of it is based on a complete misconception
> of how to do this stuff, and some of it is just plain wrong. Its worth
> noting that SGI's "DM" API has never really taken off, and there are lots of
> reasons why, some technical, some
> political.

I was under the impression that SGI had a pretty good grasp on how to do RT
systems... (http://www.sgi.com/software/react/)

> SGI tried to solve the problem of the Unix read/write API mismatch with
> realtime streaming media by adding timestamps to data. CoreAudio solves it
> by getting rid of read/write, and acknowledging the inherent time-basis of
> the whole thing, but for some reason keeps timestamps around without using
> them in many cases. JACK follows CoreAudio's
> lead, but gets rid of the timestamps.

Timestamps are needed in hard-realtime and parallel-computing systems,
something that SGI most certainly has experience with, but that just does
not apply to typical linux workstations.  ... hopefully someday it will.

-andy
---
Andrew W. Schmeder





Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Paul Davis

>[...]
>> UST = Unadjusted System Time
>
>I believe this is a good introduction to UST/MSC:
>
>http://www.lurkertech.com/lg/time/intro.html

i read this a year or two ago. it was a major impetus in the design of
JACK, because i believe that almost all of it is based on a complete
misconception of how to do this stuff, and some of it is just plain
wrong. Its worth noting that SGI's "DM" API has never really taken
off, and there are lots of reasons why, some technical, some
political.

the most fundamental problem with SGI's approach to audio+video is
that its not based on the desirability of achieving low latency for
realtime processing and/or monitoring. its centered on the playback of
existing, edited, ready-to-view material. the whole section at the end
of that page about "pre-queuing" the data makes this point very clear.

SGI tried to solve the problem of the Unix read/write API mismatch
with realtime streaming media by adding timestamps to data. CoreAudio
solves it by getting rid of read/write, and acknowledging the inherent
time-basis of the whole thing, but for some reason keeps timestamps
around without using them in many cases. JACK follows CoreAudio's
lead, but gets rid of the timestamps.

--p



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Paul Davis

>> UST = Unadjusted System Time
>> 
>> I haven't seen any implementations of UST where you could specify a
>> different source of the clock tick than the system clock/cycle timer.
>
>Well, no. Is this needed? The UST should just be an accurate unadjusted
>clock that can be used for timestamping/scheduling events.

UST can be used for timestamping, but thats sort of useless, since the
timestamps need to reflect audio time (see below). UST cannot (on my
understanding) be used for scheduling.

>But JACK doesn't provide timestamps, or does it?

it doesn't timestamp buffers, because i firmly believe that to be an
incorrect design for streamed data. however, it does provide (if a
client volunteers to do this) a time base. see jack/transport.h. the
API is not in the main header because we don't want to frighten
users. most applications don't need this stuff at all.
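
for the curious, a rough sketch of what a client sees through that
time base (i'm using the transport API in more or less the shape it
should settle into, so treat the details as illustrative, not final):

  #include <stdio.h>
  #include <jack/jack.h>
  #include <jack/transport.h>

  /* print the shared transport position from a registered client */
  void show_timebase (jack_client_t *client)
  {
      jack_position_t pos;
      jack_transport_state_t state = jack_transport_query (client, &pos);
      printf ("frame %u rolling %d\n", pos.frame,
              state == JackTransportRolling);
  }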

>
>> >Using UST would also enable syncing to video or some other media
>> >stream without it all residing in the same API.
>> 
>> Something has to connect the UST to the applications, and I have not
>> seen anything in the definition of UST that would put UST in user
>> space.
>
>I don't really understand. For a POSIX system, UST is CLOCK_MONOTONIC. Now
>I know Linux does not yet support this, but it will eventually. Apparently
>adding CLOCK_MONOTONIC to libc will change its ABI.
>If CLOCK_MONOTONIC is accurate enough, then it can be used to sync
>audio/midi/video by associating the performance time (e.g. with audio as
>the master, MSC) with the UST.

CLOCK_MONOTONIC doesn't change the scheduling resolution of the
kernel. its not useful, therefore, in helping with this problem. what
you need is an API that says "do this at time T", and have some
good expectation that the jitter on "T" is very very small (right now,
its generally huge).
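
(the interface part of that already exists in POSIX - a minimal
sketch; the problem is purely the wakeup jitter, not the API:)

  #include <time.h>

  /* "do this at time T": absolute sleep on the monotonic clock.
     on a stock kernel the wakeup jitter can be on the order of the
     10ms timer tick. */
  void do_at (const struct timespec *t, void (*action)(void))
  {
      clock_nanosleep (CLOCK_MONOTONIC, TIMER_ABSTIME, t, NULL);
      action ();
  }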

the "firm timers" patch helps with this immensely (and before it, the
KURT patches did the same thing). they don't involve a change to libc.

--p



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Martijn Sipkema

[...]
> UST = Unadjusted System Time

I believe this is a good introduction to UST/MSC:

http://www.lurkertech.com/lg/time/intro.html


--martijn








Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Vincent Touquet

Yamaha has a large NDA tradition,
making lots of things impossible.

As another example: the filesystem
format of their A series samplers' storage.

It would be so nice if you could mount
these disks in Linux too, but yammy
refuses without an NDA ...

Ask them the question though :)
They have to come up with a clean answer.

regards
vincent



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Martijn Sipkema

[...]
> UST = Unadjusted System Time
> 
> I haven't seen any implementations of UST where you could specify a
> different source of the clock tick than the system clock/cycle timer.

Well, no. Is this needed? The UST should just be an accurate unadjusted
clock that can be used for timestamping/scheduling events.

> >audio by scheduling it on UST. An application doing JACK audio output
> >and MIDI output would most likely estimate the UST for the output buffer
> >using the UST of the input buffer and schedule MIDI messages for that
> >buffer in the callback also. So this then looks much like your proposal.
> >But there is also still the ability to send MIDI messages for
> >immediate transmission.
> 
> Well, thats actually what we have in JACK already, if the client uses
> the ALSA sequencer in another (non-JACK) thread. its precisely the
> design i have in my head for Ardour, for example.

But JACK doesn't provide timestamps, or does it?

> >Using UST would also enable syncing to video or some other media
> >stream without it all residing in the same API.
> 
> Something has to connect the UST to the applications, and I have not
> seen anything in the definition of UST that would put UST in user
> space.

I don't really understand. For a POSIX system, UST is CLOCK_MONOTONIC. Now
I know Linux does not yet support this, but it will eventually. Apparently
adding CLOCK_MONOTONIC to libc will change its ABI.
If CLOCK_MONOTONIC is accurate enough, then it can be used to sync
audio/midi/video by associating the performance time (e.g. with audio as
the master, MSC) with the UST.

So instead of having a MIDI API that lets you schedule MIDI messages on a
dozen different timebases, you can only schedule to UST. The application
will know the relation between the audio time (frames) and UST, and although
both will drift this doesn't really matter for the short period MIDI messages
are scheduled ahead (i.e. << 1s, probably more like 100ms or less).
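
As a sketch of the bookkeeping I mean (names invented, nanosecond UST
assumed):

  /* one correspondence point between audio frames and UST; the
     application refreshes this every process() callback */
  typedef struct {
      long long frame0;     /* frame count at the anchor */
      long long ust0_ns;    /* UST at the anchor */
      double ns_per_frame;  /* 1e9 / sample rate; tracks the drift */
  } frame_ust_map;

  /* estimate the UST at which a given frame will be heard */
  long long frame_to_ust (const frame_ust_map *m, long long frame)
  {
      return m->ust0_ns
           + (long long)((frame - m->frame0) * m->ns_per_frame);
  }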

Or am I missing something?

--martijn











Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Fernando Pablo Lopez-Lezcano

> > What about Yamaha's mLan ?
> > I thought that was some kind of midi over firewire,
> > but maybe they didn't grab it as an opportunity
> > to improve on midi ...
>
> It's midi and audio over firewire.
> I think it's plain vanilla midi messages, though.
> not sure.

mLAN is built on top of the public IEC61883 protocols,
specifically IEC61883-6 which (I believe) specifies audio and
plain midi transport over ieee1394 (aka firewire). It does not
venture into better control protocols, it is just midi. mLAN
adds (at least) word clock (sync and recovery, low jitter and
so on and so forth) and connection management to the public
standards, but it is a proprietary system - you can get a
license only after signing an NDA; I believe an open source
driver would not be possible given those constraints. At a
recent talk about mLAN here at ccrma I asked the question: so,
how would an mLAN product interact with a completely
IEC61883-6 compliant protocol stack in, let's say, linux?
After all, it is built on top of that. They did not know. At
that time it was a theoretical question, now it is a practical
one as I just noticed last week that in 2.4.19-rc2 there is
now an option to compile an IEC61883-6 protocol stack module
for ieee1394 so I'll ask the question again...

-- Fernando



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Sebastien Metrot

There is no royalty fee for mLan. You just have to register with Yamaha to
become a Licensee (the process is lengthy because of the distance; it took
nearly two months for me).

Sebastien

- Original Message -
From: "Paul Davis" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Tuesday, July 23, 2002 9:52 PM
Subject: Re: [linux-audio-dev] App metadata intercomunication protocol..


> >On Tue, Jul 23, 2002 at 02:00:15PM -0300, Juan Linietsky wrote:
> >(cut)
> >>Yes, I agree that midi sucks. I'm wondering why dont we have
> >>a newer protocol by now, but we dont. So there's nothing else
> >>than having to stick to that archaic crap :)
> >(cut)
> >
> >What about Yamaha's mLan ?
> >I thought that was some kind of midi over firewire,
> >but maybe they didn't grab it as an opportunity
> >to improve on midi ...
>
> its not available under a royalty free license. its acceptance has
> thus been much, much slower and less widespread than MIDI.
>
> --p
>




Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Sebastien Metrot

Then you'd have to provide a way to do offline processing too. The clock
source can't be the system clock in that case... (offline processing IS
important for the musicians :-))

Sebastien

- Original Message -
From: "Vincent Touquet" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Tuesday, July 23, 2002 9:31 PM
Subject: Re: [linux-audio-dev] App intercomunication issues, some views.


> On Tue, Jul 23, 2002 at 07:38:25PM +0200, Martijn Sipkema wrote:
> (cut)
> >Using UST would also enable syncing to video or some other media
> >stream without it all residing in the same API.
> (cut)
>
> That would certainly make me very happy :)
>
> vini
>




Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Paul Winkler

On Tue, Jul 23, 2002 at 12:31:13PM +0100, Steve Harris wrote:
> Yes, this is important, it should be possible to mail the tarball to a
> friend who can then start up the project (assuming he has all the apps
> installed of course).

That would be one hell of a mail message. :)
Now, an rsync server...

-- 

Paul Winkler
home:  http://www.slinkp.com
"Muppet Labs, where the future is made - today!"



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Sebastien Metrot

They don't even touch the idea of changing midi. (i'm registered with them
as an mLan Licensee).

Sebastien

- Original Message -
From: "Vincent Touquet" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Tuesday, July 23, 2002 9:30 PM
Subject: Re: [linux-audio-dev] App metadata intercomunication protocol..


> On Tue, Jul 23, 2002 at 02:00:15PM -0300, Juan Linietsky wrote:
> (cut)
> >Yes, I agree that midi sucks. I'm wondering why dont we have
> >a newer protocol by now, but we dont. So there's nothing else
> >than having to stick to that archaic crap :)
> (cut)
>
> What about Yamaha's mLan ?
> I thought that was some kind of midi over firewire,
> but maybe they didn't grab it as an opportunity
> to improve on midi ...
>
> regards
> vincent
>




Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Paul Winkler

On Tue, Jul 23, 2002 at 09:30:10PM +0200, Vincent Touquet wrote:
> What about Yamaha's mLan ?
> I thought that was some kind of midi over firewire,
> but maybe they didn't grab it as an opportunity
> to improve on midi ...

It's midi and audio over firewire.
I think it's plain vanilla midi messages, though.
not sure.

-- 

Paul Winkler
home:  http://www.slinkp.com
"Muppet Labs, where the future is made - today!"



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Paul Davis

>On Tue, Jul 23, 2002 at 02:00:15PM -0300, Juan Linietsky wrote:
>(cut)
>>Yes, I agree that midi sucks. I'm wondering why dont we have 
>>a newer protocol by now, but we dont. So there's nothing else
>>than having to stick to that archaic crap :)
>(cut)
>
>What about Yamaha's mLan ?
>I thought that was some kind of midi over firewire,
>but maybe they didn't grab it as an opportunity
>to improve on midi ...

its not available under a royalty free license. its acceptance has
thus been much, much slower and less widespread than MIDI.

--p



Re: [linux-audio-dev] (no subject)

2002-07-23 Thread Vincent Touquet

On Tue, Jul 23, 2002 at 01:14:18PM -0500, Arthur Peters wrote:
(cut)
>I think there might be problems with option 1 when the apps are running
>on different machines (as was mentioned earlier). Maybe a hybrid would
>work: provide an API for each app to pass its data to the project
>server. This data could be anything, XML, binary of some sort, whatever.
>The project server could then store that data any way it wanted, the
>simplest being a group of files in a directory. The apps wouldn't have
>to change their data format, only replace their file IO with this API,
>and we would have some choice of backends.
(cut)

I think this is the only reasonable approach.
Have some interface in each program to set
and retrieve state information.
Programs that implement the interface benefit
from the added advantage a user gets from using
that program with some sort of state daemon.

I would vote for this solution if I could :)

vini



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Vincent Touquet

On Tue, Jul 23, 2002 at 02:00:15PM -0300, Juan Linietsky wrote:
(cut)
>Yes, I agree that midi sucks. I'm wondering why dont we have 
>a newer protocol by now, but we dont. So there's nothing else
>than having to stick to that archaic crap :)
(cut)

What about Yamaha's mLan ?
I thought that was some kind of midi over firewire,
but maybe they didn't grab it as an opportunity
to improve on midi ...

regards
vincent



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Vincent Touquet

On Tue, Jul 23, 2002 at 07:38:25PM +0200, Martijn Sipkema wrote:
(cut)
>Using UST would also enable syncing to video or some other media
>stream without it all residing in the same API.
(cut)

That would certainly make me very happy :)

vini



Re: [linux-audio-dev] (no subject)

2002-07-23 Thread Arthur Peters

On Tue, 2002-07-23 at 05:11, .n++k wrote:
> 
> Anyway, as a summary, what is to be solved: 
>   . defining, naming, identifying a session (= project)
>   . communicating with the apps to request loading and saving
>   session associated data.
>   . 1. each app is passed enough knowledge to store their state themselves
> or
> 2. an api and library is designed that can abstract finding the location
> of the project itself, how to store the data etc..
> 
> I fear that 2. would be a complete reinvention of things that have been done many 
>times before (dbms) and thus would either result in: 
>   . not being done
>   . being done, incompletely
> 
> I'd prefer 1. although it freezes the backend.
> 

I think there might be problems with option 1 when the apps are running
on different machines (as was mentioned earlier). Maybe a hybrid would
work: provide an API for each app to pass its data to the project
server. This data could be anything, XML, binary of some sort, whatever.
The project server could then store that data any way it wanted, the
simplest being a group of files in a directory. The apps wouldn't have
to change their data format, only replace their file IO with this API,
and we would have some choice of backends.
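
Just to make the shape of that API concrete, something like this (all
names invented, of course):

  #include <stddef.h>

  typedef struct proj_session proj_session;

  /* attach to the project server for a named project */
  proj_session *proj_open (const char *project);

  /* hand the server an opaque blob of state - XML, binary, whatever */
  int proj_store (proj_session *s, const char *key,
                  const void *data, size_t len);

  /* get it back on restore */
  int proj_fetch (proj_session *s, const char *key,
                  void **data, size_t *len);

  void proj_close (proj_session *s);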

-Arthur





Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Paul Davis

>> >Why not have a separate API/daemon for MIDI and
>> >have it and JACK both use the same UST timestamps?
>> 
>> you can't use any timestamp unless its derived from the clock master,
>> which UST by definition almost never is. the clock on your PC doesn't
>> run in sync with an audio interface, and will lead to jitter and drift
>> if used for general timing.
>
>The mapping between UST and audio time (frames) is continuously updated.
>There is no need for the UST to be the master clock. If JACK would

UST = Unadjusted System Time

I haven't seen any implementations of UST where you could specify a
different source of the clock tick than the system clock/cycle timer.

>audio by scheduling it on UST. An application doing JACK audio output
>and MIDI output would most likely estimate the UST for the output buffer
>using the UST of the input buffer and schedule MIDI messages for that
>buffer in the callback also. So this then looks much like your proposal.
>But there is also still the ability to send MIDI messages for
>immediate transmission.

Well, thats actually what we have in JACK already, if the client uses
the ALSA sequencer in another (non-JACK) thread. its precisely the
design i have in my head for Ardour, for example.

>Using UST would also enable syncing to video or some other media
>stream without it all residing in the same API.

Something has to connect the UST to the applications, and I have not
seen anything in the definition of UST that would put UST in user
space.

--p



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Paul Winkler

On Tue, Jul 23, 2002 at 06:21:42PM +0100, Steve Harris wrote:
> On Tue, Jul 23, 2002 at 12:42:38 -0400, Paul Winkler wrote:
> > I think it's stretching things a bit to refer to
> > sfront as a midi sequencer. It *can* produce midi output...
> > but that's it.  no real user interface, no way to save or restore
> > state...
> 
> Er, vi foo.orc?
> 
> What better UI could you ask for ;)

Works for me. :)
But most people aren't used to an "edit, run sfront, run gcc, run
executable" cycle in their composing work.  It's not exactly 
RCD (Rapid Composition Development (TM)).

-- 

Paul Winkler
home:  http://www.slinkp.com
"Muppet Labs, where the future is made - today!"



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Paul Winkler

On Tue, Jul 23, 2002 at 02:00:15PM -0300, Juan Linietsky wrote:
> Yes, I agree that midi sucks. I'm wondering why dont we have 
> a newer protocol by now, but we dont. So there's nothing else
> than having to stick to that archaic crap :)

Because the hardware music device market moves a lot more
slowly than the software world. They're still selling samplers
that take SIMM memory upgrades, for crying out loud.

It's hard to get the big companies to agree on anything. To agree on
a replacement for MIDI? It'll happen someday, but who knows
how many more years...  and of course there may be power struggles
a la DVD-A vs. SACD, or the better-known VHS vs Betamax...

-- 

Paul Winkler
home:  http://www.slinkp.com
"Muppet Labs, where the future is made - today!"



Re: [linux-audio-dev] (no subject)

2002-07-23 Thread Paul Winkler

On Tue, Jul 23, 2002 at 12:43:32PM +0100, Steve Harris wrote:
> One hard thing is how to deal with communication interdependencies, eg.
> app A talks jack to app B and app B talks alsa sequencer to app A.

Why is this hard?

ensure ALSA is up
start jackd
start jack client 1
start jack client 2
...
run jackconnect with appropriate args to connect clients
...
start alsa seq. client 1
start alsa seq. client 2
run aconnect with appropriate args


You know, this session manager thing is starting to sound
more and more like make. :)

-- 

Paul Winkler
home:  http://www.slinkp.com
"Muppet Labs, where the future is made - today!"



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Martijn Sipkema

> >Does that mean that MIDI output can only be done from a callback? 
> 
> No, it would mean that MIDI is only actually delivered to a timing
> layer during the callback. Just as with the ALSA sequencer and with
> audio under JACK, the application can queue up MIDI at any time, but
> its only delivered at specific points in time. Obviously, pre-queuing
> leads to potential latency problems (e.g. if you queue a MIDI volume
> change 1 second ahead of time, then the user alters it during that 1
> second, you've got problems).

The only problem i have with this is latency. For applications that
only do MIDI output this is fine. For a software synth, only taking
MIDI input, there is also no extra latency, since you already need
queuing to avoid jitter.

For a complex MIDI application that does MIDI input -> MIDI output,
this adds latency. I am working at the moment on a low level MIDI I/O
API and a daemon for IPC routing. This will support sending either
immediate or scheduled MIDI messages. It will probably still take some
time to get a working version. All scheduling is done to UST
(although Linux does not support this yet).

[...]
> >Why not have a separate API/daemon for MIDI and
> >have it and JACK both use the same UST timestamps?
> 
> you can't use any timestamp unless its derived from the clock master,
> which UST by definition almost never is. the clock on your PC doesn't
> run in sync with an audio interface, and will lead to jitter and drift
> if used for general timing.

The mapping between UST and audio time (frames) is continuously updated.
There is no need for the UST to be the master clock. If JACK would
provide on every process() callback a UST time for the first frame
of the (input) buffer, then MIDI could be very accurately synced to JACK
audio by scheduling it on UST. An application doing JACK audio output
and MIDI output would most likely estimate the UST for the output buffer
using the UST of the input buffer and schedule MIDI messages for that
buffer in the callback also. So this then looks much like your proposal.
But there is also still the ability to send MIDI messages for
immediate transmission.

Using UST would also enable syncing to video or some other media
stream without it all residing in the same API.


--martijn










Re: [linux-audio-dev] (no subject)

2002-07-23 Thread Paul Winkler

On Tue, Jul 23, 2002 at 12:11:23PM +0200, .n++k wrote:
> But aren't musicians a different kind of users than desktop users?

how so? more technical? I don't really think so.

we just had a guy on linux-audio-user who says he's been 
familiar with unix since at least the early eighties, 
and he's been a technical writer for 20 years. So you'd
think he'd be a prime candidate for getting linux up and running.
But after two weeks and 70 or so messages on the list, 
not only did he not get his USB audio working, his
previously-working pcmcia modem stopped working, and then
he somehow (I've no idea how) trashed his /etc/fstab
and finally gave up and went back to windows.

Feel free to disagree, but I think adding a dependency on
a full-featured RDBMS is a step in the wrong direction.

-- 

Paul Winkler
home:  http://www.slinkp.com
"Muppet Labs, where the future is made - today!"



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Steve Harris

On Tue, Jul 23, 2002 at 12:42:38 -0400, Paul Winkler wrote:
> I think it's stretching things a bit to refer to
> sfront as a midi sequencer. It *can* produce midi output...
> but that's it.  no real user interface, no way to save or restore
> state...

Er, vi foo.orc?

What better UI could you ask for ;)

- Steve 



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Juan Linietsky

On Tue, 23 Jul 2002 11:28:01 +0200
Vincent Touquet <[EMAIL PROTECTED]> wrote:

> On Tue, Jul 23, 2002 at 02:30:22AM -0300, Juan Linietsky wrote:
> >ok, but the question is, what for? What else do you need other than
> >start/stop/seek ? doesnt midi provide that already? then why
> >something else?
> >Also using midi you make sure that what you do is synced to
> >external devices...
> 
> You can use a protocol which hasn't the limitations
> of midi (7 bit CC data :() and truncate it on the ends
> that need midi (filter the finer dynamic range
> down to the coarser range).

Actually Midi CCs 0-31 are 14 bits, with the LSBs coming from CCs 32-63
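
i.e. roughly (a quick sketch):

  /* combine the coarse (CC 0-31) and fine (CC 32-63) data bytes
     into the full 14 bit controller value */
  unsigned int cc14 (unsigned char coarse, unsigned char fine)
  {
      return ((unsigned int)(coarse & 0x7f) << 7) | (fine & 0x7f);
  }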

> 
> Of course this brings issues about the different
> dynamic ranges used and I think you'll hear it.
> 
> I just don't think there is any good reason
> to use the restrictions imposed by midi
> when you stay inside your computer
> (it is not as if you need to be able
>  to pass all your data on a 33600 baud
>  cable, is it ?)
> 

Yes, I agree that midi sucks. I'm wondering why dont we have 
a newer protocol by now, but we dont. So there's nothing else
than having to stick to that archaic crap :)

> regards
> vini
> 

Juan Linietsky



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Paul Winkler

On Tue, Jul 23, 2002 at 03:58:40PM +0100, Steve Harris wrote:
> On Tue, Jul 23, 2002 at 10:23:40 -0400, Paul Davis wrote:
> > for me, its more than tempting. none of the existing ALSA MIDI
> > sequencers have JACK support (that i know of). MusE is the only MIDI
> 
> The only ones I know of are iiwusynth and sfront.

I think it's stretching things a bit to refer to
sfront as a midi sequencer. It *can* produce midi output...
but that's it.  no real user interface, no way to save or restore
state...

now John Lazzaro will appear and tell me I'm wrong :)

--PW


Paul Winkler
home:  http://www.slinkp.com
"Muppet Labs, where the future is made - today!"



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Paul Winkler

On Tue, Jul 23, 2002 at 11:24:03AM +0200, Vincent Touquet wrote:
> On Tue, Jul 23, 2002 at 01:17:53AM -0400, Paul Winkler wrote:
> >actually zipi and skini, too.
> >skini has something to do with perry cook's STK.
> 
> I thought skini is just readable MIDI ?
> Hit me if I'm wrong though.

it's mostly backwards compatible with MIDI, e.g. pitch of 60
means the same thing; but skini allows floats too, so you could
have pitch of 60.09250194 ...

-- 

Paul Winkler
home:  http://www.slinkp.com
"Muppet Labs, where the future is made - today!"



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Paul Davis

>Does that mean that MIDI output can only be done from a callback? 

No, it would mean that MIDI is only actually delivered to a timing
layer during the callback. Just as with the ALSA sequencer and with
audio under JACK, the application can queue up MIDI at any time, but
its only delivered at specific points in time. Obviously, pre-queuing
leads to potential latency problems (e.g. if you queue a MIDI volume
change 1 second ahead of time, then the user alters it during that 1
second, you've got problems).

>  Is the
>callback the same one as for the audio hardware? MIDI being sporadic just
>doesn't seem to fit in JACK. 

i agree that its not a natural fit. i'm trying to find some reasonably
natural way to handle it, one that is closer to the level of
abstraction that JACK uses than the one used by the ALSA
sequencer. its quite possible that this isn't possible.

>  Why not have a separate API/daemon for MIDI and
>have it and JACK both use the same UST timestamps?

you can't use any timestamp unless its derived from the clock master,
which UST by definition almost never is. the clock on your PC doesn't
run in sync with an audio interface, and will lead to jitter and drift
if used for general timing.
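
a trivial sketch of the effect: after N frames, the audio interface
has consumed N / rate seconds of *its* time, and the gap against the
system clock just accumulates:

  /* seconds of accumulated drift between the system clock and the
     audio clock, given elapsed system time and frames delivered */
  double drift (double system_elapsed_sec, long long frames,
                double nominal_rate)
  {
      return system_elapsed_sec - (double)frames / nominal_rate;
  }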

the separate API may well be necessary, but i would hope that we can
eventually converge on a single daemon (e.g. ALSA sequencer moves into
user space as a library+reference implementation of the daemon; then
jackd runs as the daemon).

--p



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Paul Davis

>I am not sure if this is the right way to go, but then I've never used the
>alsa sequencer API, and write no midi software anymore.
>
>What about the maximum jack buffer size issue. Is it reasonable to make
>all the apps do their own limiting?

thats where the work comes in. the "read" side of the proposed
handling of MIDI is easy; the write side needs to collect data from
the ALSA sequencer and store it in shm or something like that. its all
a bit murky right now. there is no maximum buffer size, however, other
than system limits.

>Also, I suspect even more MIDI apps will be read/write based, and its more
>reasonable for them to be, IMHO; that is actually how MIDI works at the low
>level, unlike PCM soundcards.

actually, i think its the opposite. most MIDI apps need to emit MIDI
data at specific times (hence the early development of the OSS
sequencer and now the ALSA one). write(2) of raw MIDI data doesn't
work (by itself) for this - the applications either have to do their
own timing, or they need to deliver event structures to some API that
does the timing for them.

for apps that use audio+MIDI, they will generally want to use the same
timebase for this. with ALSA right now, you can do this, but its not
trivial to set up, and has been subject to some weird bugs which are
hopefully now fixed. if you do this at the JACK level (or if the ALSA
seq was in user space, or something), i think this gets very easy to
implement. 

a typical MIDI client that does MIDI output really wants to use a
queued write, which is more or less exactly what JACK's callback-based
handling of audio provides. the major problem is that the granularity
of the MIDI timestamp is finer than the audio's, because of the
block/chunked handling of audio.
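
one plausible way to keep sample granularity (a sketch, nothing
settled) is to stamp each event with a frame offset inside the
current block:

  typedef struct {
      unsigned int frame_offset;  /* 0 .. nframes-1 in this cycle */
      unsigned char data[3];      /* the raw MIDI bytes */
  } midi_event;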

--p



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Paul Davis

>> (some people have suggested that Rosegarden
>> does audio too, but it seems to me that its firmly oriented towards
>> MIDI).
>
>*cough*

syrup duly swallowed :)

--p



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Steve Harris

On Tue, Jul 23, 2002 at 10:23:40 -0400, Paul Davis wrote:
> for me, its more than tempting. none of the existing ALSA MIDI
> sequencers have JACK support (that i know of). MusE is the only MIDI

The only ones I know of are iiwusynth and sfront.

I am not sure if this is the right way to go, but then I've never used the
alsa sequencer API, and write no midi software anymore.

What about the maximum jack buffer size issue. Is it reasonable to make
all the apps do their own limiting?

Also, I suspect even more MIDI apps will be read/write based, and its more
reasonable for them to be, IMHO; that is actually how MIDI works at the low
level, unlike PCM soundcards.

- Steve



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Martijn Sipkema

On 23.07.2002 at 15:35:58, Paul Davis <[EMAIL PROTECTED]> wrote:

> >On Tue, Jul 23, 2002 at 07:48:45 -0400, Paul Davis wrote:
> >> the question is, however, to what extent is it worth it. the reason
> >> JACK exists is because there was nothing like it available for moving
> >> audio data around. this isn't true of the MIDI side of things, where
> >
> >If you actually want to deal with raw MIDI (you'd be mad, but...) then its
> >OK, as the maximum amount of data per jack unit time is pretty small, but
> >I agree, it's better dealt with via the alsa api.
> 
> well, what i was thinking was something more like this:
> 
>   struct jack_midi_buffer_t {
>       unsigned int event_cnt;
>       WhateverTheALSASequencerEventTypeIsCalled events[0];
>   };
> 
> a JACK client that handles MIDI as well would do something like this
> in its process() callback:
> 
>   jack_midi_buffer_t* buf = (jack_midi_buffer_t*)
>       jack_port_get_buffer (midi_in_port, nframes);
> 
>   for (n = 0; n < buf->event_cnt; ++n) {
>       process_midi_event (buf->events[n]);
>   }
> 
> a real client would probably look at the timestamps in the events too.

Does that mean that MIDI output can only be done from a callback? Is the
callback the same one as for the audio hardware? MIDI being sporadic just
doesn't seem to fit in JACK. Why not have a separate API/daemon for MIDI and
have it and JACK both use the same UST timestamps?

--martijn








Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Richard Bown

Paul Davis wrote:

> for me, its more than tempting. none of the existing ALSA MIDI
> sequencers have JACK support (that i know of).

Apart from Rosegarden-4.

> (some people have suggested that Rosegarden
> does audio too, but it seems to me that its firmly oriented towards
> MIDI).

*cough*

Rosegarden-4 has JACK and ALSA support - the last point release shipped
with pretty much a proof-of-concept JACK driver, but the latest CVS
has decent enough (WAV) audio support through JACK and audio file
management.

(We also have aRts audio and MIDI support for what it's worth)

RG-4 "does" both MIDI and audio (full duplex through JACK).  We do
notation pretty well at the moment, our matrix/piano roll is a bit
lacking but our audio is definitely getting there (currently using
an external audio editor such as audacity for waveform editing).
The canvas audio segment editing is also getting pretty good -
splitting, copying, repeating and moving audio segments - all with
multi-level undo.

If you want to play around with JACK on Rosegarden I suggest you dig
it out of CVS.

B
-- 
http://www.all-day-breakfast.com/rosegarden
http://www.bownie.com



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Vincent Touquet

On Tue, Jul 23, 2002 at 02:09:32PM +0100, Steve Harris wrote:
(cut)
>Yes, it's terrible. I remember hearing from someone a year or so ago
>who was in charge of cleaning up the source. I never heard any more though.

Well he had to clean it up,
I guess he just escaped and ran away ;)

vini



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Paul Davis

>On Tue, Jul 23, 2002 at 09:23:27 -0400, Paul Davis wrote:
>> >If you actually want to deal with raw MIDI (you'd be mad, but...) then its
>> >OK, as the maximum amount of data per jack unit time is pretty small, but
>> >I agree, it's better dealt with via the alsa api.
>> 
>>   struct jack_midi_buffer_t {
>>       unsigned int event_cnt;
>>       WhateverTheALSASequencerEventTypeIsCalled events[0];
>>   };
>
>Right, but it seems silly when there are so many alsa sequencer apps out
>there. Though, having the audio and MIDI unified in one connection API
>does seem tempting.

for me, its more than tempting. none of the existing ALSA MIDI
sequencers have JACK support (that i know of). MusE is the only MIDI
sequencer with somewhat functional audio sequencing, and it doesn't
yet have JACK support (some people have suggested that Rosegarden
does audio too, but it seems to me that its firmly oriented towards
MIDI). this means that when/if those apps switch to using JACK, it
would be reasonable to have them use it for MIDI as well. and note:
the above API takes care, just as the audio side does, of a whole heap
of issues and handling. the "events" array would contain only the
events needed to be dealt with during the current time slice; they are
neatly packaged as an array; no connection setup with the sequencer is
done by the JACK client application code.

to me, its fairly compelling. the ALSA sequencer would of course
underlie it.

finally, i suspect that there are fewer ALSA sequencer apps than ALSA
PCM apps. the latter number didn't act as a large disincentive to
develop JACK ...

--p



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Steve Harris

On Tue, Jul 23, 2002 at 09:23:27 -0400, Paul Davis wrote:
> >If you actually want to deal with raw MIDI (you'd be mad, but...) then its
> >OK, as the maximum amount of data per jack unit time is pretty small, but
> >I agree, it's better dealt with via the alsa api.
> 
>   struct jack_midi_buffer_t {
>       unsigned int event_cnt;
>       WhateverTheALSASequencerEventTypeIsCalled events[0];
>   };

Right, but it seems silly when there are so many alsa sequencer apps out
there. Though, having the audio and MIDI unified in one connection API
does seem tempting.

- Steve



Re: [linux-audio-dev] (no subject)

2002-07-23 Thread Steve Harris

On Tue, Jul 23, 2002 at 01:14:02 +0100, Bob Ham wrote:
> On Tue, 2002-07-23 at 09:45, .n++k wrote:
> 
> > that's 15M, of which 5M (mysql-test. sql-bench) are useless
> > 
> > i would hardly call that big
> 
> 28K   /usr/local/include/ladspa.h

And that includes comments:

$ cat /usr/include/ladspa.h | gcc -E - | grep -v ^\# | grep -v ^$ | wc -c
1565
$

- Steve



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Paul Davis

>On Tue, Jul 23, 2002 at 07:48:45 -0400, Paul Davis wrote:
>> the question is, however, to what extent is it worth it. the reason
>> JACK exists is because there was nothing like it available for moving
>> audio data around. this isn't true of the MIDI side of things, where
>
>If you actually want to deal with raw MIDI (you'd be mad, but...) then its
>OK, as the maximum amount of data per jack unit time is pretty small, but
>I agree, it's better dealt with via the alsa api.

well, what i was thinking was something more like this:

  struct jack_midi_buffer_t {
      unsigned int event_cnt;
      WhateverTheALSASequencerEventTypeIsCalled events[0];
  };

a JACK client that handles MIDI as well would do something like this
in its process() callback:

  jack_midi_buffer_t* buf = (jack_midi_buffer_t*)
      jack_port_get_buffer (midi_in_port, nframes);

  for (n = 0; n < buf->event_cnt; ++n) {
      process_midi_event (buf->events[n]);
  }

a real client would probably look at the timestamps in the events too.

--p



Re: [linux-audio-dev] (no subject)

2002-07-23 Thread .n++k

From: Bob Ham <[EMAIL PROTECTED]>
Subject: Re: [linux-audio-dev] (no subject)
Date: 23 Jul 2002 13:14:02 +0100
Message-ID: <1027426442.2811.6.camel@insanity>

node> On Tue, 2002-07-23 at 09:45, .n++k wrote:
node> 
node> > that's 15M, of which 5M (mysql-test. sql-bench) are useless
node> > 
node> > i would hardly call that big
node> 
node> 28K   /usr/local/include/ladspa.h
node> 
node> It's big.  Don't get over-zealous here.  Complexity is the inverse of
node> reliability, remember.

I don't think i'm being zealous. And this was just a proposal
anyway... But if I need to defend myself, i'd like to point out that:

111M    testrun.wav

and that's just 10min worth of stereo audio

I already said the size matter was hypocrisy. 







Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Steve Harris

On Tue, Jul 23, 2002 at 07:48:45 -0400, Paul Davis wrote:
> the question is, however, to what extent is it worth it. the reason
> JACK exists is because there was nothing like it available for moving
> audio data around. this isn't true of the MIDI side of things, where

If you actually want to deal with raw MIDI (you'd be mad, but...) then its
OK, as the maximum amount of data per jack unit time is pretty small, but
I agree, it's better dealt with via the alsa api.

> parameters controls could be easily handled by OSC, which could be
> integrated into JACK fairly easily. the problem with OSC is that the
> reference implementation is probably the worst piece of coding i have
> ever seen outside of the internals of Csound. i was very keen to add

Yes, it's terrible. I remember hearing from someone a year or so ago
who was in charge of cleaning up the source. I never heard any more though.

- Steve



Re: [linux-audio-dev] (no subject)

2002-07-23 Thread Bob Ham

On Tue, 2002-07-23 at 09:45, .n++k wrote:

> that's 15M, of which 5M (mysql-test. sql-bench) are useless
> 
> i would hardly call that big

28K /usr/local/include/ladspa.h

It's big.  Don't get over-zealous here.  Complexity is the inverse of
reliability, remember.

-- 
Bob Ham: [EMAIL PROTECTED]  http://pkl.net/~node/



Re: [linux-audio-dev] (no subject)

2002-07-23 Thread Paul Davis

>Well i don't see it as unnecessary, just for the reason that you NEED
>a database. so any objection is either a practical one or a theoretical
>one. Maybe the hidden objection is that it's too complicated for an
>application writer to relationalize his application's data
>model.. Myself i find it difficult to xml-ize data sometimes just for
>the fact that it freezes a certain hierarchy.

there is one important difference.

when you decide that its necessary to change the schema of a
relational DB, its a major change in the setup of things. when you
decide to change the DTD used for an XML file, you haven't actually
done very much, certainly not if the code that uses the results of
parsing is well written. 

both methods freeze a given ordering and relationship between the
parts, but the XML one freezes it only till its a bit mushy, the RDB
freezes it till its just about solid.

anyway, i think, as some others have said, that both XML and RDB
choices are completely irrelevant to the discussion unless someone
proposes that a single agent collects the state information and stores it
in a single location (i.e. a file or a table). what format or
methodology a given participant application in the project uses to
store its state is not relevant if all the system does is to provide a
standard way to load, restore and collect state together for distribution.

--p



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Paul Davis

>On Mon, 2002-07-22 at 22:11, Phil Kerr wrote:
>> One of the strong points of Linux, and UNIX in general, is the way small
>> app's can be strung together, this is the most flexible approach.
>
>This is true, but only of apps that can fit in to the way that Unix does
>its stringing together; ie, filestream-based apps.  It is indeed
>advantageous to bend the way you work into this in many instances when
>you're trying to get a computer to do work.  Unfortunately, audio work
>isn't one of them.  There is no such framework to provide similar
>functionality to audio programs, yet.
>
>Jack provides an excellent transport for audio, but that is rarely the
>only thing that audio programs require; MIDI and parameter controls are
>also needed more often than not.  This is, to some degree, what I'm
>attempting to provide with the app (or api, if you like) that I'm
>working on.

i would like to note that JACK *was* designed with the intent of
handling non-audio data as well. there are challenges in making it
work, particularly data that isn't streaming such as MIDI
(i.e. non-constant quantities of data in a given time period). but i
believe these challenges can be overcome relatively easily by anyone
with the motivation to do so.

the question is, however, to what extent is it worth it. the reason
JACK exists is because there was nothing like it available for moving
audio data around. this isn't true of the MIDI side of things, where
the ALSA sequencer, despite not being in user space and despite having
a complex and not particularly simple API (as befits a powerful
system), does the job of routing and sharing MIDI pretty well. the
task of simplifying MIDI I/O in a way analogous to what JACK does for
audio (everything is a port, all sample streams are 32 bit float, etc)
doesn't seem very obvious in its goals, at least to me.

parameter controls could be easily handled by OSC, which could be
integrated into JACK fairly easily. the problem with OSC is that the
reference implementation is probably the worst piece of coding i have
ever seen outside of the internals of Csound. i was very keen to add
OSC support to Ardour, but when I saw what I was working with, i gave
up the idea. if someone produced a clean, well written implementation
of OSC, it would help many people immensely. it could be used instead
of LCP, for example, which would probably be a good thing.

--p



Re: [linux-audio-dev] (no subject)

2002-07-23 Thread Steve Harris

On RDBMSs

Look, I like SQL as much as the next man, but its really not appropriate
for storing application state. RDBMSs have loads of features that we just
dont need for this.

Most applications already handle saving (some of) their state to a file,
we just need to define a way to instruct them to do it on demand, into a
particular directory (socket, fifo, signal+/tmp, whatever).

When they want to be restarted they can be given the file as an argument.

Simple.

One hard thing is how to deal with communication interdependencies, eg.
app A talks jack to app B and app B talks alsa sequencer to app A.

- Steve



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Steve Harris

On Tue, Jul 23, 2002 at 04:56:06 -0300, Juan Linietsky wrote:
> > > Why not use an SQL database for storing session/project metadata?
> > > (configuration and such) We have the benefit of having a few quite
> > > stable free software SQL databases. (mysql, postgresql, sapdb) so
> > > requiring one wouldn't be too much to ask.
> > 
> > except they're big to install. that might cause some resistance.
> > and it strikes me as using a piledriver when a hammer would do.
> 
> And also you cant do the neat thing of asking all your apps to save
> all their data
> to a directory so you can create a tar.gz with the project :)

Yes, this is important, it should be possible to mail the tarball to a
friend who can then start up the project (assuming he has all the apps
installed of course).

- Steve



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread janne halttunen

Hi all,

I've been lurking on this list for a while.  I have worked mainly on various 
frontends for ecasound.  Currently I am working on a curses-based
graphical wave-editor. 


On Mon, 22 Jul 2002 22:27:31 -0300
Juan Linietsky <[EMAIL PROTECTED]> wrote:

> I'm talking about a "Project" manager; this hasnt been done before,
> and it wasnt
> done before because no apis existed which would provide such an amount of
> modularity (like jack/alsa),
> and because in commercial OSs it's not usual for programs to cooperate
> so much... so
> this is basically new ground. the tasks involve:
> 
> -Saving current status of the apps, files open, data, etc, into a file
> you choose, or even
> be smart enough to tell your apps "save all your stuff here in this
> dir" and it does a nice tar.gz
> with your project in case you want to back it up. 
> -Loading current status of the apps, from just telling them filenames,
> or just giving it the files.

A super-simple solution to this would be using a combination of command-line
parameters and signals.

this would involve steps:

-agree on cmdline-switch to create a new file.
-agree on cmdline-switch to load an existing file.
-agree on signal to be used to save state to that given file. (SIGTERM?)

Of course the project-manager program could maintain a DB of the different load and
new switches for different programs, but at least the signal handling should be
somewhat standardised.
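
The signal side could be as simple as this (a sketch; SIGUSR1 stands in
for whatever signal would actually be agreed on):

  #include <signal.h>

  static volatile sig_atomic_t save_requested = 0;

  static void on_save(int sig) { (void)sig; save_requested = 1; }

  /* in main(): signal(SIGUSR1, on_save); then check save_requested in
     the event loop and write the state file when it is set - never do
     the actual saving inside the handler itself. */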


janne



Re: [linux-audio-dev] (no subject)

2002-07-23 Thread Sebastien Metrot

Using a DB for such a job has no real advantage because saving a project is
just the beginning of the story. After having been able to save your project
you'll want a simple way to back it up and to move it to another
computer (for example from a laptop computer to a desktop computer in the
studio, etc...). A real database is not an easy thing to move from one
computer to another, and a set of files is much simpler for the user (try
to think about the user sometimes ;-).

Sebastien

- Original Message -
From: "rm" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Tuesday, July 23, 2002 11:33 AM
Subject: Re: [linux-audio-dev] (no subject)


> On Tue, Jul 23, 2002 at 10:45:55AM +0200, .n++k wrote:
> > | > > Why not use an SQL database for storing session/project metadata?
> > | > > (configuration and such) We have the benefit of having a few quite
> > | > > stable free software SQL databases. (mysql, postgresql, sapdb) so
> > | > > requiring one wouldn't be too much to ask.
> >
> > Well it would be just as easy to me to have a small util with a dump
> > of the database.. as would any other manipulation/analysis on the
> > projects data be. Backup are easy too.
>
> requiring an sql db for this sort of thing would be gratuitous. for
> one, it's another dependency that people would have to download,
> compile, and setup. for most databases, the setup is non-trivial. (i
> can't run my audio app, because i can't figure out how to setup my
> database?) secondly, it's unnecessary as i understand the requirements
> of the task. you might be able to justify a smaller database like
> sleepycat's stuff, but files are more reasonable and pretty much
> everyone has a filesystem these days. you just don't need the majority
> of the features that a relational db provides.
>
> gconf2 might be an appropriate model to look at for ideas. api calls
> for accessing the information are defined, and the backend is left
> unspecified (but likely fs based initially). additionally, swig,
> corba, etc can be used to provide bindings of the api to most
> languages.
> rob
> 
> Robert Melby
> Georgia Institute of Technology, Atlanta Georgia, 30332
> uucp: ...!{decvax,hplabs,ncar,purdue,rutgers}!gatech!prism!gt4255a
> Internet: [EMAIL PROTECTED]
>




Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Taybin Rutkin

On Tue, 23 Jul 2002, Vincent Touquet wrote:

> On Tue, Jul 23, 2002 at 12:07:43AM -0400, Paul Winkler wrote:
> >Does this help?
> >http://developer.gnome.org/arch/sm/extension.html
> (cut)
> >"_GSM_Priority
> (cut)
> 
> So there would be a dependency on gnome-session-manager
> (and what else ?)

gnome-session has no other dependencies.




Re: [linux-audio-dev] (no subject)

2002-07-23 Thread .n++k

From: rm <[EMAIL PROTECTED]>
Subject: Re: [linux-audio-dev] (no subject)
Date: Tue, 23 Jul 2002 05:33:56 -0400
Message-ID: <[EMAIL PROTECTED]>

async> On Tue, Jul 23, 2002 at 10:45:55AM +0200, .n++k wrote:
async> > | > > Why not use an SQL database for storing session/project metadata?
async> > | > > (configuration and such) We have the benefit of having a few quite
async> > | > > stable free software SQL databases. (mysql, postgresql, sapdb) so
async> > | > > requiring one wouldn't be too much to ask.
async> > 
async> > Well it would be just as easy to me to have a small util with a dump
async> > of the database.. as would any other manipulation/analysis on the
async> > projects data be. Backup are easy too.
async> 
async> requiring an sql db for this sort of thing would be gratuitous. for
async> one, it's another dependency that people would have to download,
async> compile, and setup. for most databases, the setup is non-trivial. (i
async> can't run my audio app, because i can't figure out how to setup my
async> database?)

That would be an amazingly bad design if the db persistence was a
requirement for starting/using the apps. I just think the cost of
having a dbms on a desktop linux machine is really low. We all have
a different linux experience, which is part of the fun of it.

But aren't musicians a different kind of users than desktop users?

async> secondly, it's unnecessary as i understand the requirements
async> of the task. you might be able to justify a smaller database like
async> sleepycat's stuff, but files are more reasonable and pretty much
async> everyone has a filesystem these days. you just don't need the majority
async> of the features that a relational db provides.

I said sql dbms, pointing at the fact that it would be the simplest of that
family of databases (like that primitive mysql is... but nobody would
be forbidden to upgrade to a real relational one).

Well i don't see it as unnecessary, just for the reason that you NEED
a database. so any objection is either a practical one or a theoretical
one. Maybe the hidden objection is that it's too complicated for an
application writer to relationalize his application's data
model.. Myself i find it difficult to xml-ize data sometimes just for
the fact that it freezes a certain hierarchy.

async> 
async> gconf2 might be an appropriate model to look at for ideas. api calls
async> for accessing the information are defined, and the backend is left
async> unspecified (but likely fs based initially). additionally, swig,
async> corba, etc can be used to provide bindings of the api to most
async> languages.

My preference probably comes from the fact that I'd far rather install a
rather stable (API-wise) database server (one piece of software) than
have to install dozens of GNOME packages..

Anyway, as a summary, what is to be solved:
  . defining, naming, identifying a session (= project)
  . communicating with the apps to request loading and saving of
  session-associated data, where either:
  1. each app is passed enough knowledge to store its state itself,
or
  2. an API and library are designed that can abstract finding the location
of the project itself, how to store the data, etc.

I fear that 2. would be a complete reinvention of things that have been done many
times before (DBMS) and thus would either result in:
  . not being done
  . being done, incompletely

I'd prefer 1., although it freezes the backend.
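
A minimal sketch of option 1, assuming the session manager hands each app
a session directory through a hypothetical SESSION_DIR environment
variable (the variable name and file layout are made up, just to
illustrate the contract):

    /* Sketch of option 1: the session manager passes a session directory
     * (assumed here: a SESSION_DIR environment variable) and each app
     * stores its own state there, in its own format. */
    #include <stdio.h>
    #include <stdlib.h>

    static int save_state(const char *app_name)
    {
        const char *dir = getenv("SESSION_DIR");   /* assumed contract */
        char path[1024];
        FILE *f;

        if (!dir)
            return -1;                   /* not running under a session */

        snprintf(path, sizeof(path), "%s/%s.state", dir, app_name);
        if (!(f = fopen(path, "w")))
            return -1;

        /* what "state" means is each app's own business */
        fprintf(f, "volume=0.8\npatch=42\n");
        fclose(f);
        return 0;
    }

The backend is frozen to "a directory of app-owned files", but each app
keeps full control of its own format.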









Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Martijn Sipkema

> Yep, think of 0-127 ranges for controller data :(
> That is too coarse.

MIDI provides 14-bit controller resolution by having controller
pairs. That should be enough for controllers, since most sliders/knobs
on hardware have much less than that.
Pitch bend is 14-bit also, although there is a lot of hardware that
only uses the MSB.
For what MIDI is meant for it isn't that bad, actually. A higher
transmission rate would solve most of the problems users are having,
and as long as MIDI is only used for IPC you can easily have a higher
transmission rate.
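
For reference, a minimal sketch of how the 14-bit values are assembled
(standard MIDI bit layout; the helper names are made up):

    /* MIDI packs 14-bit values as two 7-bit data bytes.
     * Controllers 0-31 carry the MSB; controllers 32-63 the matching LSB.
     * Pitch bend sends LSB then MSB in a single message. */
    static int cc14(unsigned char msb, unsigned char lsb)
    {
        return ((msb & 0x7f) << 7) | (lsb & 0x7f);            /* 0..16383 */
    }

    static int pitchbend14(unsigned char lsb, unsigned char msb)
    {
        return ((((msb & 0x7f) << 7) | (lsb & 0x7f)) - 8192); /* -8192..8191 */
    }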

--martijn






Re: [linux-audio-dev] (no subject)

2002-07-23 Thread rm

On Tue, Jul 23, 2002 at 10:45:55AM +0200, .n++k wrote:
> | > > Why not use an SQL database for storing session/project metadata?
> | > > (configuration and such) We have the benefit of having a few quite
> | > > stable free software SQL databases. (mysql, postgresql, sapdb) so
> | > > requiring one wouldn't be too much to ask.
> 
> Well it would be just as easy to me to have a small util with a dump
> of the database.. as would any other manipulation/analysis on the
> projects data be. Backup are easy too.

requiring an sql db for this sort of thing would be gratuitous. for
one, it's another dependency that people would have to download,
compile, and set up. for most databases, the setup is non-trivial. (i
can't run my audio app because i can't figure out how to set up my
database?) secondly, it's unnecessary as i understand the requirements
of the task. you might be able to justify a smaller database like
sleepycat's stuff, but files are more reasonable and pretty much
everyone has a filesystem these days. you just don't need the majority
of the features that a relational db provides.

gconf2 might be an appropriate model to look at for ideas. api calls
for accessing the information are defined, and the backend is left
unspecified (but likely fs based initially). additionally, swig,
corba, etc can be used to provide bindings of the api to most
languages.
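
As a rough sketch of that split (all names here are hypothetical), the
API surface could be as small as:

    /* Hypothetical session-metadata API in the gconf2 spirit: the calls
     * are fixed, the backend (flat files, gconf, a database...) stays an
     * implementation detail behind the opaque handle. */
    typedef struct session session_t;

    session_t  *session_open(const char *project_name);
    int         session_set(session_t *s, const char *key, const char *value);
    const char *session_get(session_t *s, const char *key);
    int         session_sync(session_t *s);   /* flush to the backend */
    void        session_close(session_t *s);

Language bindings (swig, corba, etc.) would then wrap these few calls
rather than any particular storage format.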
rob

Robert Melby
Georgia Institute of Technology, Atlanta Georgia, 30332
uucp: ...!{decvax,hplabs,ncar,purdue,rutgers}!gatech!prism!gt4255a
Internet: [EMAIL PROTECTED]



Re: [linux-audio-dev] App intercomunication issues, some views.

2002-07-23 Thread Vincent Touquet

On Tue, Jul 23, 2002 at 12:07:43AM -0400, Paul Winkler wrote:
>Does this help?
>http://developer.gnome.org/arch/sm/extension.html
(cut)
>"_GSM_Priority
(cut)

So there would be a dependency on gnome-session-manager
(and what else ?)

regards
vini



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Vincent Touquet

On Tue, Jul 23, 2002 at 04:56:06AM -0300, Juan Linietsky wrote:
>And also you can't do the neat thing of asking all your apps to save
>all their data
>to a directory so you can create a tar-gzip with the project :)

That point is irrelevant:
you can extract everything from
the database and tar-gzip it.

vini



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Vincent Touquet

On Tue, Jul 23, 2002 at 08:43:36AM +0200, n++k wrote:
>Why not use an SQL database for storing session/project metadata?
>(configuration and such) We have the benefit of having a few quite
>stable free software SQL databases. (mysql, postgresql, sapdb) so
>requiring one wouldn't be too much to ask.. The persistence protocol
>(sql) is already there and tested, the administration is as easy as
>with plain text files, and (I think) the schema is a lot easier to
>design than that of a set of interrelated XML files. (hierarchical
>databases?)
(cut)

Well, you are entering the sovereign
territory of a program and asking it
to change the format it is very used to
and dump it into a database?

To me that sounds like making
a big soup of a nice modular
system again.

Let the programs keep their
own settings in their own way,
just tell them when to save them.

Also: are you going to force
people to install an RDB just
because they want their state
saved?

Then comes the issue: which RDB?
You need choice, so you have to
provide a database abstraction layer.

It's not worth it in my eyes...

regards
vini



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Vincent Touquet

On Tue, Jul 23, 2002 at 02:30:22AM -0300, Juan Linietsky wrote:
>ok, but the question is, what for? What else do you need other than
>start/stop/seek ? doesnt midi proovide that already? then why
>something else?
>Also using midi you make sure that what you do is synced to external
>devices...

You can use a protocol which doesn't have the limitations
of MIDI (7-bit CC data :() and truncate it at the ends
that need MIDI (filter the wider dynamic range
down to the coarser range).

Of course this brings up issues about the different
dynamic ranges used, and I think you'll hear it.

I just don't think there is any good reason
to accept the restrictions imposed by MIDI
when you stay inside your computer
(it is not as if you need to be able
 to pass all your data over a 31250 baud
 cable, is it ?)
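
A minimal sketch of that idea, assuming the internal protocol carries
controls as floats in [0,1] and only the MIDI edge quantizes (the helper
names are made up):

    /* Keep full resolution internally; quantize to 7 bits only at the
     * MIDI boundary, on the way out. */
    static unsigned char to_midi7(float v)
    {
        if (v < 0.0f) v = 0.0f;
        if (v > 1.0f) v = 1.0f;
        return (unsigned char)(v * 127.0f + 0.5f);   /* round to 0..127 */
    }

    static float from_midi7(unsigned char cc)
    {
        return cc / 127.0f;   /* back to the internal range, coarsely */
    }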

regards
vini



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Vincent Touquet

On Tue, Jul 23, 2002 at 01:17:53AM -0400, Paul Winkler wrote:
>actually zipi and skini.
>tooskini has something to do with perry cook's STK.

I thought SKINI was just readable MIDI?
Hit me if I'm wrong though.

regards
vini



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Vincent Touquet

On Mon, Jul 22, 2002 at 11:21:56PM -0500, Arthur Peters wrote:
(cut)
>MIDI is very powerful, but it is also very
>restrictive in some ways.
Yep, think of 0-127 ranges for controller data :(
That is too coarse.
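To put a number on it: spreading 128 steps across a typical 60 dB fader
range leaves roughly 60/127 ≈ 0.47 dB per step, which can be audible as
zipper noise on a slow fade; the 14-bit controller pairs mentioned
elsewhere in this thread would shrink that to about 0.004 dB per step.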

>Just a thought. I don't know much about these subjects.
Well I'm not an expert either :)
If you keep that in mind, stay humble,
and don't laugh at the experts, it's ok ;)

regards
vincent



[linux-audio-dev] (no subject)

2002-07-23 Thread .n++k


| On Tue, 23 Jul 2002 03:14:56 -0400
| Paul Winkler  wrote:
| 
| > On Tue, Jul 23, 2002 at 08:43:36AM +0200, n++k wrote:
| > > Just a comment on the metadata persistence:
| > >
| > > Why not use an SQL database for storing session/project metadata?
| > > (configuration and such) We have the benefit of having a few quite
| > > stable free software SQL databases. (mysql, postgresql, sapdb) so
| > > requiring one wouldn't be too much to ask.
| >
| > except they're big to install. that might cause some resistance.
| > and it strikes me as using a piledriver when a hammer would do.
| >

Let's not be hypocrites:
$ du -hs /opt/mysql/*
  2.5M  bin
  252k  include
  1.9M  info
  1.1M  lib
  2.7M  libexec
  76k   man
  1.4M  mysql-test
  888k  share
  4.5M  sql-bench

That's 15M, of which 5M (mysql-test, sql-bench) are useless.

I would hardly call that big.

I think the real issue is that of the API. I haven't tested it myself,
but ODBC provides a common API for databases under Linux too (drivers
are available for just about any of the databases mentioned).
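
For illustration, a minimal sketch of the ODBC route, assuming unixODBC,
a DSN named "projects" and a simple key/value session table (all of these
are assumptions; error handling elided):

    /* Minimal ODBC sketch; the DSN "projects" and the session table are
     * hypothetical, and all error checking is elided. */
    #include <sql.h>
    #include <sqlext.h>

    int main(void)
    {
        SQLHENV  env;
        SQLHDBC  dbc;
        SQLHSTMT stmt;

        SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
        SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);
        SQLAllocHandle(SQL_HANDLE_DBC, env, &dbc);
        SQLConnect(dbc, (SQLCHAR *)"projects", SQL_NTS,
                   (SQLCHAR *)"user", SQL_NTS, (SQLCHAR *)"pass", SQL_NTS);

        SQLAllocHandle(SQL_HANDLE_STMT, dbc, &stmt);
        SQLExecDirect(stmt,
            (SQLCHAR *)"SELECT key, value FROM session WHERE project = 'demo'",
            SQL_NTS);
        /* ... SQLFetch()/SQLGetData() to walk the rows ... */

        SQLFreeHandle(SQL_HANDLE_STMT, stmt);
        SQLDisconnect(dbc);
        SQLFreeHandle(SQL_HANDLE_DBC, dbc);
        SQLFreeHandle(SQL_HANDLE_ENV, env);
        return 0;
    }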

| 
| And also you can't do the neat thing of asking all your apps to save
| all their data
| to a directory so you can create a tar-gzip with the project :)
| 
| Juan Linietsky 

Well, it would be just as easy for me to have a small util with a dump
of the database.. as would any other manipulation/analysis of the
project's data be. Backups are easy too.

Somehow I think XML+files are messy and inflexible, and just a step
backwards into the world of hierarchical databases.




Re: [linux-audio-dev] swh plugins 0.2.8

2002-07-23 Thread Joern Nettingsmeier

Dave Phillips wrote:
> 
> Steve Harris wrote:
> 
> > http://plugin.org.uk/releases/0.2.8/
> >
> > New stuff:
> >[snip]
> >Plate reverb physical model, full of platey goodness
> 
> Oooh, plate reverb ! I love plate reverbs, so I wired this one up in
> Snd, applied it, and... hey, not bad ! Another neat reverb, thank you
> Steve and Josef !
> 
> Now for a modeled reverb a la the old Fender Twin... I want to be able
> to shake my machine and hear the springs, is that too much to ask ??
> ;-)

maybe the guys who put the tube amp on their board will be interested in
this... at least it will make a slashdot headline :)



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Phil Kerr

Using an SQL DB with XML is less than optimal, partly for the reasons Paul
mentions below.

http://xml.apache.org/xindice/

Xindice is a fully native XML DB and is cross-platform (thinking
cross-platform goes together with a distributed architecture).

Although having to use a DB to store metadata could be considered
overkill, and in some instances it would be, it does offer some
advantages when trying to centralize config gathering (dump the DB and
you have everything).

The question is: is this the best way, or should the distributed apps
handle their own configuration data, with the master controller just
requesting which patch to use?

-P

Paul Winkler wrote:

> On Tue, Jul 23, 2002 at 08:43:36AM +0200, n++k wrote:
> > Just a comment on the metadata persistence:
> >
> > Why not use an SQL database for storing session/project metadata?
> > (configuration and such) We have the benefit of having a few quite
> > stable free software SQL databases. (mysql, postgresql, sapdb) so
> > requiring one wouldn't be too much to ask.
>
> except they're big to install. that might cause some resistance.
> and it strikes me as using a piledriver when a hammer would do.
>
> also wouldn't we have to standardize on one RDBMS?
> I'm no DBA but IIRC the SQL queries are not 100%
> the same...
>
> (sorry, it's nearing the end of the month and I'm behind
> on my acronym quota)
>
> --PW
>
> --
>
> Paul Winkler
> home:  http://www.slinkp.com
> "Muppet Labs, where the future is made - today!"




Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Juan Linietsky

On Tue, 23 Jul 2002 03:14:56 -0400
Paul Winkler <[EMAIL PROTECTED]> wrote:

> On Tue, Jul 23, 2002 at 08:43:36AM +0200, n++k wrote:
> > Just a comment on the metadata persistence:
> > 
> > Why not use an SQL database for storing session/project metadata?
> > (configuration and such) We have the benefit of having a few quite
> > stable free software SQL databases. (mysql, postgresql, sapdb) so
> > requiring one wouldn't be too much to ask.
> 
> except they're big to install. that might cause some resistance.
> and it strikes me as using a piledriver when a hammer would do.
> 

And also you can't do the neat thing of asking all your apps to save
all their data
to a directory so you can create a tar-gzip with the project :)

Juan Linietsky



Re: [linux-audio-dev] App metadata intercomunication protocol..

2002-07-23 Thread Paul Winkler

On Tue, Jul 23, 2002 at 08:43:36AM +0200, n++k wrote:
> Just a comment on the metadata persistence:
> 
> Why not use an SQL database for storing session/project metadata?
> (configuration and such) We have the benefit of having a few quite
> stable free software SQL databases. (mysql, postgresql, sapdb) so
> requiring one wouldn't be too much to ask.

except they're big to install. that might cause some resistance.
and it strikes me as using a piledriver when a hammer would do.

also wouldn't we have to standardize on one RDBMS?
I'm no DBA but IIRC the SQL queries are not 100%
the same...

(sorry, it's nearing the end of the month and I'm behind
on my acronym quota)

--PW

-- 

Paul Winkler
home:  http://www.slinkp.com
"Muppet Labs, where the future is made - today!"