/wrote Paul Davis <[EMAIL PROTECTED]> [Wed, 24 Jul 2002 15:05:24 -0400]
|>| thats the way to handle MIDI *input*. it doesn't really apply to
|>| output, though it would work there too.
|>
|>Yes, it's the symmetrical case
|
|only theoretically. handling MIDI input is quite different than
|handling
>MIDI is different to audio and it seems wrong to lump it in with Jack as it's
>just adding complexity. Being able to sync things up with Jack would be
>good.
right. we don't try to get JACK to do MIDI right now, but we do want
JACK to be able to provide sync'ed transport control. longer term, i
>> consider:            node B
>>                     /      \
>> ALSA PCM -> node A          node D -> ALSA PCM
>>                     \      /
>>                      node C
>>
>> what is the latency for output of data from node A ? it depends on
>> what happens at nod
Hi all,
> I've been meaning for some time to make a universal transport control, a
> separate app whose sole purpose would be transport control.
> This would be rad (TM) for a number of reasons:
>
> 1) Get your whole rig to start at once, even if you're not using
> ecasound or ardour
Which is pr
On Wed, 24 Jul 2002, Richard Bown wrote:
>> because the transport time is the only important global time; it indicates
>> what you should be playing/doing right now.
> But of course at the moment we've also got the ALSA sequencer clock running
> to schedule our MIDI alongside our JACK callbacks -
On Wed, 24 Jul 2002, Richard Bown wrote:
> What apps have an example implementation of the JACK transport controls?
> Neither ecasound nor ardour appears to, but I'm probably a bit out of date
> with them. And again, why should I care about the JACK transport?
http://cvs.seul.org/cgi-bin/cvsweb-1.
On Wed, 24 Jul 2002, Andy Wingo wrote:
[JACK transport control]
> I've been meaning for some time to make a universal transport control, a
> separate app whose sole purpose would be transport control.
> This would be rad (TM) for a number of reasons:
>
> 1) Get your whole rig to start at once, ev
[...]
>
> consider:            node B
>                     /      \
> ALSA PCM -> node A          node D -> ALSA PCM
>                     \      /
>                      node C
>
> what is the latency for output of data from node A ? it depends on
> what happens at n
On Wednesday 24 July 2002 03:48 pm, Paul Winkler wrote:
> On Wed, Jul 24, 2002 at 03:13:45PM -0400, Lamar Owen wrote:
> > What ALSA USB audio? Owning a UA-100 I got off eBay cheap, it would be
> > nice to see good USB ALSA support. The existing UA-100 USB driver does
> > work, but it's a cripple
On Wed, Jul 24, 2002 at 03:13:45PM -0400, Lamar Owen wrote:
> On Wednesday 24 July 2002 08:05 am, Paul Davis wrote:
> > i'm hoping takashi will jump on this the way he did USB audio and get
> > this merged/extended into alsa.
>
> *REWIND*
>
> What ALSA USB audio? Owning a UA-100 I got off eBay
On Wednesday 24 July 2002 08:05 am, Paul Davis wrote:
> i'm hoping takashi will jump on this the way he did USB audio and get
> this merged/extended into alsa.
*REWIND*
What ALSA USB audio? Owning a UA-100 I got off eBay cheap, it would be nice
to see good USB ALSA support. The existing UA-10
>> as i've indicated, i think this is a
>> bad design. Defining the semantics of MSC in a processing graph is
>> hard (for some of the same reasons that jack_port_get_total_latency()
>> is hard to implement).
>
>Why? On an audio card interrupt buffers traverse the entire graph right? i.e.
>for eve
>| thats the way to handle MIDI *input*. it doesn't really apply to
>| output, though it would work there too.
>
>Yes, it's the symmetrical case
only theoretically. handling MIDI input is quite different than
handling MIDI output. there is no scheduling necessary for MIDI input
- the data arrives
> >If I use an absolute sleep there is basically no difference. The drift
> >will be the same, but instead of scheduling events from 'now' I can
> >specify the exact time. So a callback would then be like:
> >
> >- get the UST and MSC for the first frame of the current buffer for input
>
> MSC
>If I use an absolute sleep there is basically no difference. The drift
>will be the same, but instead of scheduling events from 'now' I can
>specify the exact time. So a callback would then be like:
>
>- get the UST and MSC for the first frame of the current buffer for input
MSC implies timest
/wrote Paul Davis
| >By the way, I remember talks around here or jack's mailing list on the
| >issue of non constant frame number passed to the process callback... I
| >don't remember if anything was decided, but i was thinking it would be
| >nice to leave it bounded but non-constant, just for be
>By the way, I remember talks around here or jack's mailing list on the
>issue of non constant frame number passed to the process callback... I
>don't remember if anything was decided, but i was thinking it would be
>nice to leave it bounded but non-constant, just for being able to
>design an app
> nanosleep isn't based on time-of-day, which is what is subject to
> adjustment. nanosleep uses the schedule_timeout, which is based on
> jiffies, which i believe are monotonic.
I'm not sure how nanosleep() is supposed to handle clock adjustment
but I agree it would probably not change its beh
Paul Davis wrote:
> because the transport time is the only important global time; it indicates
> what you should be playing/doing right now.
But of course at the moment we've also got the ALSA sequencer clock running
to schedule our MIDI alongside our JACK callbacks - so external control of
our a
/wrote Paul Davis
| i believe that relative nanosleep is better than absolute sleep for
| the simple reason that its how you would avoid drift in
| practice. consider a JACK callback:
|
| process (jack_nframes_t nframes)
| {
| jack_transport_info_t now;
|
On Wed, 24 Jul 2002, Richard Bown wrote:
> Paul Davis wrote:
>
> > JACK provides both current time and information on the latency
>
> Oh right - so jack_port_get_total_latency() is working now? Excellent.
> Last time I asked it was still in the planning stage.
>
> What apps have an example im
>> JACK provides both current time and information on the latency
>
>Oh right - so jack_port_get_total_latency() is working now? Excellent.
>Last time I asked it was still in the planning stage.
no, it exists and works, but it can't handle all possible connection
graphs. it may never be able to.
>> >[...]
>> >> CLOCK_MONOTONIC doesn't change the scheduling resolution of the
>> >> kernel. its not useful, therefore, in helping with this problem.
>> >
>> >Not useful right now. CLOCK_MONOTONIC scheduling resolution will get
>> >better I hope.
>>
>> How can it? UST cannot be the clock tha
Paul Davis wrote:
> JACK provides both current time and information on the latency
Oh right - so jack_port_get_total_latency() is working now? Excellent.
Last time I asked it was still in the planning stage.
What apps have an example implementation of the JACK transport controls?
Neither ecaso
/wrote Paul Davis
| >/wrote Paul Davis <|@op.net> [Tue, 23 Jul 2002 19:58:16 -0400]
| >
| >|SGI tried to solve the problem of the Unix read/write API mismatch
| >|with realtime streaming media by adding timestamps to data. CoreAudio
| >|solves it by getting rid of read/write, and acknowledging t
> >[...]
> >> UST can be used for timestamping, but thats sort of useless, since the
> >> timestamps need to reflect audio time (see below).
> >
> >I'd like to have both a frame count (MSC) and a corresponding system time
> >(UST) for each buffer (the first frame). That way I can predict when (
[...]
> if you find the link for the
> ex-SGI video API developer's comment on the dmSDK, i think you may
> also see some serious grounds for concern about using this API. i'm
> sorry i don't have it around right now.
is it http://www.lurkertech.com/linuxvideoio.html ?
this is about the older
>On Tue, Jul 23, 2002 at 01:42:25 -0700, Fernando Pablo Lopez-Lezcano wrote:
>> one as I just noticed last week that in 2.4.19-rc2 there is
>> now an option to compile a IEC61883-6 protocol stack module
>> for ieee1394 so I'll ask the question again...
>
>Yes, there's a mail referring to it here
>[...]
>> UST can be used for timestamping, but thats sort of useless, since the
>> timestamps need to reflect audio time (see below).
>
>I'd like to have both a frame count (MSC) and a corresponding system time
>(UST) for each buffer (the first frame). That way I can predict when (UST)
>a certai
>[...]
>> It's worth noting that SGI's "DM" API has never really taken
>> off, and there are lots of reasons why, some technical, some
>> political.
>
>Perhaps. See http://www.khronos.org/ for where SGI's dmSDK might
i've seen khronos's stuff. its a group made up of current bit players
in the
>/wrote Paul Davis <[EMAIL PROTECTED]> [Tue, 23 Jul 2002 19:58:16 -0400]
>
>|the most fundamental problem with SGI's approach to audio+video is
>|that its not based on the desirability of achieving low latency for
>|realtime processing and/or monitoring. its centered on the playback of
>|existing,
Hello,
When I run the Java Sound demo on Linux, I get the exception below:
javax.sound.sampled.LineUnavailableException
Please let me know the workaround for this.
Thanks & Regards
Peter
On Tue, Jul 23, 2002 at 01:42:25 -0700, Fernando Pablo Lopez-Lezcano wrote:
> one as I just noticed last week that in 2.4.19-rc2 there is
> now an option to compile a IEC61883-6 protocol stack module
> for ieee1394 so I'll ask the question again...
Yes, there's a mail referring to it here:
http
On Tue, Jul 23, 2002 at 01:27:16 -0400, Paul Winkler wrote:
> Why is this hard?
>
> ensure ALSA is up
> start jackd
...
> run jackconnect with appropriate args to connect clients
...
> run aconnect with appropriate args
OK, good point. That moves some of the complexity out of the apps too, as
yo
On Tue, Jul 23, 2002 at 01:51:19 -0400, Paul Winkler wrote:
> Works for me. :)
> But most people aren't used to an "edit, run sfront, run gcc, run
> executable" cycle in their composing work. It's not exactly
> RCD (Rapid Composition Development (TM)).
Yep, and that's why God^TM invented Makefil
> their stuff has never been widely (if at all) used for low latency
> real time processing of audio.
[...]
> ...it doesn't get used this way
> because (1) their hardware costs too much (2) their API for audio
> doesn't encourage it in any way (3) their API for digital media in
> general is conf
[...]
> It's worth noting that SGI's "DM" API has never really taken
> off, and there are lots of reasons why, some technical, some
> political.
Perhaps. See http://www.khronos.org/ for where SGI's dmSDK might
still be going. I think this API might be good for video. So maybe
it is not that go
[...]
> UST can be used for timestamping, but thats sort of useless, since the
> timestamps need to reflect audio time (see below).
I'd like to have both a frame count (MSC) and a corresponding system time
(UST) for each buffer (the first frame). That way I can predict when (UST)
a certain perfo
SpamAssassin was not at all happy with this mail:
On Mon, Jul 22, 2002 at 05:18:46PM -0300, Juan Linietsky wrote:
> SPAM: Start SpamAssassin results --
> SPAM: This mail is probably spam. The original message has been altered
> SPAM: so you can recognise o