Re: [LAD] Tests directly routing pc's midi-in to midi-out (was: Re: ALSA MIDI latency test results are far away from reality)

2010-07-15 Thread Arnout Engelen
On Wed, Jul 14, 2010 at 11:54:35PM +0200, Ralf Mardorf wrote:
 On Wed, 2010-07-14 at 23:43 +0200, f...@kokkinizita.net wrote:
  On Wed, Jul 14, 2010 at 11:28:54PM +0200, Ralf Mardorf wrote:
   Or it was at 4 ms = +/- 2 ms or something like that. This is a delay that
   isn't audible for day-to-day audio events, but it can break a groove
   easily.

That would mean my hardware synth (the yamaha vl70-m), connected *directly* 
(no PC involved), causes a huge latency (20-26ms) making it utterly unusable 
for making a groove - and not just slightly, but by a large margin.

That *could* be (as it's not a drum synth), but it'd be kind of surprising. 

  You keep repeating this, but so far I haven't seen a shred
  of verifiable evidence to support this claim.
 
 I could record audio for kick, snare, hi-hat and bass one after the
 other, mix it into one rhythm group, and additionally I could record all
 instruments at the same time and send the recordings to you, so you
 could check for yourself. It's also hard to say whether there isn't
 more jitter than 4 ms. At what point does the attack of a signal start
 within the ambient noise level?

Perhaps you could make a stereo recording, the left channel recording the
mic'ed 'tick' of hitting the trigger, the right channel recording the audio
coming from the speakers? You'd say e.g. a loud hi-hat should be recognisable
enough.

 At least I could record FluidSynth DSSI played in unison with the Alesis
 D4, using different -p values. I'm sure everybody would be able to
 hear the problem.

Let's keep this thread restricted to the situation with only ALSA MIDI in
routed directly to ALSA MIDI OUT - it's getting hard to keep track of what's
going on :).


Arnout


Re: [LAD] Tests directly routing pc's midi-in to midi-out

2010-07-15 Thread Clemens Ladisch
Ralf Mardorf wrote:
 On Wed, 2010-07-14 at 19:56 +0200, Arnout Engelen wrote:
  On Wed, Jul 14, 2010 at 03:23:03PM +0200, Ralf Mardorf wrote:
   Yamaha DX7 -- Alesis D4 results in a 100% musical groove.
   Yamaha DX7 -- PC -- Alesis D4 results in extreme latency
  
  So here you're directly routing the MIDI IN to the MIDI OUT, and experiencing
  latency. Are you using JACK here, or directly ALSA? In other words, are you
  connecting 'in' to 'out' in the qjackctl 'MIDI' tab or in the 'ALSA' tab?
 
 I'm connecting MIDI in the Qtractor (quasi QjackCtl) ALSA MIDI tab.

Please make a test without any program using JACK, just connect the
DX7 port to the D4 port with aconnect(gui), and try that.
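
(For anyone following along, a hedged sketch of what that looks like on the
command line; the client:port numbers below are hypothetical and will differ
per system:

  aconnect -l            # list all ALSA sequencer clients and ports
  aconnect 20:0 24:0     # connect MIDI-in client 20:0 to MIDI-out 24:0

aconnectgui does the same graphically.)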


Regards,
Clemens


Re: [LAD] Tests directly routing pc's midi-in to midi-out (was: Re: ALSA MIDI latency test results are far away from reality)

2010-07-15 Thread fons
On Thu, Jul 15, 2010 at 01:14:45AM +0200, Ralf Mardorf wrote:

  Apart from that, it remains to be seen if *real* timing errors of
  +/- 2 ms do 'destroy the groove'. To test this, make the same 
  recording 
  
  - without jitter,
  - with 1 ms jitter,
  - with 2 ms jitter,
  - with 3 ms jitter.
  
  and check if listeners are able to identify which is which,
  or at least to put them into order.
 
 I know very gifted musicians who feel the way I do, and they always 'preach'
 that I should stop using modern computers; I don't know many average
 people. So the listeners in my flat would surely be able to hear even
 failures that I'm unable to hear.

I'm sure they would be sensitive to bad timing.  But that's not
the question. Would they be able to identify the recordings listed
above ? Until you try it you won't know, and your claim that 2 ms
of jitter 'destroys the groove' is pure conjecture.
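
A minimal sketch of how such test material could be prepared, assuming the
MIDI events are available as a list of millisecond timestamps (the function
name and data layout are made up for illustration):

  #include <stdlib.h>

  /* Offset each event time by a uniform random amount in [-j_ms, +j_ms]. */
  void add_jitter(double *times_ms, int n, double j_ms)
  {
      for (int i = 0; i < n; i++) {
          double r = (double)rand() / (double)RAND_MAX;  /* 0..1 */
          times_ms[i] += (2.0 * r - 1.0) * j_ms;         /* -j..+j */
      }
  }

Render the same sequence once per jitter setting (0, 1, 2 and 3 ms) and
present the takes in random order.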

 Anyway, this crowd shouldn't be the benchmark for good music. Am I
 wrong?

It's not about what 'good music' is. The question is if midi jitter
of 2 ms does degrade the quality of a rendering.

Ciao, 

-- 
Je veux que la mort me trouve plantant mes choux, mais
nonchalant d’elle, et encore plus de mon jardin imparfait.
(Michel de Montaigne)


Re: [LAD] Tests directly routing pc's midi-in to midi-out (was: Re: ALSA MIDI latency test results are far away from reality)

2010-07-15 Thread Arnold Krille
On Thursday 15 July 2010 01:14:45 Ralf Mardorf wrote:
 On Thu, 2010-07-15 at 00:46 +0200, f...@kokkinizita.net wrote:
  Apart from that, it remains to be seen if *real* timing errors of
  +/- 2 ms do 'destroy the groove'. To test this, make the same
  recording
  
  - without jitter,
  - with 1 ms jitter,
  - with 2 ms jitter,
  - with 3 ms jitter.
  
  and check if listeners are able to identify which is which,
  or at least to put them into order.
 I know very gifted musicians who feel the way I do, and they always 'preach'
 that I should stop using modern computers; I don't know many average
 people. So the listeners in my flat would surely be able to hear even
 failures that I'm unable to hear.

You really should do that test first before speculating about the outcome and
your audience.

You would expect audiophiles to spot the super-sounding Denon cables by
listening, right? Yet a blind test showed the opposite. The test was to
identify which audio take was played with Denon cables, el-cheapo cables from
Walmart and a bent clothes-hanger. If they were as good as they claimed, the
Denon cable should get hits with probability significantly better than 1/3,
otherwise it's just luck.
Guess what the outcome was: there was a significant hit, but they spotted the
clothes-hanger as the Denon cable. That's what real experts do...

Do the listening test with as many people as possible and then show the
results. And only afterwards start speculating about what the reasons and the
effects might be. (That's called science, btw.)

Have fun,

Arnold




Re: [LAD] Tests directly routing pc's midi-in to midi-out (was: Re: ALSA MIDI latency test results are far away from reality)

2010-07-15 Thread fons
On Thu, Jul 15, 2010 at 01:46:41AM +0200, Ralf Mardorf wrote:

 I don't know the pipe organ of any church in Parma, but I'm sure that if
 the keyboard player pushes a key the sound will be audible at the same time.

Absolutely not. Organ pipes take some time before they will sound;
this can easily be 20 ms, and considerably more for some. Apart from
that, pipes are not all at the same distance from the player; this
again can easily introduce variations of 20 ms or more, even between
pipes of the same rank.
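
(For scale, a fact worth having at hand: sound travels at roughly 343 m/s in
air, so a path difference of about 7 m between two pipes already amounts to
7 / 343 ≈ 20 ms.)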

 I guess Aeolus will play every note in real time when played by a Linux
 sequencer, e.g. Qtractor, but I guess not when played live by an organist
 on my master keyboard.

Aeolus doesn't care or even know where the MIDI is coming from.
And it does quantise note on/off to Jack periods. Nobody ever
complained about that.

Ciao,

-- 
Je veux que la mort me trouve plantant mes choux, mais
nonchalant d’elle, et encore plus de mon jardin imparfait.
(Michel de Montaigne)


[LAD] Transport issue for Qtractor - has an impact on the jitter issue

2010-07-15 Thread Ralf Mardorf
So the advice to use amidiplay is something I'll follow soon.

Hi all :), hi Robin :), hi Devin :)

Robin, for 64 Studio 3.3 alpha the group has read and write access
to /dev/hpet too. Btw, '[...] | sudo tee [...]' isn't good for 3.3 alpha,
because of the enabled root account. Anyway, for this recording test I
kept the value 64 for hpet/max-user-freq, but I'll test higher values
soon.
Devin and some others want to know if the drum module is played before
FluidSynth DSSI is played.

JACK2 doesn't start with -p4096, so I started JACK2 with -Rch -dalsa
-dhw:0 -r44100 -p2048 -n2 to increase the unwanted effect, to get a
clear result.

Without recording, it's clearly audible that the external drum module
is always played before FluidSynth DSSI, which is plausible given
the audio latency (a stupid idea to use such high latency ;), so
there's a need to do an audio test with lower latency, to see if
jitter might change which instrument is played first.

Instead of a rhythm I recorded four-on-the-floor at 120 BPM to 'see' the
jitter.

I recorded FluidSynth DSSI through the sound card too: left channel the
drum module, right channel FluidSynth DSSI, and judging by Qtractor's
waveform display, there is jitter for both!

Ok, next recording with -Rch -dalsa -dhw:0 -r96000 -p512 -n2.

Without recording it's already audible that the drum module is played
first all the time ...

and it's visible too. Again there's jitter for both. The audio recording
of the drum module is always before the MIDI event. The recording of
FluidSynth DSSI is sometimes before and sometimes after the MIDI event.
There's no offset for the audio track.

I kept -Rch -dalsa -dhw:0 -r96000 -p512 -n2 and recorded FluidSynth DSSI
alone, internally in Linux, without using the sound card.

The audio recordings are before the MIDI events and there's jitter. I
never noticed jitter inside Linux before.

I need to repeat the test ASAP, but by using 64 Studio 3.0 beta and
perhaps an older version of Qtractor.

Playing FluidSynth DSSI by MIDI and the internally made recording in
unison, there's no audible jitter. But after starting playback, sometimes
MIDI and the audio recording are perfectly synced and sometimes there's
a delay, a real delay between the recording and MIDI, not only an
early-reflection-like effect (but without audible jitter; the jitter is
only visible in the waveforms).

$ qtractor -v
Qt: 4.5.2
Qtractor: 0.4.6

More maybe tomorrow.

Cheers!

Ralf



Re: [LAD] Transport issue for Qtractor - has an impact on the jitter issue

2010-07-15 Thread Rui Nuno Capela
On Thu, 15 Jul 2010 10:15:53 +0200, Ralf Mardorf wrote:
 Transport issue for Qtractor - has an impact on the jitter issue
 

what transport issue? oh i see... :)

qtractor has in fact a known trouble with jack transport sync when slaving
to period/buffer sizes greater than 2048 frames. it skids. maybe that's
exactly what you're experiencing when running jackd -p4096. 

i'd recommend you to turn jack transport mode to none or master
only, never slave nor full, in those situations (in
view/options.../audio/jack transport).

another recommendation would be to grab the latest from svn trunk, where
the said trouble has already been mitigated :)

cheers
-- 
rncbc aka Rui Nuno Capela
rn...@rncbc.org


Re: [LAD] Tests directly routing pc's midi-in to midi-out

2010-07-15 Thread Ralf Mardorf
On Thu, 2010-07-15 at 08:26 +0200, Clemens Ladisch wrote:
 Ralf Mardorf wrote:
  On Wed, 2010-07-14 at 19:56 +0200, Arnout Engelen wrote:
   On Wed, Jul 14, 2010 at 03:23:03PM +0200, Ralf Mardorf wrote:
Yamaha DX7 -- Alesis D4 results in a 100% musical groove.
Yamaha DX7 -- PC -- Alesis D4 results in extreme latency
   
   So here you're directly routing the MIDI IN to the MIDI OUT, and experiencing
   latency. Are you using JACK here, or directly ALSA? In other words, are you
   connecting 'in' to 'out' in the qjackctl 'MIDI' tab or in the 'ALSA' tab?
  
  I'm connecting MIDI in the Qtractor (quasi QjackCtl) ALSA MIDI tab.
 
 Please make a test without any program using JACK, just connect the
 DX7 port to the D4 port with aconnect(gui), and try that.
 
 
 Regards,
 Clemens

I'll test this :).

On Thu, 2010-07-15 at 09:57 +0200, f...@kokkinizita.net wrote: 
 I'm sure they would be sensitive to bad timing.  But that's not
 the question. Would they be able to identify the recordings listed
 above ? Until you try it you won't know, and your claim that 2 ms
 of jitter 'destroys the groove' is pure conjecture.

Who knows? Perhaps 2 ms shown by Audacity is really 20 ms, because Audacity
has a bug? Perhaps 2 ms shown by Audacity is 2 ms, but when playing through
JACK, the sound card or whatever will add additional jitter?

I can't test the list, because on my machine there is something audible.

On Thu, 2010-07-15 at 09:56 +0200, Arnold Krille wrote: 
 On Thursday 15 July 2010 01:14:45 Ralf Mardorf wrote:
  On Thu, 2010-07-15 at 00:46 +0200, f...@kokkinizita.net wrote:
   Apart from that, it remains to be seen if *real* timing errors of
   +/- 2 ms do 'destroy the groove'. To test this, make the same
   recording
   
   - without jitter,
   - with 1 ms jitter,
   - with 2 ms jitter,
   - with 3 ms jitter.
   
   and check if listeners are able to identify which is which,
   or at least to put them into order.
  I know very gifted musicians who feel the way I do, and they always 'preach'
  that I should stop using modern computers; I don't know many average
  people. So the listeners in my flat would surely be able to hear even
  failures that I'm unable to hear.
 
 You really should do that test first before speculating about the outcome and
 your audience.
 
 You would expect audiophiles to spot the super-sounding Denon cables by
 listening, right? Yet a blind test showed the opposite. The test was to
 identify which audio take was played with Denon cables, el-cheapo cables from
 Walmart and a bent clothes-hanger. If they were as good as they claimed, the
 Denon cable should get hits with probability significantly better than 1/3,
 otherwise it's just luck.
 Guess what the outcome was: there was a significant hit, but they spotted the
 clothes-hanger as the Denon cable. That's what real experts do...
 
 Do the listening test with as many people as possible and then show the
 results. And only afterwards start speculating about what the reasons and the
 effects might be. (That's called science, btw.)
 
 Have fun,
 
 Arnold

Perhaps it's not those 2 ms, but there are audible issues here. As I mentioned
before, Audacity shows 2 ms, but JACK, the driver or the hardware might add
jitter.

FWIW, blind tests aren't scientific; only double-blind studies are meaningful.
And if you wish to test cables, you need to test the quality after one year,
after two years etc.. Anyway, a bad cable might cause bad sound quality, but
not bad timing.
Timing is the meat and potatoes of music.

For my Linux computer such studies aren't needed. Bad timing is bad timing.
At least for the USB MIDI that I'm not using anymore, I made tests with a
Windows install (I don't have this install on my machine anymore, so I can't
test the PCI card with Windows).
The USB MIDI was much better on Windows, even better than the PCI cards
currently are on Linux. So I guess, though I don't know, that the hardware is
ok.

- Ralf



Re: [LAD] Transport issue for Qtractor - has an impact on the jitter issue

2010-07-15 Thread Ralf Mardorf
Hi Rui :)

no, a misunderstanding: at -Rch -dalsa -dhw:0 -r96000 -p512 -n2 there's
an issue internal to Qtractor, not JACK transport.

There's a MIDI track with FluidSynth DSSI and there's an audio track
with a recording of FluidSynth DSSI, made with Qtractor's audio out
directly connected to audio in.

The recording is several bars long. If I start somewhere within the bars,
Qtractor might play the MIDI track and audio track in sync. If I stop
and start playing again, somewhere else, the same tracks are played with
a delay. The delay can be around a 1/16 note (that's around 125 ms @ 120
BPM; not measured, but I'm able to hear what a 1/16 note is ;).

It's a kick played on each beat. Sometimes the MIDI and audio kick are
played in unison, and after stopping and starting again there can be a
delay of around a 1/16 note, or sometimes less.

The MIDI and audio track aren't synced every time after pushing play.

Cheers!

Ralf



Re: [LAD] Tests directly routing pc's midi-in to midi-out

2010-07-15 Thread Arnout Engelen
On Thu, Jul 15, 2010 at 10:40:53AM +0200, Ralf Mardorf wrote:
 On Thu, 2010-07-15 at 09:57 +0200, f...@kokkinizita.net wrote: 
  Until you try it you won't know, and your claim that 2 ms of jitter 
  'destroys the groove' is pure conjecture.
 
 Who knows? Perhaps 2 ms shown by Audacity is really 20 ms, because Audacity
 has a bug?

Sounds unlikely - but if you post the recording we can check.

 Perhaps 2 ms shown by Audacity is 2 ms, but when playing through JACK, the
 sound card or whatever will add additional jitter?

You claimed the latency/timing was obviously noticeably broken without needing
'golden ears', even without JACK, right? Let's first try to get this reliable
without JACK, just ALSA; then we can add other things to the mix later. If
this is not possible, anything 'on top of' this is doomed, too. I doubt it
though.


Arnout

 I can't test the list, because on my machine there is something audible.
 
 On Thu, 2010-07-15 at 09:56 +0200, Arnold Krille wrote: 
  On Thursday 15 July 2010 01:14:45 Ralf Mardorf wrote:
   On Thu, 2010-07-15 at 00:46 +0200, f...@kokkinizita.net wrote:
Apart from that, it remains to be seen if *real* timing errors of
+/- 2 ms do 'destroy the groove'. To test this, make the same
recording

- without jitter,
- with 1 ms jitter,
- with 2 ms jitter,
- with 3 ms jitter.

and check if listeners are able to identify which is which,
or at least to put them into order.
   I know very gifted musicians who feel the way I do, and they always 'preach'
   that I should stop using modern computers; I don't know many average
   people. So the listeners in my flat would surely be able to hear even
   failures that I'm unable to hear.
  
  You really should do that test first before speculating about the outcome and
  your audience.
  
  You would expect audiophiles to spot the super-sounding Denon cables by
  listening, right? Yet a blind test showed the opposite. The test was to
  identify which audio take was played with Denon cables, el-cheapo cables from
  Walmart and a bent clothes-hanger. If they were as good as they claimed, the
  Denon cable should get hits with probability significantly better than 1/3,
  otherwise it's just luck.
  Guess what the outcome was: there was a significant hit, but they spotted the
  clothes-hanger as the Denon cable. That's what real experts do...
  
  Do the listening test with as many people as possible and then show the
  results. And only afterwards start speculating about what the reasons and the
  effects might be. (That's called science, btw.)
  
  Have fun,
  
  Arnold
 
 Perhaps it's not those 2 ms, but there are audible issues here. As I mentioned
 before, Audacity shows 2 ms, but JACK, the driver or the hardware might add
 jitter.
 
 FWIW, blind tests aren't scientific; only double-blind studies are meaningful.
 And if you wish to test cables, you need to test the quality after one year,
 after two years etc.. Anyway, a bad cable might cause bad sound quality,
 but not bad timing.
 Timing is the meat and potatoes of music.
 
 For my Linux computer such studies aren't needed. Bad timing is bad timing.
 At least for the USB MIDI that I'm not using anymore, I made tests with a
 Windows install (I don't have this install on my machine anymore, so I can't
 test the PCI card with Windows).
 The USB MIDI was much better on Windows, even better than the PCI cards
 currently are on Linux. So I guess, though I don't know, that the hardware is
 ok.
 
 - Ralf
 


Re: [LAD] Tests directly routing pc's midi-in to midi-out

2010-07-15 Thread Ralf Mardorf
On Thu, 2010-07-15 at 11:48 +0200, Arnout Engelen wrote:
  Who knows? Perhaps 2 ms shown by Audacity is really 20 ms, because Audacity
  has a bug?
 
 Sounds unlikely

Indeed.

  Perhaps 2 ms shown by Audacity is 2 ms, but when playing through JACK, the
  sound card or whatever will add additional jitter?
 
 You claimed the latency/timing was obviously noticeably broken without needing
 'golden ears', even without JACK, right?

Correct.

 Let's first try to get this reliable without JACK, just ALSA

So I'll do this:

- instead of dev.hpet.max-user-freq=64 I'll try 1024 or 2048, as Robin
mentioned

- connect the DX7 port to the D4 port with aconnect(gui), as Clemens
recommended

Unless somebody demands it, I won't use

- aplaymidi, because there might be a problem: having the machine play all
instruments at the same time lacks the reference of previously recorded
instruments, which might have had a different timing fluctuation, or,
alternatively, it lacks the feel of a musician playing live

How should I do this:

- I should be able to get my Atari ST working. Its MIDI is good enough
for my needs. So, how should I use the Atari ST to make a
'timestamp' test, as somebody recommended?
The Atari e.g. could play four-on-the-floor at 120 BPM. I could watch the
received events with a Linux MIDI monitor or record them with a Linux MIDI
sequencer. But should I sync by MTC or anything else, etc.?

Cheers!

Ralf





Re: [LAD] Tests directly routing pc's midi-in to midi-out

2010-07-15 Thread Ralf Mardorf
PS:

 How should I do this:
 
 - I should be able to get my Atari ST working. Its MIDI is good enough
 for my needs. So, how should I use the Atari ST to make a
 'timestamp' test, as somebody recommended?
 The Atari e.g. could play four-on-the-floor at 120 BPM. I could watch the
 received events with a Linux MIDI monitor or record them with a Linux MIDI
 sequencer. But should I sync by MTC or anything else, etc.?


I guess openoctave supports only an ALSA MIDI sequencer, being based on
Rosegarden but without audio, hence jackd isn't needed?

Perhaps seq24 and others only need ALSA for MIDI, but JACK for audio ...
but making use of those isn't easy, and perhaps they don't offer MIDI sync
possibilities?

OTOH, MIDI sync by e.g. MTC might not be useful for this test?



Re: [LAD] Tests directly routing pc's midi-in to midi-out

2010-07-15 Thread Clemens Ladisch
Ralf Mardorf wrote:
 - instead of dev.hpet.max-user-freq=64 I'll try 1024 or 2048 as Robin
 mentioned

This parameter will not have any effect on anything because there is no
program that uses the HPET timers from userspace.  When high-resolution
timers are used by ALSA, this is done inside the kernel where there is
no limit on the maximum frequency.


Regards,
Clemens


Re: [LAD] Tests directly routing pc's midi-in to midi-out

2010-07-15 Thread Ralf Mardorf
On Thu, 2010-07-15 at 12:55 +0200, Clemens Ladisch wrote:
 Ralf Mardorf wrote:
  - instead of dev.hpet.max-user-freq=64 I'll try 1024 or 2048 as Robin
  mentioned
 
 This parameter will not have any effect on anything because there is no
 program that uses the HPET timers from userspace.  When high-resolution
 timers are used by ALSA, this is done inside the kernel where there is
 no limit on the maximum frequency.
 
 
 Regards,
 Clemens

IIRC someone on the jack-devel mailing list had issues when using mplayer
with the value 64, and it was solved by using the value 1024. But as I
mentioned before, for my USB MIDI there was a difference between the system
timer and the hr timer, but there was no difference between the values 64
and 1024 when using the hr timer.

Btw, I don't understand what a maximum frequency means in this context.
If 64 or 1024 should have an impact, what would be the result?

The system timer for a kernel-rt is set up to 1000 Hz and the hr timer is
at 10 Hz.

What is 'max-user-freq' for?

Cheers!

Ralf




Re: [LAD] Tests directly routing pc's midi-in to midi-out

2010-07-15 Thread Robin Gareus
On 07/15/2010 01:07 PM, Ralf Mardorf wrote:
 On Thu, 2010-07-15 at 12:55 +0200, Clemens Ladisch wrote:
 Ralf Mardorf wrote:
 - instead of dev.hpet.max-user-freq=64 I'll try 1024 or 2048 as Robin
 mentioned

 This parameter will not have any effect on anything because there is no
 program that uses the HPET timers from userspace. 

That'd be correct if Ralf stuck to 'amidiplay' and friends for his
tests.

There are a couple of audio tools that can use either the RTC or HPET for
timing, although most of them need an option to explicitly enable it.

 When high-resolution
 timers are used by ALSA, this is done inside the kernel where there is
 no limit on the maximum frequency.

Thanks for that explanation. It makes perfect sense.
I take it the same must be true for dev.rtc.max-user-freq as well.

BTW, do you know the unit of these values?
  cat /sys/class/rtc/rtc0/max_user_freq
  cat /proc/sys/dev/hpet/max-user-freq
Are they Hz?

linux-2.6/Documentation/hpet.txt does not mention it at all and
linux-2.6/Documentation/rtc.txt hints it's Hz but does not explicitly
say so.

 Regards,
 Clemens
 
 IIRC someone on the jack-devel mailing list had issues when using mplayer
 with the value 64, and it was solved by using the value 1024. But as I
 mentioned before, for my USB MIDI there was a difference between the system
 timer and the hr timer, but there was no difference between the values 64
 and 1024 when using the hr timer.
 
 Btw, I don't understand what a maximum frequency means in this context.
 If 64 or 1024 should have an impact, what would be the result?
 
 The system timer for a kernel-rt is set up to 1000 Hz and the hr timer is
 at 10 Hz.
 
 What is 'max-user-freq' for?

It limits the maximum frequency at which user-space applications can
[request to] receive wakeup calls from the hr-timer.

see also the Timers thread on LAD last November:
http://linuxaudio.org/mailarchive/lad/2009/11/7/161647
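
(To raise the limit, one would write to the same file as root; a hedged
example, not something from this thread, and the value is arbitrary:

  echo 1024 > /proc/sys/dev/hpet/max-user-freq

or the equivalent '... | sudo tee ...' construction mentioned earlier.)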

 Cheers!
 
 Ralf
 
 
best,
robin

-- 
Robin Gareus   mail: ro...@gareus.org
site: http://gareus.org/   chat: xmpp:rgar...@ik.nu
blog: http://rg42.org/ lab : http://citu.fr/

Public Key at http://pgp.mit.edu/
Fingerprint : 7107 840B 4DC9 C948 076D 6359 7955 24F1 4F95 2B42


Re: [LAD] Routing signals in mixer (Ardour, Qtractor)

2010-07-15 Thread Jeremy

Louigi Verona wrote:



On Wed, Jul 7, 2010 at 4:55 PM, Jeremy Jongepier jer...@autostatic.com wrote:


Ok, got it working in Qtractor:
- In Qtractor I've created four tracks with corresponding buses: Formant
Synth
(1 channel), Carrier Voice (1 channel), Vocoder (2 channels) and Vocoder
Mix (2 channels)
- All tracks output their signal to their corresponding buses,
except for Vocoder Mix which outputs to the Master output buses
- Then I've loaded the vocoder LADSPA plugin into the Vocoder track.
Input 1 of the Vocoder bus is the carrier and input 2 is the formant
- After that I've made the following connections in Qtractor's
Connections window:
http://linux.autostatic.com/images/2010-07/vocoder-connections.png



Wow!
Fantastic! You are THE man!
Tell me, why do you need Vocoder Mix? Why can't Vocoder output straight 
to Master Out?

I will try the above myself, of course, but just curious.


--
Louigi Verona
http://www.louigiverona.ru/


Hello Louigi,

You can output the Vocoder output to the Master outputs but then you 
cannot adjust the volume and panning within a single track.
Also, if you have multiple tracks with sidechaining like the vocoder 
example it can get quite inconvenient.


Best,

Jeremy


Re: [LAD] Routing signals in mixer (Ardour, Qtractor)

2010-07-15 Thread Louigi Verona
On Thu, Jul 15, 2010 at 3:48 PM, Jeremy autosta...@gmail.com wrote:

  Hello Louigi,

 You can output the Vocoder output to the Master outputs but then you cannot
 adjust the volume and panning within a single track.
 Also, if you have multiple tracks with sidechaining like the vocoder
 example it can get quite inconvenient.


 Best,

 Jeremy



Yeah, well, Jeremy - I could not reproduce what you did. It simply does not
work on my machine, and I am not sure why. If you read the mailing list, I
described in detail what result I got. Unfortunately, the LADSPA vocoder
does not seem to be usable within sequencers on Linux.



-- 
Louigi Verona
http://www.louigiverona.ru/


Re: [LAD] Routing signals in mixer (Ardour, Qtractor)

2010-07-15 Thread Jeremy



Louigi Verona wrote:
So I see. But I am still interested in how he did what he reported since 
I still cannot get

his scheme to work.





Sorry for the late replies,

The Vocoder track needs to be a 2 channel track, one input will be 
assigned to the carrier input of the vocoder LADSPA plugin and the other 
to the formant input. Input 1 of the Vocoder track/bus is the carrier, 
input 2 the formant.
It also doesn't matter if the sources for the carrier and formant are 
internal or external, in my case I connected Yoshimi to the Formant 
Synth in port, but you could also connect the output of a specific 
track. The same goes for the carrier, in my case it's the built-in mic 
(system capture), but it could also be the output of a specific track.
I think what is important is that you create separate buses for all your 
tracks and use these buses for both input and output on all your tracks 
except on the mix tracks. All tracks have to be monitored, not their 
inputs or outputs but the tracks themselves, so the middle part of the 
Qtractor Mixer panel.
Hope that helps a bit. I have access to only my netbook (I'm on 
vacation, long live Android and mobile internet) but maybe I can make 
some screenshots of how to set up the Qtractor Mixer panel.


Best,

Jeremy


Re: [LAD] Routing signals in mixer (Ardour, Qtractor)

2010-07-15 Thread Louigi Verona
On Thu, Jul 15, 2010 at 4:04 PM, Jeremy autosta...@gmail.com wrote:



 Louigi Verona wrote:

 So I see. But I am still interested in how he did what he reported since I
 still cannot get
 his scheme to work.


 


 Sorry for the late replies,

 The Vocoder track needs to be a 2 channel track, one input will be assigned
 to the carrier input of the vocoder LADSPA plugin and the other to the
 formant input. Input 1 of the Vocoder track/bus is the carrier, input 2 the
 formant.
 It also doesn't matter if the sources for the carrier and formant are
 internal or external, in my case I connected Yoshimi to the Formant Synth in
 port, but you could also connect the output of a specific track. The same
 goes for the carrier, in my case it's the built-in mic (system capture), but
 it could also be the output of a specific track.
 I think what is important is that you create separate buses for all your
 tracks and use these buses for both input and output on all your tracks
 except on the mix tracks. All tracks have to be monitored, not their inputs
 or outputs but the tracks themselves, so the middle part of the Qtractor
 Mixer panel.
 Hope that helps a bit. I have access to only my netbook (I'm on vacation,
 long live Android and mobile internet) but maybe I can make some screenshots
 of how to set up the Qtractor Mixer panel.

 Best,

 Jeremy



I did it exactly as you described. I got a mix of original sound and vocoded
sound.
You should recheck what you did and make sure that you get a clean vocoded
sound.


-- 
Louigi Verona
http://www.louigiverona.ru/


Re: [LAD] Routing signals in mixer (Ardour, Qtractor)

2010-07-15 Thread Jeremy

Louigi Verona wrote:

Okay, last message for today.

Eventually I could partially reproduce it. If I create a stereo Vocoder
track FIRST, then the later mono carrier and modifier tracks will channel
into Vocoder, provided I press monitor on it. But the output is a weird mix
of the vocoded signal and the modifier. If I swap the modifier and carrier
places, the signal does not change. Why track creation order would matter I
do not know. Rui?




The reason why you get a mixed signal could be that you use both
outputs of the Vocoder track. The vocoder LADSPA plugin outputs the dry
signal on the second output and the vocoded signal on the first output.
That's why I connected the first output of the Vocoder track to both
inputs of the Vocoder Mix track; this way you have faux-stereo and
better control over the output of the Vocoder track.


Also, when Vocoder does not receive anything, it starts to generate 
XRUNs, so if you close Qtractor while Vocoder is activated, Qtractor 
will freeze.




The more bands you select in the vocoder LADSPA plugin, the more CPU it 
takes, so bigger chance for xruns. I didn't thoroughly test this setup 
yet so it might be something else.


Best,

Jeremy


Re: [LAD] Routing signals in mixer (Ardour, Qtractor)

2010-07-15 Thread Louigi Verona
On Thu, Jul 15, 2010 at 4:16 PM, Jeremy autosta...@gmail.com wrote:

 Louigi Verona wrote:

 Okay, last message for today.

 Eventually I could partially reproduce it. If I create a stereo Vocoder track
 FIRST, then the later mono carrier and modifier tracks will channel into
 Vocoder, provided I press monitor on it. But the output is a weird mix of
 the vocoded signal and the modifier. If I swap the modifier and carrier
 places, the signal does not change. Why track creation order would matter I
 do not know. Rui?


 The reason why you get a mixed signal could be that you use both outputs
 of the Vocoder track. The vocoder LADSPA plugin outputs the dry signal on
 the second output and the vocoded signal on the first output. That's why I
 connected the first output of the Vocoder track to both inputs of the
 Vocoder Mix track; this way you have faux-stereo and better control over
 the output of the Vocoder track.



Ah, this could be the reason. I will recheck carefully. I think I tried
routing only one but I am not sure.





  Also, when Vocoder does not receive anything, it starts to generate XRUNs,
 so if you close Qtractor while Vocoder is activated, Qtractor will freeze.


 The more bands you select in the vocoder LADSPA plugin, the more CPU it
 takes, so bigger chance for xruns. I didn't thoroughly test this setup yet
 so it might be something else.



The LADSPA vocoder in JACK Rack has the same problem. Being already pretty
experienced in this regard (hehehe) I can say that this is because of an
effect whose name I forgot: when the plugin stops receiving a signal it
continues to try to process data, the calculations drift into really small
numbers, and those trigger a software-emulated path in your CPU that uses a
lot of resources. The developer should find a way to fix this somehow.





 Best,

 Jeremy



Thx Jeremy )

-- 
Louigi Verona
http://www.louigiverona.ru/


Re: [LAD] Routing signals in mixer (Ardour, Qtractor)

2010-07-15 Thread Jeremy

Louigi Verona wrote:



On Thu, Jul 15, 2010 at 4:16 PM, Jeremy autosta...@gmail.com wrote:


Louigi Verona wrote:

Okay, last message for today.

Eventually I could partially reproduce it. If I create a stereo
Vocoder track FIRST, then the later mono carrier and modifier
tracks will channel into Vocoder, provided I press monitor on
it. But the output is a weird mix of the vocoded signal and the
modifier. If I swap the modifier and carrier places, the signal
does not change. Why track creation order would matter I do not know. Rui?


The reason why you get a mixed signal could be that you use both
outputs of the Vocoder track. The vocoder LADSPA plugin outputs the
dry signal on the second output and the vocoded signal on the first
output. That's why I connected the first output of the Vocoder track
to both inputs of the Vocoder Mix track; this way you have
faux-stereo and better control over the output of the Vocoder track.



Ah, this could be the reason. I will recheck carefully. I think I tried 
routing only one but I am not sure.




I'll recheck too, I did bring all the necessary stuff on vacation (MIDI 
keyboard, external soundcard, mics, etc.) but didn't have the time to 
set it up yet.


Best,

Jeremy


Re: [LAD] Tests directly routing pc's midi-in to midi-out

2010-07-15 Thread Ralf Mardorf
On Thu, 2010-07-15 at 13:45 +0200, Robin Gareus wrote:
 On 07/15/2010 01:07 PM, Ralf Mardorf wrote:
  On Thu, 2010-07-15 at 12:55 +0200, Clemens Ladisch wrote:
  Ralf Mardorf wrote:
  - instead of dev.hpet.max-user-freq=64 I'll try 1024 or 2048 as Robin
  mentioned
 
  This parameter will not have any effect on anything because there is no
  program that uses the HPET timers from userspace. 
 
 That'd be correct if Ralf would stick to 'amidiplay' and friends for his
 tests.

Meanwhile at least one friend threw his Apple MacOS machine out of the
window. I'm not kidding. Modern computers + music + my friends = not good.
I've got two kinds of friends: the ones who say I shouldn't use a modern
computer, and the others who say I should buy Nuendo and expensive hardware.
And btw, I don't have 1000 friends, just a handful.
If I asked my neighbours to listen, they would be fine with bad timing,
given their habits, listening to the radio at prime time.

Anyway, I could ask people to listen, but I'm sure this would be
useless.

 There are a couple of audio-tools who can use either RTC or HPET for
 timing, although most of them need an option to explicitly enable it.
 
  When high-resolution
  timers are used by ALSA, this is done inside the kernel where there is
  no limit on the maximum frequency.
 
 Thanks for that explanation. It makes perfect sense.
 I take it the same must be true for dev.rtc.max-user-freq as well.
 
 BTW. Do you know the unit of these values?
   cat /sys/class/rtc/rtc0/max_user_freq
   cat /proc/sys/dev/hpet/max-user-freq
 are they Hz?
 
 linux-2.6/Documentation/hpet.txt does not mention it at all and
 linux-2.6/Documentation/rtc.txt hints it's Hz but does not explicitly
 say so.
 
  Regards,
  Clemens
  
  IIRC someone on jack-devel mailing list had issues when using mplayer
  with the value 64 and it was solved when using the value 1024. But as I
  mentioned before, for my USB MIDI there was a difference between system
  timer and hr timer, but there was no difference for the value 64 and
  1024, when using hr timer.
  
  Btw. I don't understand what a maximum frequency in the context does
  mean. If 64 or 1024 should have impact, what would be the result?
  
  System timer for a kernel-rt is set up to 1000Hz and hr timer is at
  10Hz.
  
  What is 'max-user-freq' for?
 
 It limits the maximum frequency at which user-space applications can
 [request to] receive wakeup calls from the hr-timer.
 
 see also the Timers thread on LAD last November:
 http://linuxaudio.org/mailarchive/lad/2009/11/7/161647

Okay, whatever user-space might be, I assume from the explanations that
it's unimportant for rt. And as mentioned in the text of the link, I can
confirm that at least my USB MIDI is better when using the hr timer.

I'll test amidiplay, but again, playing all MIDI instruments at the same
time isn't the major problem, except perhaps for me, i.e. for people with
'golden ears'.

The major issue is staying in sync with audio recordings of MIDI
instruments while making audio recordings of other, or the same,
instruments again.

I e.g. only have one DX7 and one Matrix-1000, and my TG33 (with vector
control) is broken, hence I can't use all its outputs, only one stereo
output of the TG33.

I wish to be able to record a synth several times. This means that even
inaudible jitter will become audible in a production.

Of course I won't record one drum after the other from any of my drum
modules, because that would just result in more noise, but recording the
snare or kick separately from other drum sounds is needed when doing the
mastering on Linux and using a compressor like JAMin.

The accumulation of jitter, caused by recording one instrument after the
other, is breaking the groove.

 
  Cheers!
  
  Ralf
  
  
 best,
 robin

:)

Ralf




Re: [LAD] Tests directly routing pc's midi-in to midi-out

2010-07-15 Thread Robin Gareus
On 07/15/2010 02:45 PM, Ralf Mardorf wrote:
 On Thu, 2010-07-15 at 13:45 +0200, Robin Gareus wrote:
 On 07/15/2010 01:07 PM, Ralf Mardorf wrote:
 On Thu, 2010-07-15 at 12:55 +0200, Clemens Ladisch wrote:
 Ralf Mardorf wrote:
 - instead of dev.hpet.max-user-freq=64 I'll try 1024 or 2048 as Robin
 mentioned

 This parameter will not have any effect on anything because there is no
 program that uses the HPET timers from userspace. 

 That'd be correct if Ralf would stick to 'amidiplay' and friends for his
 tests.
 
 Meanwhile at least one friend threw his Apple MacOS machine out of the
 window. I'm not kidding. Modern computers + music + my friends = not good.
 I've got two kinds of friends: the ones who say I shouldn't use a modern
 computer, and the others who say I should buy Nuendo and expensive hardware.
 And btw, I don't have 1000 friends, just a handful.
 If I asked my neighbours to listen, they would be fine with bad timing,
 given their habits, listening to the radio at prime time.

Took me a while to catch that drift. I meant friends of 'amidiplay'
(eg 'arecordmidi', 'aseqdump', etc), with the aim of keeping things simple
for testing.

... and friends is an English idiom; sorry to put you off track.

 Anyway, I could ask people to listen, but I'm sure this would be
 useless.
 
 There are a couple of audio-tools who can use either RTC or HPET for
 timing, although most of them need an option to explicitly enable it.

 When high-resolution
 timers are used by ALSA, this is done inside the kernel where there is
 no limit on the maximum frequency.

 Thanks for that explanation. It makes perfect sense.
 I take it the same must be true for dev.rtc.max-user-freq as well.

 BTW. Do you know the unit of these values?
   cat /sys/class/rtc/rtc0/max_user_freq
   cat /proc/sys/dev/hpet/max-user-freq
 are they Hz?

 linux-2.6/Documentation/hpet.txt does not mention it at all and
 linux-2.6/Documentation/rtc.txt hints it's Hz but does not explicitly
 say so.

 Regards,
 Clemens

 IIRC someone on the jack-devel mailing list had issues when using mplayer
 with the value 64, and it was solved by using the value 1024. But as I
 mentioned before, for my USB MIDI there was a difference between the system
 timer and the hr timer, but there was no difference between the values 64
 and 1024 when using the hr timer.

 Btw, I don't understand what a maximum frequency means in this context.
 If 64 or 1024 should have an impact, what would be the result?

 The system timer for a kernel-rt is set up to 1000 Hz and the hr timer is
 at 10 Hz.

 What is 'max-user-freq' for?

 It limits the maximum frequency at which user-space applications can
 [request to] receive wakeup calls from the hr-timer.

 see also the Timers thread on LAD last November:
 http://linuxaudio.org/mailarchive/lad/2009/11/7/161647
 
 Okay, whatever user-space might be,

It's the opposite of kernel-space :)
http://en.wikipedia.org/wiki/User_space

 I assume from the explanations that
 it's unimportant for rt. And as mentioned in the text of the link, I can
 confirm that at least my USB MIDI is better when using the hr timer.
 
 I'll test amidiplay, but again, playing all MIDI instruments at the same
 time isn't the major problem, except perhaps for me, i.e. for people with
 'golden ears'.

Let's try something very simple:

ONE)
- Take a very simple midi-file.
- Use 'aplaymidi' to send it from your PC to your external midi drum
machine.
- Use your drum-synth's click or woodblock sound, or sth very dry with a
good attack.
- Do a quick ear-test: can you hear midi-jitter?
- Capture the audio output of the external midi-synth with arecord.

Rewind and do it again.
Post the two files and the used .mid on a web-server or some drop-box.

If you want to check for yourself: align the first beat of the captured
audio files in a multi-track audio-editor. While it will involve a bit
of manual labor to quantify the jitter, it'll at least be solid evidence.


TWO)
- Launch an external MIDI metronome (eg your Atari ST).
- Connect it to your PC.
- Use 'arecordmidi' to generate a .mid file from it.

Repeat at least once and post the two midi files somewhere for us to
download.


THREE)
  midi-metronome -> PC -> external-synth -> audio

This combines ONE+TWO. Just use 'aconnect[gui]' to use
your PC as MIDI-THRU, and 'arecord' to record the audio that comes
from your drum-synth.

Note, at the moment we're not interested in latency, but in jitter.
The ticks in the recorded audio file _should be_ at identical intervals
+- jitter.
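
A hedged sketch of what ONE could look like on the command line (the hw:0
device and the 20:0 port number are hypothetical; check 'aplaymidi -l' and
'arecord -l' for the real ones):

  aplaymidi -l                            # list available MIDI output ports
  arecord -D hw:0 -f cd -d 30 take1.wav & # capture 30 s from the sound card
  aplaymidi -p 20:0 click.mid             # send the test file to the synth

and for TWO:

  arecordmidi -p 20:0 capture1.mid        # record incoming MIDI to a file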

The output of
  cat /proc/asound/timers
and
  cat /proc/asound/version
might be interesting

Is that about right? Comments anyone?

 The major issue is staying in sync with audio recordings of MIDI
 instruments while making audio recordings of other, or the same,
 instruments again.
 
 I e.g. only have one DX7 and one Matrix-1000, and my TG33 (with vector
 control) is broken, hence I can't use all its outputs, only one stereo
 output of the TG33.
 
 I wish to be able to record a synth several times. This 

Re: [LAD] controlling access to the sound card

2010-07-15 Thread Jason Butler
Thanks for your feedback. The system my application is going to run on 
will not include PulseAudio, so I would have to do it with just ALSA and 
the OS.


On 10-07-14 02:25 PM, Ralf Mardorf wrote:


On Wed, 2010-07-14 at 15:02 -0500, Jason Butler wrote:
   

I have three applications that want to use the sound card: two audio
stream players and a VoIP phone.
I want to set up Linux so that if a call comes in on the phone, the OS
will disconnect the audio players, give exclusive access to the VoIP
phone, and then, when the phone is done, reconnect the audio players to
the sound card.

How can this be done?
 

According to the German Wiki, PulseAudio should be able to handle stuff
like this. While PulseAudio is a PITA for many Linux audio production
users, IIUC PulseAudio should help for needs like yours. Of course you
have to code apps, or use apps, that were programmed for usage with
PulseAudio.

I've got no knowledge about this!!, but if PulseAudio shouldn't be able
to handle this kind of thing, then I wonder what its advantages are
supposed to be.

Unfortunately the English Wiki isn't as informative as the German one
... I didn't read the complete German Wiki anyway.

- Ralf

   




Re: [LAD] Tests directly routing pc's midi-in to midi-out

2010-07-15 Thread Clemens Ladisch
Robin Gareus wrote:
 On 07/15/2010 01:07 PM, Ralf Mardorf wrote:
 On Thu, 2010-07-15 at 12:55 +0200, Clemens Ladisch wrote:
 Ralf Mardorf wrote:
 dev.hpet.max-user-freq

 This parameter will not have any effect on anything because there is no
 program that uses the HPET timers from userspace. 
 
 That'd be correct if Ralf would stick to 'amidiplay' and friends for his
 tests.
 
 There are a couple of audio-tools who can use either RTC or HPET for
 timing, although most of them need an option to explicitly enable it.

Jack can read the current time from /dev/hpet, but it does not use it to
generate interrupts.  As far as I know, there is no program that does.

 BTW. Do you know the unit of these values?
   cat /sys/class/rtc/rtc0/max_user_freq
   cat /proc/sys/dev/hpet/max-user-freq
 are they Hz?

Yes.

 IIRC someone on the jack-devel mailing list had issues when using mplayer
 with the value 64, and it was solved by using the value 1024.

This has nothing to do with MIDI timing; mplayer can use the RTC (not
HPET) for audio/video synchronization to work around the 100 Hz limit of
the system timer on old Linux kernels.


Regards,
Clemens


Re: [LAD] ALSA MIDI latency test results are far away from reality

2010-07-15 Thread Niels Mayer
On Wed, Jul 14, 2010 at 6:23 AM, Ralf Mardorf
ralf.mard...@alice-dsl.net wrote:
 For the MIDI audio test's LXDE session I'm running GNOME terminal tab 1
 JACK2, tab 2 Qtractor. Evolution offline + this Email opened for writing
 to the lists. Envy24 Control and nothing else.

 For Qtractor I switched from queue timer (resolution) 'system timer
 (1000 Hz)' to 'HR timer (10 Hz)' and restarted Qtractor.

For your prior failures, are you sure you had the correct soundcard
setup as the MIDI queue timer?
This caused me some issues in the past:
http://old.nabble.com/the-defacto-host-for-lv2-tt28789821.html#a28793953

Now my Qtractor View->Options->MIDI->Playback->Queue Timer is
permanently set to HR timer, which avoids such issues when switching
soundcards.

You may want to check this setting prior to running tests... I also
found this thread helpful w/r/t explaining timer issues in MIDI:
http://old.nabble.com/Is-ALSA-using-hrtimer--ts28699922.html

-- Niels


Re: [LAD] Routing signals in mixer (Ardour, Qtractor)

2010-07-15 Thread Harry Van Haaren
 The LADSPA vocoder in JACK Rack has the same problem. Being already pretty
 experienced in this regard (hehehe) I can say that this is because of an
 effect whose name I forgot: when the plugin stops receiving a signal it
 continues to try to process data, the calculations drift into really small
 numbers, and those trigger a software-emulated path in your CPU that uses a
 lot of resources. The developer should find a way to fix this somehow.


You're thinking of denormal numbers.
http://en.wikipedia.org/wiki/Denormal_number

A nice quick fix for these is:
if ( number < 0.1 )
  number = 0.0;

;-)

-Harry


Re: [LAD] Routing signals in mixer (Ardour, Qtractor)

2010-07-15 Thread James Morris
On 15 July 2010 18:23, Harry Van Haaren harryhaa...@gmail.com wrote:

 The LADSPA vocoder in JACK Rack has the same problem. Being already pretty
 experienced in this regard (hehehe) I can say that this is because of an
 effect whose name I forgot: when the plugin stops receiving a signal it
 continues to try to process data, the calculations drift into really small
 numbers, and those trigger a software-emulated path in your CPU that uses a
 lot of resources. The developer should find a way to fix this somehow.

 You're thinking of denormal numbers.
 http://en.wikipedia.org/wiki/Denormal_number

 A nice quick fix for these is:
 if ( number < 0.1 )
   number = 0.0;

 ;-)

Don't forget about negative numbers ;-)

n = ((n > 0.0 && n < 0.1) || (n < 0.0 && n > -0.1)) ? 0.0 : n;


Re: [LAD] Routing signals in mixer (Ardour, Qtractor)

2010-07-15 Thread Harry Van Haaren
  A nice quick fix for these is:
  if ( number < 0.1 )
    number = 0.0;
 
  ;-)

 Don't forget about negative numbers ;-)

 n = ((n > 0.0 && n < 0.1) || (n < 0.0 && n > -0.1)) ? 0.0 : n;


I stand corrected :-)
Cheers, -Harry


Re: [LAD] Routing signals in mixer (Ardour, Qtractor)

2010-07-15 Thread Louigi Verona
Great. Maybe I should just contact the author of the vocoder and ask him. In
fact, I'll go write an email now.

L.V.


Re: [LAD] Transport issue for Qtractor - has an impact on the jitter issue

2010-07-15 Thread Devin Anderson
On Thu, Jul 15, 2010 at 1:15 AM, Ralf Mardorf
ralf.mard...@alice-dsl.net wrote:

 Devin and some others want to know if the drum module is played before
 FluidSynth DSSI is played.

 JACK2 doesn't start with -p4096, so I started JACK2 with -Rch -dalsa
 -dhw:0 -r44100 -p2048 -n2 to increase the unwanted effect, to get a
 clear result.

 Without recording, it's clearly audible that the external drum module
 is always played before FluidSynth DSSI, which is plausible given
 the audio latency (a stupid idea to use such high latency ;), so
 there's a need to do an audio test with lower latency, to see if
 jitter might change which instrument is played first.

As we discussed yesterday, I don't think this is due to jitter.  I
think this is due to the fact that the (necessary) latency imposed by
JACK on audio is not imposed on ALSA MIDI.

The MIDI path to your soft-synth probably looks something like this:

MIDI controller -> MIDI in port -> ALSA -> Fluidsynth

... after which Fluidsynth generates audio that travels a path that's
something like this:

Fluidsynth -> Jack -> audio out ports

Jack will impose a certain amount of latency on the audio before it's
output to the audio out ports.  In the case above, I think it would be
around 90 milliseconds or so.  There might be some other, minor
latency (i.e. internal buffering, etc.), but IDK, as I haven't seen
the code.
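
(To make the 90 ms figure concrete: with the -r44100 -p2048 -n2 settings
quoted above, the buffering alone is 2 periods x 2048 frames / 44100
frames per second, which is about 93 ms.)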

The MIDI path to your external drum module probably looks something like this:

MIDI controller -> MIDI in port -> ALSA -> MIDI out port ->
external drum module

... after which the drum module generates audio that travels a path
that's something like this:

drum module -> drum module out ports

The drum module imposes some sort of latency, but it's clearly not as
high as the latency imposed by JACK with the settings above.  Note
that nothing is routed through JACK on the way to the drum module.

Solving this problem is complex.  It's unlikely that your drum module
allows you to specify its latency.  AFAIK, you can't directly specify
the latency of ALSA MIDI ports, but I have very little experience with
ALSA and could be totally wrong.

If Fluidsynth supports JACK MIDI, then you might solve this problem by
routing your MIDI through JACK using the ALSA 'seq' or 'raw'
interfaces.  MIDI that runs through JACK is subject to the same
latency as audio up until it reaches the MIDI drivers, after which
it's up to the driver to sync audio and MIDI if the MIDI port isn't
subject to the same latency as audio.  I know that the FFADO driver in
JACK 2 syncs MIDI closely with audio.  After looking over the ALSA
drivers, I'm pretty sure they try to sync MIDI with audio too by
adding a delay of `nframes` to the time that the MIDI message goes out
(I could be wrong, as the code for the ALSA MIDI drivers is a bit
complicated).
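
(A hedged example of what that could look like, with settings similar to
those used earlier in this thread; -X selects jackd's ALSA MIDI driver,
'seq' or 'raw':

  jackd -R -d alsa -d hw:0 -r 44100 -p 256 -n 2 -X seq

Fluidsynth would then connect via JACK MIDI ports instead of the ALSA
sequencer, if it supports that.)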

-- 
Devin Anderson
devin (at) charityfinders (dot) com

CharityFinders - http://www.charityfinders.com/
synthclone - http://synthclone.googlecode.com/


[LAD] Denormals, Re: Routing signals in mixer (Ardour, Qtractor)

2010-07-15 Thread Tim Goetze
[James Morris]

On 15 July 2010 18:23, Harry Van Haaren harryhaa...@gmail.com wrote:
 http://en.wikipedia.org/wiki/Denormal_number

 A nice quick fix for these is:
 if ( number < 0.1)
   number = 0.0;

 ;-)

Don't forget about negative numbers ;-)

n = ((n > 0.0 && n < 0.1) || (n < 0.0 && n > -0.1)) ? 0.0 : n;

I can't seem to remember the most sensible numeric limit for this 
branching variant of denormal removal -- besides, it obviously depends 
on the context in question and the use of floats versus doubles (as 
does the use of fabsf versus fabs in what follows) -- but I am quite 
positive that the following line, which replaces the above one, is not 
only easier to read, but also stands a good chance of executing 
quicker on most contemporary CPUs due to the elimination of at least 
one branching instruction:

n = fabs(n) < 1e-9 ? 0 : n;
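
Spelled out as a self-contained helper (an illustration of the same
idea; the 1e-9 threshold is an assumption, as noted above it depends on
the context):

#include <math.h>

/* Flush near-denormal values to zero with one comparison instead of
 * the four comparisons in the branching variant above. */
static inline float flush_denormal(float n)
{
    return fabsf(n) < 1e-9f ? 0.0f : n;
}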

Cheers, Tim
___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-dev


[LAD] There is nice way to improve all stand-alone MIDI-manager software

2010-07-15 Thread Mike Cookson
Hello all developers. Apologies in advance for my rough English.

All stand-alone instruments, processors and other modules controlled through 
MIDI, as you know, currently have a serious disadvantage compared to audio 
plugins: the user must remember all MIDI parameter numbers and, sometimes, 
values (e.g. Aeolus stop switching) in order to control them via MIDI.

It would be nice for all MIDI-controlled software to be able to send out the 
corresponding MIDI parameter changes when the user changes them in the 
software's native GUI. For example, when the user toggles several stops in 
Aeolus, it would send the appropriate MIDI messages, so that when they are 
sent back into the instrument, they toggle the same stops.
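
A sketch of how a GUI callback could echo such a change as a MIDI
Control Change through the ALSA sequencer (illustrative only; it
assumes the client and port were already created with snd_seq_open()
and snd_seq_create_simple_port(), and omits error handling):

#include <alsa/asoundlib.h>

static void send_cc(snd_seq_t *seq, int port, int channel, int cc, int value)
{
    snd_seq_event_t ev;

    snd_seq_ev_clear(&ev);
    snd_seq_ev_set_source(&ev, port);
    snd_seq_ev_set_subs(&ev);        /* deliver to all subscribers */
    snd_seq_ev_set_direct(&ev);      /* bypass the queue, send now */
    snd_seq_ev_set_controller(&ev, channel, cc, value);

    snd_seq_event_output(seq, &ev);
    snd_seq_drain_output(seq);
}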

Users could then use stand-alone software as easily as audio plugins: Aeolus, 
Yoshimi/ZynAddSubFX (not sure all parameters are available via the GUI), 
Phasex, jack-rack, fst and vsthost (dssi-vst), rakarrack, and many others.
Kokkiniza! ;)
___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-dev


[LAD] Midi tempo sync'd live drum machine

2010-07-15 Thread Nathanael Anderson
Here is the setup i'm looking to do:

midi master -> special program
ddr pad -> special program
special program -> hydrogen

I've got a hardware midi device controlling the master tempo. aseqdump shows
the tempo messages x times a second.

I want to either find or make a program so I can use a ddr pad (usb gamepad)
and assign different timed midi patterns to the buttons, so for example:
if the program is configured for 4/4 time, when I hold down or toggle a
button

if note 36 is a bass drum and 41 is a snare then you'd have a basic drum and
snare beat when both buttons are held down or toggled, of course i'd like to
be able to have multiple patterns bound per button as well

button one is set to play a quarter note midi note 36, then rest for the
remaining 3 beats
1 - 36,quarter|rest,quarter|36,quarter|rest,quarter
2 -
41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|rest,eighth

Each pattern would describe one measure, and the flow of a live solo
performance could easily be changed on the fly, leaving hands free to play
guitar.

Anything out there like this, and I just don't know about it?

If I don't find anything, I want to hack something together, so a primer on
MIDI tempo sync and how to program with it would be appreciated.
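
(For reference, the core of MIDI beat clock is small: a 0xF8 tick is
sent 24 times per quarter note, framed by 0xFA start, 0xFB continue and
0xFC stop, and the tempo follows from the tick spacing. A sketch of the
arithmetic, illustrative rather than code from this thread:)

#include <stdio.h>

#define TICKS_PER_QUARTER 24.0

/* seconds_per_tick is the measured interval between two 0xF8 bytes */
static double bpm_from_tick_interval(double seconds_per_tick)
{
    return 60.0 / (TICKS_PER_QUARTER * seconds_per_tick);
}

int main(void)
{
    /* at 120 BPM a tick arrives every 60/(24*120) s, about 20.8 ms */
    printf("%.1f BPM\n", bpm_from_tick_interval(0.0208333));
    return 0;
}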

Thanks,
Nathanael
___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-dev


Re: [LAD] Midi tempo sync'd live drum machine

2010-07-15 Thread James Morris
On 15 July 2010 20:27, Nathanael Anderson wirelessdrea...@gmail.com wrote:

 If I don't find anything, I want to hack something together, so a primer on
 MIDI tempo sync and how to program with it would be appreciated.

Hi Nathanael,

I'm using JACK MIDI which makes it quite easy (compared with designing
the program architecture) to output MIDI events.

I'm also trying to use JACK transport to a) sync with other timebase
master programs, b) to act as a simple timebase master.

The JACK timebase and transport documentation focuses on using JACK to
accomplish these tasks.
see here:  http://jackaudio.org/files/docs/html/transport-design.html
There is also a transport example in the examples directory within the
source code for JACK.
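
For a transport slave, the polling side is small; a minimal sketch of
querying position and tempo with the plain JACK transport API
(illustrative, not the example shipped with JACK):

#include <stdio.h>
#include <jack/jack.h>
#include <jack/transport.h>

/* Poll transport state, typically once per process() cycle. */
static void report(jack_client_t *client)
{
    jack_position_t pos;
    jack_transport_state_t state = jack_transport_query(client, &pos);

    if (state == JackTransportRolling && (pos.valid & JackPositionBBT))
        printf("bar %d beat %d tick %d at %.2f BPM\n",
               pos.bar, pos.beat, pos.tick, pos.beats_per_minute);
}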

But these documents do not go into the issues you'll come across when
trying to accurately synchronize your application with another, or when
accurately acting as timebase master (i.e. you've recorded some audio
sequenced by your MIDI program acting as timebase master, but it does
not line up when imported into a DAW with the same tempo/meter).

Beyond this, you'll have to look into other software source code
(AFAIK). But it's far from trivial.

Cheers,
James.
___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-dev


Re: [LAD] Midi tempo sync'd live drum machine

2010-07-15 Thread Ralf Mardorf
On Thu, 2010-07-15 at 21:14 +0100, James Morris wrote:
 On 15 July 2010 20:27, Nathanael Anderson wirelessdrea...@gmail.com wrote:
 
  If I don't find anything, I want to hack something together, so a primer on
  MIDI tempo sync and how to program with it would be appreciated.
 
 Hi Nathanael,
 
 I'm using JACK MIDI which makes it quite easy (compared with designing
 the program architecture) to output MIDI events.
 
 I'm also trying to use JACK transport to a) sync with other timebase
 master programs, b) to act as a simple timebase master.
 
 The JACK timebase and transport documentation focuses on using JACK to
 accomplish these tasks.
 see here:  http://jackaudio.org/files/docs/html/transport-design.html
 There is also a transport example in the examples directory within the
 source code for JACK.
 
 But these documents do not go into the issues you'll come across when
 trying to accurately synchronize your application with another, or when
 accurately acting as timebase master (i.e. you've recorded some audio
 sequenced by your MIDI program acting as timebase master, but it does
 not line up when imported into a DAW with the same tempo/meter).
 
 Beyond this, you'll have to look into other software source code
 (AFAIK). But it's far from trivial.
 
 Cheers,
 James.

Perhaps the source code for Qtractor could help.

I've got timing issues, but they have nothing to do with JACK transport.
I guess (didn't use it) JACK transport is okay, with the limits Rui
mentioned today. Usually Rui does good programming.

 Forwarded Message 
From: Rui Nuno Capela rn...@rncbc.org
To: Ralf Mardorf ralf.mard...@alice-dsl.net
Cc: Robin Gareus ro...@gareus.org, Devin Anderson
de...@charityfinders.com, qtractor-devel
qtractor-de...@lists.sourceforge.net, 64studio-devel
64studio-de...@lists.64studio.com, Linux-Audio-Dev
linux-audio-dev@lists.linuxaudio.org
Subject: Re: [LAD] Transport issue for Qtractor - has impact to the
jitter issue
Date: Thu, 15 Jul 2010 09:33:53 +0100

On Thu, 15 Jul 2010 10:15:53 +0200, Ralf Mardorf wrote:
 Transport issue for Qtractor - has impact to the jitter issue
 

what transport issue? oh i see... :)

qtractor has in fact a known trouble with jack transport sync when slaving
to period/buffer sizes greater than 2048 frames. it skids. maybe that's
exactly what you're experiencing when running jackd -p4096. 

i'll recommend for you to turn jack transport mode to none or master
only, never slave nor full on those situations (in
view/options.../audio/jack Transport).

another recommendation would be to grab the latest from svn trunk where
the said trouble has been already mitigated :)

cheers

___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-dev


Re: [LAD] Midi tempo sync'd live drum machine

2010-07-15 Thread Ralf Mardorf
On Thu, 2010-07-15 at 14:27 -0500, Nathanael Anderson wrote:
 Here is the setup i'm looking to do:
 
 midi master -> special program
 ddr pad -> special program
 special program -> hydrogen
 
 I've got a hardware midi device controlling the master tempo. aseqdump
 shows the tempo messages x times a second.
 
 I want to either find or make a program so I can use a ddr pad (usb
 gamepad) and assign different timed midi patterns to the buttons, so
 for example:
 if the program is configured for 4/4 time, when I hold down or toggle
 a button
 
 if note 36 is a bass drum and 41 is a snare then you'd have a basic
 drum and snare beat when both buttons are held down or toggled, of
 course i'd like to be able to have multiple patterns bound per button
 as well
 
 button one is set to play a quarter note midi note 36, then rest for
 the remaining 3 beats
 1 - 36,quarter|rest,quarter|36,quarter|rest,quarter
 2 - 41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|rest,eighth|
 41,eighth|rest,eighth
 
 Each pattern would describe one measure, and the flow of a live solo
 performance could easily be changed on the fly, leaving hands free to
 play guitar.
 
 Anything out there like this, and I just don't know about it?

Perhaps a MIDI control change command could switch between different
drum patterns for a soft drum module like Hydrogen?

IIUC you aren't searching for something like JACK transport, you just
wish to switch between drum patterns.

___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-dev


Re: [LAD] Midi tempo sync'd live drum machine

2010-07-15 Thread Nathanael Anderson
Hydrogen accepting a CC or PC, along with tempo sync, would actually be the
perfect solution.

On Thu, Jul 15, 2010 at 3:58 PM, Ralf Mardorf ralf.mard...@alice-dsl.netwrote:

 On Thu, 2010-07-15 at 14:27 -0500, Nathanael Anderson wrote:
  Here is the setup i'm looking to do:
 
  midi master -> special program
  ddr pad -> special program
  special program -> hydrogen
 
  I've got a hardware midi device controlling the master tempo. aseqdump
  shows the tempo messages x times a second.
 
  I want to either find or make a program so I can use a ddr pad (usb
  gamepad) and assign different timed midi patterns to the buttons, so
  for example:
  if the program is configured for 4/4 time, when I hold down or toggle
  a button
 
  if note 36 is a bass drum and 41 is a snare then you'd have a basic
  drum and snare beat when both buttons are held down or toggled, of
  course i'd like to be able to have multiple patterns bound per button
  as well
 
  button one is set to play a quarter note midi note 36, then rest for
  the remaining 3 beats
  1 - 36,quarter|rest,quarter|36,quarter|rest,quarter
  2 - 41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|rest,eighth|
  41,eighth|rest,eighth
 
  Each pattern would describe one measure, and the flow of a live solo
  performance could easily be changed on the fly, leaving hands free to
  play guitar.
 
  Anything out there like this, and I just don't know about it?

 Perhaps a MIDI control change command could switch between different
 drum patterns for a soft drum module like Hydrogen?

 IIUC you aren't searching for something like JACK transport, you just
 wish to switch between drum patterns.


___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-dev


Re: [LAD] Midi tempo sync'd live drum machine

2010-07-15 Thread Ralf Mardorf
On Thu, 2010-07-15 at 22:58 +0200, Ralf Mardorf wrote:
 On Thu, 2010-07-15 at 14:27 -0500, Nathanael Anderson wrote:
  Here is the setup i'm looking to do:
  
  midi master -> special program
  ddr pad -> special program
  special program -> hydrogen
  
  I've got a hardware midi device controlling the master tempo. aseqdump
  shows the tempo messages x times a second.
  
  I want to either find or make a program so I can use a ddr pad (usb
  gamepad) and assign different timed midi patterns to the buttons, so
  for example:
  if the program is configured for 4/4 time, when I hold down or toggle
  a button
  
  if note 36 is a bass drum and 41 is a snare then you'd have a basic
  drum and snare beat when both buttons are held down or toggled, of
  course i'd like to be able to have multiple patterns bound per button
  as well
  
  button one is set to play a quarter note midi note 36, then rest for
  the remaining 3 beats
  1 - 36,quarter|rest,quarter|36,quarter|rest,quarter
  2 - 41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|rest,eighth|
  41,eighth|rest,eighth
  
  Each pattern would describe one measure, and the flow of a live solo
  performance could easily be changed on the fly, leaving hands free to
  play guitar.
  
  Anything out there like this, and I just don't know about it?
 
 Perhaps a MIDI control change command could switch between different
 drum patterns for a soft drum module like Hydrogen?
 
 IIUC you aren't searching for something like JACK transport, you just
 wish to switch between drum patterns.

E.g.

pattern one does
36,quarter|rest,quarter|36,quarter|rest,quarter

pattern two does
41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|
rest,eighth|41,eighth|rest,eighth

and pattern three does
36,quarter|rest,quarter|36,quarter|rest,quarter
+
41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|
rest,eighth|41,eighth|rest,eighth

hence you only need to program an app that switches between drum patterns.
Perhaps the drum machine you use can switch patterns via a CC command.
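
A sketch of parsing the pattern notation used above into note/length
steps (illustrative; the duration names handled and the "note 0 means
rest" convention are assumptions, not something from this thread):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

typedef struct { int note; double beats; } Step;   /* note 0 = rest */

static double beats_for(const char *len)
{
    if (strcmp(len, "quarter") == 0) return 1.0;
    if (strcmp(len, "eighth")  == 0) return 0.5;
    return 0.0;                       /* unknown duration */
}

static int parse_pattern(char *pattern, Step *out, int max)
{
    int n = 0;
    for (char *tok = strtok(pattern, "|"); tok != NULL && n < max;
         tok = strtok(NULL, "|")) {
        char *comma = strchr(tok, ',');
        if (comma == NULL)
            continue;                 /* skip malformed tokens */
        *comma = '\0';
        out[n].note  = strcmp(tok, "rest") ? atoi(tok) : 0;
        out[n].beats = beats_for(comma + 1);
        n++;
    }
    return n;
}

int main(void)
{
    char pat[] = "36,quarter|rest,quarter|36,quarter|rest,quarter";
    Step steps[16];
    int n = parse_pattern(pat, steps, 16);
    for (int i = 0; i < n; i++)
        printf("note %d for %.2f beats\n", steps[i].note, steps[i].beats);
    return 0;
}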

___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-dev


Re: [LAD] Midi tempo sync'd live drum machine

2010-07-15 Thread Nathanael Anderson
And I've already got code for gamepad-to-CC/PC lying around, but Hydrogen
is Qt, which I'm quite unfamiliar with =|

On Thu, Jul 15, 2010 at 4:02 PM, Ralf Mardorf ralf.mard...@alice-dsl.netwrote:

 On Thu, 2010-07-15 at 22:58 +0200, Ralf Mardorf wrote:
  On Thu, 2010-07-15 at 14:27 -0500, Nathanael Anderson wrote:
   Here is the setup i'm looking to do:
  
   midi master -> special program
   ddr pad -> special program
   special program -> hydrogen
  
   I've got a hardware midi device controlling the master tempo. aseqdump
   shows the tempo messages x times a second.
  
   I want to either find or make a program so I can use a ddr pad (usb
   gamepad) and assign different timed midi patterns to the buttons, so
   for example:
   if the program is configured for 4/4 time, when I hold down or toggle
   a button
  
   if note 36 is a bass drum and 41 is a snare then you'd have a basic
   drum and snare beat when both buttons are held down or toggled, of
   course i'd like to be able to have multiple patterns bound per button
   as well
  
   button one is set to play a quarter note midi note 36, then rest for
   the remaining 3 beats
   1 - 36,quarter|rest,quarter|36,quarter|rest,quarter
   2 - 41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|rest,eighth|
   41,eighth|rest,eighth
  
   Each pattern would describe one measure, and the flow of a live solo
   performance could easily be changed on the fly, leaving hands free to
   play guitar.
  
   Anything out there like this, and I just don't know about it?
 
  Perhaps a MIDI control change command could switch between different
  drum patterns for a soft drum module like Hydrogen?
 
  IIUC you aren't searching for something like JACK transport, you just
  wish to switch between drum patterns.

 E.g.

 pattern one does
 36,quarter|rest,quarter|36,quarter|rest,quarter

 pattern two does
 41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|
 rest,eighth|41,eighth|rest,eighth

 and pattern three does
 36,quarter|rest,quarter|36,quarter|rest,quarter
 +
 41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|
 rest,eighth|41,eighth|rest,eighth

 hence you only need to program an app that switches between drum patterns.
 Perhaps the drum machine you use can switch patterns via a CC command.


___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-dev


Re: [LAD] Midi tempo sync'd live drum machine

2010-07-15 Thread Ralf Mardorf
On Thu, 2010-07-15 at 16:03 -0500, Nathanael Anderson wrote:
 and i've already got code for gamepad to cc/pc laying around. but
 hydrogen is qt, which i'm quite unfamiliar with =|

Alessandro Cominu (Comix's blog)
[co...@users.sourceforge.net]
Maintainer, main coder (http://www.hydrogen-music.org/?p=authors)

I'm sure they will have a sympathetic ear for your needs.

Perhaps you should include 'Hydrogen' in the email subject for LAD.

- Ralf

PS: Capability for freestyle improvisation is much harder than what I
expect from a PC. I guess you wish to do more than just choose between a
handful of drum patterns while you're playing the guitar ;)?!

 On Thu, Jul 15, 2010 at 4:02 PM, Ralf Mardorf
 ralf.mard...@alice-dsl.net wrote:
 
 On Thu, 2010-07-15 at 22:58 +0200, Ralf Mardorf wrote:
  On Thu, 2010-07-15 at 14:27 -0500, Nathanael Anderson wrote:
   Here is the setup i'm looking to do:
  
   midi master -> special program
   ddr pad -> special program
   special program -> hydrogen
  
   I've got a hardware midi device controlling the master
 tempo. aseqdump
   shows the tempo messages x times a second.
  
   I want to either find or make a program so I can use a ddr
 pad (usb
   gamepad) and assign different timed midi patterns to the
 buttons, so
   for example:
   if the program is configured for 4/4 time, when I hold
 down or toggle
   a button
  
   if note 36 is a bass drum and 41 is a snare then you'd
 have a basic
   drum and snare beat when both buttons are held down or
 toggled, of
   course i'd like to be able to have multiple patterns bound
 per button
   as well
  
   button one is set to play a quarter note midi note 36,
 then rest for
   the remaining 3 beats
   1 - 36,quarter|rest,quarter|36,quarter|rest,quarter
   2 - 41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|
 rest,eighth|
   41,eighth|rest,eighth
  
   Each pattern would describe one measure, and the flow of a
 live solo
   performance could easily be changed on the fly, leaving
 hands free to
   play guitar.
  
   Anything out there like this, and I just don't know about
 it?
 
  Perhaps a MIDI control change command could switch between
 different
  drum patterns for a soft drum module like Hydrogen?
 
  IIUC you aren't searching for something like JACK transport,
 you just
  wish to switch between drum patterns.
 
 
 E.g.
 
 pattern one does
 36,quarter|rest,quarter|36,quarter|rest,quarter
 
 
 pattern two does
 41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|
 rest,eighth|41,eighth|rest,eighth
 
 
 and pattern three does
 36,quarter|rest,quarter|36,quarter|rest,quarter
 
 +
 41,eighth|rest,eighth|41,eighth|rest,eighth|41,eighth|
 rest,eighth|41,eighth|rest,eighth
 
 
 hence you only need to program an app that switches between drum
 patterns.
 Perhaps the drum machine you use can switch patterns via a CC
 command.


___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-dev


Re: [LAD] Midi tempo sync'd live drum machine

2010-07-15 Thread David
On Fri, 16 Jul 2010 00:14:50 +0200
Ralf Mardorf ralf.mard...@alice-dsl.net wrote:

 [too much]

Ralf,

Please, please, please, refrain from posting so much. Please, please,
please, take a good breath of thinking each time you're about to post
on LAD. You are welcome to post. Sometimes, a lot of posts for
debugging purposes is fine, as long as you stick to what those who know
(thanks and kudos to them) ask of you, say a log file or the result of
simple test cases you'd have to run on your system.

Thanks.

PS: If you really wish to reply to this email, reply to me, and
NOT to the list. Please, please, please.

-- David
___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-dev


Re: [LAD] FIxed alsa-tools' envy24control missing peak level meters and Reset Peaks

2010-07-15 Thread Niels Mayer
On Tue, Jul 13, 2010 at 9:54 PM, Niels Mayer nielsma...@gmail.com wrote:
 As a second patch (coming soon), I've rewritten  the meters in a more
 sensible fashion, drawing a single rectangle to represent the
 instantaneous level (2-3 X-primitive draws per meter total and one
 blit, versus hundreds of draws and a blit per value change per meter
 in the original code).
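
The idea in that paragraph reduces to something like this (a sketch of
the approach, not the actual patch; the GCs and geometry are assumed to
be set up by the caller):

#include <X11/Xlib.h>

/* One meter update = two filled rectangles: the unlit track above the
 * level, and the lit portion below it. */
static void draw_meter(Display *dpy, Drawable win, GC gc_bg, GC gc_level,
                       int x, int y, int w, int h, double level)
{
    int lit = (int)(h * level);      /* level in 0.0 .. 1.0 */
    if (lit > h)
        lit = h;

    XFillRectangle(dpy, win, gc_bg,    x, y,           w, h - lit);
    XFillRectangle(dpy, win, gc_level, x, y + h - lit, w, lit);
}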

Patch for https://bugzilla.redhat.com/show_bug.cgi?id=602903
(see also 
http://old.nabble.com/FIxed-alsa-tools'-envy24control-missing-peak-level-meters-and-Reset-Peaks-ts29144830.html
)

(1) http://nielsmayer.com/npm/Screenshot-Efficient-Meters-Envy24Control.png
 * To see what the new meters look like.
(2) http://nielsmayer.com/npm/Efficient-Meters-Envy24Control.tgz
 * Contains levelmeters.c and x86_64 binary 'envy24control' that should at
   least work on Fedora 12 and OpenSuse and other 2.6.32-based distros.
(3) http://nielsmayer.com/npm/Efficient-Meters-Envy24Control.patch
 * To apply the patch, grab the most recent stable release (
   ftp://ftp.alsa-project.org/pub/tools/alsa-tools-1.0.23.tar.bz2 ) or
   git pull from trunk of the alsa-tools project.
 * After unpacking and assuming you've got the patch in
   ~/Efficient-Meters-Envy24Control.patch, do:
cd alsa-tools-1.0.23
cat ~/Efficient-Meters-Envy24Control.patch | patch -p1
  * It should give the message "patching file envy24control/levelmeters.c".
  * Follow the directions to compile alsa-tools.

FYI here's what my top processes look like when running a test to
output individual streams to all 10 PCM output channels -- note X
consumes between 1.7% and 2.0% and envy24control between 0.7% and
1.0%:

15210 npm   20   0  643m  14m 8964 S  6.6  0.4   0:05.85 gst123
15184 npm   20   0  643m  14m 8984 S  6.3  0.4   0:13.11 gst123
15190 npm   20   0  643m  14m 8980 S  6.0  0.4   0:12.48 gst123
15172 npm   20   0  643m  14m 8968 S  5.6  0.4   0:15.42 gst123
15178 npm   20   0  643m  14m 8984 S  5.6  0.4   0:14.17 gst123
13684 root  20   0  527m 112m  31m S  2.0  2.8   3:26.11 X
13923 npm   20   0  863m  60m  37m S  1.0  1.5   2:49.08 plasma-desktop
14163 npm   20   0  597m  27m  15m S  1.0  0.7   1:08.29 chrome
14155 npm   20   0 1038m 170m  14m S  0.7  4.3   1:00.87 chrome
15226 npm   20   0  192m 9316 6908 S  0.7  0.2   0:00.51 envy24control

Here's the envy24control from my first patch, using the original
meters that cause many separate XDrawRectangles for each LED-looking
segment. The performance difference is quite noticeable as the fans
start running louder and the system load climbs upwards as soon as the
original envy24control starts running: X consumes 5.7-10% CPU, and
envy24control between 2.0% and 2.7%.

15172 npm   20   0  643m  14m 8968 S  6.1  0.4   0:53.83 gst123
15178 npm   20   0  643m  14m 8984 S  6.1  0.4   0:51.48 gst123
15190 npm   20   0  643m  14m 8980 S  6.1  0.4   0:49.60 gst123
15210 npm   20   0  643m  14m 8964 S  6.1  0.4   0:44.01 gst123
13684 root  20   0  527m 112m  31m S  5.7  2.8   3:42.78 X
15184 npm   20   0  643m  14m 8984 S  5.7  0.4   0:51.07 gst123
15398 npm   20   0  192m 9332 6908 S  2.4  0.2   0:02.32 envy24control.f
14163 npm   20   0  597m  27m  15m S  1.0  0.7   1:15.54 chrome
13923 npm   20   0  863m  60m  37m S  0.6  1.5   2:55.04 plasma-desktop

Just to show that it's the same performance as the original
envy24control from alsa-tools-1.0.22-1.1.fc12.ccrma.x86_64:

15178 npm   20   0  643m  14m 8984 S  6.3  0.4   1:28.72 gst123
15190 npm   20   0  643m  14m 8980 S  6.3  0.4   1:26.65 gst123
15210 npm   20   0  643m  14m 8964 S  6.3  0.4   1:20.59 gst123
15184 npm   20   0  643m  14m 8984 S  6.0  0.4   1:28.21 gst123
13684 root  20   0  527m 112m  31m R  5.6  2.8   4:21.30 X
15172 npm   20   0  643m  14m 8968 S  5.6  0.4   1:31.51 gst123
15455 npm   20   0  192m 8700 6316 S  2.3  0.2   0:01.74 envy24control
14163 npm   20   0  597m  27m  15m S  1.3  0.7   1:23.02 chrome
13923 npm   20   0  863m  60m  37m S  0.7  1.5   3:00.72 plasma-desktop

-- Niels
http://nielsmayer.com
___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-dev