Hello all, Hello Scott,

I’ve followed this thread with great interest because I also have some 
personal musical projects on different ARM platforms, including the RPi2 
(and I’m also looking at the NXP/Freescale i.MX6Q and the Rockchip RK3288). 
Like you, Scott, these are projects to keep me busy when I retire (which 
will happen soon…). And I also did some FPGA work at one point during my 
professional career.

I’m not using JACK, for the same reasons as you. JACK is a layer above an 
underlying audio/MIDI OS layer (usually ALSA, but not necessarily) which, 
if I’ve understood correctly, was designed to provide synchronous 
multi-client capability. ALSA itself is not fully multi-client. For 
instance, you cannot send MIDI to the same device from two processes. I 
think (but am not 100% sure) that you also cannot use the same audio 
device from different processes.
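
To illustrate what I mean with MIDI, here is a minimal sketch using the 
rawmidi API (untested, and the port name "hw:1,0,0" is just an example, 
adjust it to your interface). The second open of the same hardware port 
is expected to be refused with -EBUSY:

#include <stdio.h>
#include <alsa/asoundlib.h>

int main(void)
{
    snd_rawmidi_t *out1 = NULL, *out2 = NULL;
    const char *port = "hw:1,0,0";   /* hypothetical port name */
    int err;

    /* first client grabs the output port */
    err = snd_rawmidi_open(NULL, &out1, port, 0);
    if (err < 0) {
        fprintf(stderr, "first open failed: %s\n", snd_strerror(err));
        return 1;
    }

    /* a second writer on the same hw port should fail with -EBUSY */
    err = snd_rawmidi_open(NULL, &out2, port, 0);
    if (err < 0)
        fprintf(stderr, "second open refused: %s\n", snd_strerror(err));
    else
        snd_rawmidi_close(out2);   /* would be surprising on a hw port */

    snd_rawmidi_close(out1);
    return 0;
}

(Build with gcc -lasound. Going through the ALSA sequencer instead of 
rawmidi does allow several clients, which is partly why layers like the 
sequencer or JACK exist at all.)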

Back to your point.

How did you measure the 1/3 ms latency? (15 frames at 44100 Hz is about 
0.34 ms, so I assume that figure is the period length rather than an 
end-to-end measurement?)
Would you be willing to share your modified pcm.c example?
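
For reference, my understanding of the "direct write" transfer method in 
the stock pcm.c example is roughly the loop below (a condensed, untested 
sketch, not your modified version; the rendering code and all parameter 
setup are omitted). Since it never returns into a callback, I can see why 
it pins cleanly to an isolated core:

#include <alsa/asoundlib.h>

/* 'pcm' must already be opened, configured with
   SND_PCM_ACCESS_MMAP_INTERLEAVED and prepared. */
static void direct_write_loop(snd_pcm_t *pcm, snd_pcm_uframes_t period_size)
{
    const snd_pcm_channel_area_t *areas;
    snd_pcm_uframes_t offset, frames;
    snd_pcm_sframes_t avail;

    for (;;) {
        avail = snd_pcm_avail_update(pcm);
        if (avail < 0) {
            snd_pcm_recover(pcm, avail, 0);  /* xrun/suspend recovery */
            continue;
        }
        if ((snd_pcm_uframes_t)avail < period_size) {
            if (snd_pcm_state(pcm) == SND_PCM_STATE_PREPARED)
                snd_pcm_start(pcm);          /* kick the stream off */
            else
                snd_pcm_wait(pcm, -1);       /* or busy-poll instead */
            continue;
        }
        frames = period_size;
        snd_pcm_mmap_begin(pcm, &areas, &offset, &frames);
        /* ... render 'frames' frames into areas[] at 'offset' ... */
        snd_pcm_mmap_commit(pcm, offset, frames);
    }
}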

For my own work, I did some measurements a couple of years ago on a 
standard PC equipped with a Pentium G3420 processor and running Linux 
(Ubuntu). I used a scope with one channel on a gate synchronised with a 
MIDI note-on message, and the other channel on the audio output taken 
from the headphone connector of the on-board audio interface.

The PC was connected to the MIDI source via a USB port, and my ALSA code 
uses the "interrupt" mode, i.e. the user code waits to be woken up, 
theoretically, at each period. My PCM device settings were a 44100 Hz 
sample rate, a 64-frame period, and a 4-period buffer (so 256 frames). 
With this setup I measured on the scope a latency from the external MIDI 
gate to the analog audio output of between 3.5 ms and 6.5 ms (which is 
good enough for musical applications). This kind of measurement includes 
nearly all the sources of latency, including the latency added by the 
output DAC (and this DAC latency can be high depending on the internal 
DAC design; on an AK4490 it reaches nearly 30 sample periods, which is 
around 680 microseconds at 44.1 kHz!). The one latency source not 
included in my measurement is the time the external MIDI equipment needs 
to generate the gate and the MIDI note message from the key press.
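
For completeness, the hw_params setup behind those figures looked roughly 
like the sketch below (simplified and untested; the S16_LE stereo format 
is an assumption here, and error checks are omitted). The comments carry 
the arithmetic: one period is 64/44100 ~= 1.45 ms, the full buffer is 
256/44100 ~= 5.8 ms, and adding USB MIDI transport plus ~0.68 ms of DAC 
latency makes the measured 3.5-6.5 ms range plausible.

#include <alsa/asoundlib.h>

static int set_hw_params(snd_pcm_t *pcm)
{
    snd_pcm_hw_params_t *hw;
    unsigned int rate = 44100;       /* one frame = 1/44100 s ~= 22.7 us */
    snd_pcm_uframes_t period = 64;   /* 64/44100  ~= 1.45 ms per period  */
    unsigned int periods = 4;        /* 256/44100 ~= 5.8 ms total buffer */
    int dir = 0;

    snd_pcm_hw_params_alloca(&hw);
    snd_pcm_hw_params_any(pcm, hw);
    snd_pcm_hw_params_set_access(pcm, hw, SND_PCM_ACCESS_RW_INTERLEAVED);
    snd_pcm_hw_params_set_format(pcm, hw, SND_PCM_FORMAT_S16_LE);
    snd_pcm_hw_params_set_channels(pcm, hw, 2);
    snd_pcm_hw_params_set_rate_near(pcm, hw, &rate, &dir);
    snd_pcm_hw_params_set_period_size_near(pcm, hw, &period, &dir);
    snd_pcm_hw_params_set_periods_near(pcm, hw, &periods, &dir);
    return snd_pcm_hw_params(pcm, hw);
}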


> On 05 Feb 2016, at 00:28, Scott Gravenhorst <music.ma...@gte.net> wrote:
> 
> to: Paul Stoffregen
> 
> I've done quite some work with the Microchip dsPIC33FJ128.  It is a 
> microcontroller capable of 40 MIPS and it has some real DSP instructions that 
> cause parallelism within the dsPIC.  Has a built in 16 bit stereo DAC.  I've 
> been able to make a 12 voice Karplus-Strong MIDI synth with pitch bend and 
> other features.  So yeah, microcontrollers can be a lot of fun and I've done 
> several synths using dsPIC (home-made boards on stripboard).  I've also done 
> the same sort of stuff using FPGAs.  After playing with the Rpi2, I'm not so 
> interested in the zero because it would need too much "stuff" added to it for 
> my liking.  (I hate using an SD card for a filesystem - I've added a portable 
> USB HD to my Rpi2 because [please suppress your laughter] I actually develop 
> on it).
> 
> to: Theo
> 
> I saw your suggestion regarding JACK and I'm wondering "why?".  The 
> documentation on ALSA says that using ALSA provides the tightest interface to 
> the audio driver.  I'm currently investigating JACK and I see that it uses 
> ALSA to talk to the driver anyway, so I wonder what I need JACK for?  I do 
> realize that JACK allows interconnection with other audio apps, but that is 
> not my interest.  My interest is to (hopefully) write a high performance 
> audio application for the Rpi2 (I'm retired and need something to keep me off 
> the streets).  So far, I've had some good progress.  Messing with ALSA 
> transfer methods, I've discovered (by hitting the wall face-first) that my 
> best option for using isolated cores is to NOT use async, but rather to use 
> direct write.  Direct write uses a sort of polling paradigm and the mmap 
> system to talk directly to the DMA RAM and since it's not a callback, it can 
> easily be stuck to an isolated core.  Using the example program pcm.c 
> (modified to use threads), I've been able to run the direct write loop on an 
> isolated core and got 45 "voices" of sine production with a period size of 15 
> frames at SR 44100 (I calculated some 1/3 millisecond of latency).
> 
> I may not have said in my original post - For this system, I'm stripping the 
> Linux to bare bones, no unnecessary services, no X-windows (headless) etc.  
> No USB MIDI is supported (I use the hardware UART port for that).  So this 
> code will not be "nice" to regular Linux users.  As such, it's only for me.  
> I have no intention of connecting this synth to other applications via JACK, 
> my plan is to do effects myself (again, keeps me off the street).  Not sure 
> if it will all work, but so far, things are working as I expect.
> 
> Since I won't be needing JACK connectivity - why would I want to program with 
> JACK instead of ALSA when it seems ALSA is used by JACK anyway?  I read 
> something about the API being "nicer", but as so often happens, this was not 
> detailed in a meaningful way.
> 
> Am I missing something about JACK?
> 
> -- Scott Gravenhorst
> 

_______________________________________________
dupswapdrop: music-dsp mailing list
music-dsp@music.columbia.edu
https://lists.columbia.edu/mailman/listinfo/music-dsp
