Re: [LAD] User eXperience in Linux Audio

2015-04-24 Thread Thorsten Wilms

On 23.04.2015 22:59, Fons Adriaensen wrote:

And in the case I mentioned (flight deck displays and user interfaces)
we are talking about *specialists* in ergonomics who have conducted
not one but a series of studies and experiments involving a large
group of *expert* users and costing tons of money. And the result is
quite different. So whom do you think I should believe?


Writing a letter sitting safely at a desk leads to slightly different 
requirements for a UI than piloting an airplane ...


You do not seriously believe common aspects of mainstream desktop 
environments and core applications like the behavior of radio buttons, 
checkboxes, menus, dialogs and so on came to be without many rounds of 
research and refinement, do you?


There may admittedly be a problem with cargo-cult guideline writing, 
copying without taking first principles into account. Plus the people 
now working at Microsoft, Apple or Gnome and KDE are at risk of 
forgetting some of the things the GUI pioneers already understood.


Now in intensity and information load, applications like Blender or 
Ardour may come closer to a cockpit than a spreadsheet application does. 
But I guess the glass cockpits (just the screens) are not meant for 
direct manipulation, which surely influences the design. A centralized 
pure display combined with a shitload of buttons and doodads does not 
lend itself as a model for a multi-purpose computer UI.



--
Thorsten Wilms

thorwil's design for free software:
http://thorwil.wordpress.com/


Re: [LAD] User eXperience in Linux Audio

2015-04-24 Thread Thorsten Wilms

On 23.04.2015 21:55, Len Ovens wrote:


That is why being able to adjust with both horizontal and vertical
movement is a plus. Take a look at zita-mu1 for an example. It is also
important to continue watching the position of the mouse when it leaves
the application window.


Yes. Though if the linear knob happens to be too close to a corner of the 
screen, parts of both the vertical and the horizontal range may be 
off-screen. Changing direction forces you to spend attention instead of 
relying on autonomous movement trained by repetition.


With pointer-based usage, you can allow the pointer to go beyond the 
edge. Some 3D applications will have the pointer appear on the other 
side, as if it traveled through a portal. But with touch, you are out of 
luck: you have to move the active area and allow the finger to be repositioned.


I think in many cases, horizontal sliders with labels and numerical 
values inside the slider area are the better approach.



--
Thorsten Wilms

thorwil's design for free software:
http://thorwil.wordpress.com/


Re: [LAD] [alsa-devel] Fw: Using loopback card to Connect GSM two way call to the real sound card UDA1345TS

2015-04-24 Thread Clemens Ladisch
Srinivasan S wrote:
> could you please provide me some sample application links without
> using the dshare plugin, i.e., using the two channels (left & right)
> directly

I am not aware of any (sample) program that does something like this
(except maybe Jack, but floating-point samples would not be appropriate
for your application).

You have to implement this yourself.


Regards,
Clemens


Re: [LAD] [alsa-devel] Fw: Using loopback card to Connect GSM two way call to the real sound card UDA1345TS

2015-04-24 Thread Clemens Ladisch
Srinivasan S wrote:
> did you mean that, if I use the Jack plugin, does it resolve this
> problem (i.e., does the CPU consumption reduce drastically compared to
> dshare)

It is unlikely that running your two programs on top of Jack will use
less CPU than with dshare.  However, I don't know the details of your
architecture, so the only way to find out would be for you to try it.


Regards,
Clemens


Re: [LAD] User eXperience in Linux Audio

2015-04-24 Thread Len Ovens

On Fri, 24 Apr 2015, Thorsten Wilms wrote:

I think in many cases, horizontal sliders with labels and numerical values 
inside the slider area are the better approach.


Like knobs, sliders can be done right or wrong too. Pick up a handy 
Android device for examples of wrong (in audio applications). I think it 
comes down to the ways of doing things available on Android, because even 
with applications made by a company that also does a PC version, the 
Android version is not as good.


In my opinion the best slider will allow the pointing device (finger or 
mouse) to be placed anywhere on the slider, and moving it will move the 
value from where it was, in the direction the finger moves (the Ardour 
fader, for example, but lots get this right). A sketch of this behavior 
follows below.


The next best (the best that can be done on Android, it seems) is that the 
value will not move until you pass the current value.


The third best is that the value will not move unless the mouse or finger 
first touches at the current value.


Fourth best is having the value jump to where you first put the mouse or 
finger.


The worst one looks like the second best... that is, putting the mouse on the 
slider has no effect until it reaches the current value... but because 
the slider control "looks like" a real fader knob, the value first jumps 
in the opposite direction to the one the mouse/finger is moving, as soon as 
the mouse/finger touches the graphic of the fader knob, rather than waiting 
till the finger is at the middle of the fader knob. This one is useless.
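
A minimal sketch of that first, relative-drag behavior (plain C++; the 
PointerEvent struct and DragSlider class are invented for illustration, 
not any particular toolkit's API):

#include <algorithm>
#include <iostream>

// Hypothetical pointer event, in widget coordinates.
struct PointerEvent { double x; double y; };

// Relative-drag slider: grabbing it anywhere never jumps the value;
// only subsequent movement changes it, starting from the current value.
class DragSlider {
public:
    DragSlider(double min, double max, double value, double pixelsPerUnit)
        : min_(min), max_(max), value_(value), scale_(1.0 / pixelsPerUnit) {}

    void press(const PointerEvent& e)  { lastX_ = e.x; dragging_ = true; }
    void release(const PointerEvent&)  { dragging_ = false; }

    void move(const PointerEvent& e) {
        if (!dragging_) return;
        // The value changes by the *delta* since the last event, so it
        // always starts from where it was, in the direction of movement.
        value_ = std::clamp(value_ + (e.x - lastX_) * scale_, min_, max_);
        lastX_ = e.x;
    }

    double value() const { return value_; }

private:
    double min_, max_, value_, scale_;
    double lastX_ = 0.0;
    bool   dragging_ = false;
};

int main() {
    DragSlider gain(-60.0, 6.0, 0.0, 4.0);  // 4 pixels per dB
    gain.press({100, 10});                  // grab anywhere: no jump
    gain.move({80, 10});                    // drag 20 px to the left
    std::cout << gain.value() << " dB\n";   // prints -5
    return 0;
}

The same delta logic maps directly onto touch, where press() is the first 
contact and move() is the finger being tracked.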


While horizontal faders can use less space (Ardour plugins use this), they 
become less "stage usable" real quick.


In the end, for stage use a hardware controller seems best.

--
Len Ovens
www.ovenwerks.net



Re: [LAD] User eXperience in Linux Audio

2015-04-24 Thread Harry van Haaren
On Fri, Apr 24, 2015 at 2:26 PM, Len Ovens  wrote:
> In my opinion the best slider will allow the pointing device (finger or
> mouse) to be placed anywhere on the slider, and moving it will move the
> value from where it was, in the direction the finger moves (the Ardour fader,
> for example, but lots get this right).

+1. Jumps in output value should be avoided, but interaction should
occur immediately on movement - this is the only choice that satisfies
both criteria.

-- 

http://www.openavproductions.com


[LAD] GuitarSynth improved pitch detector

2015-04-24 Thread Gerald
Hi guys,
I've improved the pitch detector in GuitarSynth so that when using a
plectrum the frequency doesn't jerk around during attack phases.
Plans for this weekend are to move it to falktx's DPF; let's hope that
goes well.
Gerald


Re: [LAD] User eXperience in Linux Audio

2015-04-24 Thread Fons Adriaensen
On Fri, Apr 24, 2015 at 09:47:16AM +0200, Thorsten Wilms wrote:
 
> Writing a letter sitting safely at a desk leads to slightly
> different requirements for a UI than piloting an airplane ...

Certainly. But mixing a live show or controlling a complex
broadcast setup would be more similar.
 
> You do not seriously believe common aspects of mainstream desktop
> environments and core applications like the behavior of radio
> buttons, checkboxes, menus, dialogs and so on came to be without
> many rounds of research and refinement, do you?

No, but I know very few apps that use them correctly, despite
all the guidelines. Even the simplest things often go wrong.
Consider a button that toggles between 'stop' and 'play'. Does
it show the current state of the player, or the one you get
when you click on it? It's a similar situation with 'slider switches'
which show 'on' or 'off' on the flat part. If you have no other
feedback, the state of the button or slider gives you a very
ambiguous hint at best. To remain in the flight deck context,
imagine such a widget being used to control your landing gear... 
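
One way to sidestep that ambiguity (a sketch only; the class and member 
names below are invented, not any toolkit's API) is to model the control so 
that the UI always shows the current *state* and the *action* a click 
performs as two separate pieces of information:

#include <iostream>
#include <string>

// A transport toggle modeled so the UI can always show *both*
// the current state and the action a click will perform.
class TransportToggle {
public:
    bool playing() const { return playing_; }

    // What the button's label should say: always the *action*.
    std::string actionLabel() const { return playing_ ? "Stop" : "Play"; }

    // What a separate indicator (LED, colour) should show: always the *state*.
    std::string stateIndicator() const { return playing_ ? "PLAYING" : "STOPPED"; }

    void click() { playing_ = !playing_; }

private:
    bool playing_ = false;
};

int main() {
    TransportToggle t;
    // The label says what clicking does; the indicator says what happens now.
    std::cout << "[" << t.stateIndicator() << "] button: " << t.actionLabel() << "\n";
    t.click();
    std::cout << "[" << t.stateIndicator() << "] button: " << t.actionLabel() << "\n";
    return 0;
}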

Checkboxes are another common problem: they all look the same.
There's no hint at all whether they control something essential or
some irrelevant detail.

Returning to audio, how many apps do you know where the rotary
or linear controls will assure you at a glance, without having
to read text, that they are set to approximately the value you
expect? Even in audio you often have to preset things before
they become active, in other words before you have any feedback
apart from the widgets themselves.
 
> There may admittedly be a problem with cargo-cult guideline writing,
> copying without taking first principles into account. Plus the
> people now working at Microsoft, Apple or Gnome and KDE are at risk
> of forgetting some of the things the GUI pioneers already
> understood.

Not only at risk...
 
> Now in intensity and information load, applications like Blender or
> Ardour may come closer to a cockpit than a spreadsheet application
> does. But I guess the glass cockpits, just the screens, are not
> meant for direct manipulation, which surely influences the design.

Yes, the screens are display only, but the actual controls are
designed in the same way. There's a lot of subliminal hinting
everywhere. For example, on the FCU (the 'autopilot') you have
rotary controls to set your target airspeed, heading and altitude.
They are the same size and color, but they all feel different.
Just by touching one of them you know if you have the right one,
without looking or thinking.

Ciao,

-- 
FA

A world of exhaustive, reliable metadata would be an utopia.
It's also a pipe-dream, founded on self-delusion, nerd hubris
and hysterically inflated market opportunities. (Cory Doctorow)



Re: [LAD] User eXperience in Linux Audio

2015-04-24 Thread Fons Adriaensen
On Fri, Apr 24, 2015 at 01:14:08AM +0200, t...@trellis.ch wrote:
 
> The only point i'd challenge is that "play around a bit" isn't useful. I
> even think if developers don't do it themselves, it's absolutely necessary
> that users do it. If you're too focused on stuff that should work, you
> won't find out all the stuff that doesn't. And finding that out in a
> non-playing around session isn't fun, so better play beforehand :)

Agreed, playing around to get to know your equipment or software
is a good thing to do, certainly if later you have to use it in
potentially stressful circumstances. Any professional will do that,
and probably won't mind if things don't work immediately and he
or she has to consult some documentation or configure something.
But it's not that sort of playing around that I referred to.

There is a class of users (non-users would be more correct) that
will 'play around' with everything they can get their hands on
without having any intention to really learn anything. Just for
the fun of it, to kill some time and because it's free anyway.
Since they have no real interest in whatever they are playing with,
they will give up immediately when they don't get instant results
or have to think for longer than a second. Those are also the ones
that will complain about 'bad user eXperience'. As said before,
I'm not interested in that type of user.

Ciao,

-- 
FA

A world of exhaustive, reliable metadata would be an utopia.
It's also a pipe-dream, founded on self-delusion, nerd hubris
and hysterically inflated market opportunities. (Cory Doctorow)



Re: [LAD] GuitarSynth

2015-04-24 Thread Albert Graef
Hi Gerald,

Cool project, I'm looking forward to giving it a try. :)

On Thu, Apr 23, 2015 at 7:24 AM, Gerald  wrote:

> definitely, but that comes with the cost of extra hardware (pickup,
> 6-channel soundcard). I would build that into GuitarSynth if I had that gear.
> But I'm also rather interested in multipitch out of one signal. It's just
> more convenient too
>

Polyphonic pitch detection is much more involved and requires more advanced
algorithms which are computationally intensive and thus hard to perform in
real-time.

Commercial closed-source software like Melodyne can do this, at least in
off-line processing. AFAICT, the latest Melodyne versions also do some
real-time processing, but I haven't used Melodyne for some time and so I
don't know how well that works.

I'm not sure either whether there is any good open-source code for this
available yet; maybe others can provide corresponding links. But here are
some relevant answers from Stack Overflow and Stack Exchange:

http://stackoverflow.com/questions/9613768/multiple-pitch-detection-fft-or-other/9626849#9626849

http://dsp.stackexchange.com/questions/11433/polyphonic-detection-mulit-pitch-detection-chord-recognition

Also, here's an interesting recent DAFx paper on doing polyphonic pitch
detection using autocorrelation:

http://www.dafx14.fau.de/papers/dafx14_sebastian_kraft_polyphonic_pitch_detectio.pdf

And then there's the work of Anssi Klapuri and others at Tampere University:

http://www.cs.tut.fi/~klap/iiro/

Also, there are algorithms for doing spectrum estimation such as filter
diagonalization methods (FDM) and the classical Prony algorithm, but due to
their complexity these probably aren't well-suited for real-time processing
either (the Prony algorithm also suffers from numerical instabilities
IIRC), and you still have to do the partitioning of the overtone series
afterwards.

There's surely more, but that's what I could find with a quick Google
search or remember off the top of my head.
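
For orientation, here's the monophonic autocorrelation building block that
the polyphonic methods above extend (a rough sketch of my own, not taken
from any of the linked papers, and nowhere near sufficient for chords on
its own):

#include <cmath>
#include <cstddef>
#include <iostream>
#include <vector>

// Very rough monophonic f0 estimate by normalized autocorrelation.
double estimateF0(const std::vector<float>& x, double sampleRate,
                  double fMin = 60.0, double fMax = 1000.0)
{
    const std::size_t lagMin = static_cast<std::size_t>(sampleRate / fMax);
    const std::size_t lagMax = static_cast<std::size_t>(sampleRate / fMin);
    if (x.size() <= lagMax) return 0.0;        // not enough samples

    double bestCorr = 0.0;
    std::size_t bestLag = 0;
    for (std::size_t lag = lagMin; lag <= lagMax; ++lag) {
        double corr = 0.0, norm = 0.0;
        for (std::size_t n = 0; n + lag < x.size(); ++n) {
            corr += x[n] * x[n + lag];         // similarity at this lag
            norm += x[n] * x[n];               // energy for normalization
        }
        const double r = (norm > 0.0) ? corr / norm : 0.0;
        if (r > bestCorr) { bestCorr = r; bestLag = lag; }
    }
    return bestLag ? sampleRate / static_cast<double>(bestLag) : 0.0;
}

int main() {
    const double fs = 44100.0, f = 110.0;      // A2 test tone
    const double pi = 3.14159265358979323846;
    std::vector<float> buf(4096);
    for (std::size_t n = 0; n < buf.size(); ++n)
        buf[n] = static_cast<float>(std::sin(2.0 * pi * f * n / fs));
    std::cout << "estimated f0: " << estimateF0(buf, fs) << " Hz\n";
    return 0;
}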

Hope this helps,
Albert

-- 
Dr. Albert Gräf
Computer Music Research Group, JGU Mainz, Germany
Email: aggr...@gmail.com
WWW:   https://plus.google.com/+AlbertGraef


Re: [LAD] User eXperience in Linux Audio

2015-04-24 Thread Tim E. Real
On April 24, 2015 10:04:36 AM Thorsten Wilms wrote:
> On 23.04.2015 21:55, Len Ovens wrote:
> > That is why being able to adjust with both horizontal and vertical
> > movement is a plus. Take a look at zita-mu1 for an example. It is also
> > important to continue watching the position of the mouse when it leaves
> > the application window.
> 
> Yes. Though if the linear knob happens to be too close to a corner of the
> screen, parts of both the vertical and the horizontal range may be
> off-screen. Changing direction forces you to spend attention instead
> of relying on autonomous movement trained by repetition.
> 
> With pointer-based usage, you can allow the pointer to go beyond the
> edge. Some 3D applications will have the pointer appear on the other
> side, as if it traveled through a portal. But with touch, you are out of
> luck: you have to move the active area and allow the finger to be repositioned.
> 
> I think in many cases, horizontal sliders with labels and numerical
> values inside the slider area are the better approach.

Hey guys.
For anyone wrestling with control design and the mouse pointer 
 being too close to the screen edge, there *IS* an attractive 
 technique you might not be aware of or may have forgotten.

If you use a trackball, it is heaven!
Er... but if it's a touch pad, as mentioned ye may be lifting yer finger.

It involves hiding the mouse pointer and forcing it to do tricks.

1: Upon receiving a mousePress event, immediately *hide* the mouse pointer.

2: Now force the invisible mouse to the *centre* of the screen (to give the 
 mouse pointer plenty of 'headroom' before it would ever reach an edge).

3: Now upon reception of a mouseMove event, do something useful
 with the increment given by subtracting the new position from the
 centre-screen position. (Add the given increment to some object's position 
 or value, edit-box value etc.)

4: Now immediately *force* the mouse pointer back to the centre of the screen. 

5: Repeat 3: and 4: until mouseRelease event is received.

6: Now turn the mouse pointer back on.  Done.

This technique can be used for example with:
Linear-movement knobs and dials, and both vertical and horizontal sliders.
Canvases (2D and 3D!).
Edit boxes.
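
Here is a rough Qt-flavoured sketch of those six steps (WarpDial and its
members are invented names; only the Qt calls are real API, and a production
version would need per-toolkit care around the synthetic move events the
warp itself generates):

#include <QApplication>
#include <QCursor>
#include <QGuiApplication>
#include <QMouseEvent>
#include <QScreen>
#include <QWidget>

// Sketch of "continuous borderless mouse movement" for a value widget.
class WarpDial : public QWidget
{
public:
    explicit WarpDial(QWidget* parent = nullptr) : QWidget(parent) {}
    double value() const { return value_; }

protected:
    void mousePressEvent(QMouseEvent* e) override
    {
        setCursor(Qt::BlankCursor);           // 1: hide the pointer
        warpToCentre();                       // 2: park it mid-screen
        dragging_ = true;
        e->accept();
    }

    void mouseMoveEvent(QMouseEvent* e) override
    {
        if (!dragging_) return;
        const QPoint delta = QCursor::pos() - screenCentre();  // 3: increment
        value_ -= delta.y() * 0.01;           // up = increase (example scaling)
        if (!delta.isNull())
            warpToCentre();                   // 4: snap back to the centre
        update();                             // 5: repeat until release
        e->accept();
    }

    void mouseReleaseEvent(QMouseEvent* e) override
    {
        dragging_ = false;
        unsetCursor();                        // 6: show the pointer again
        e->accept();
    }

private:
    static QPoint screenCentre()
    {
        return QGuiApplication::primaryScreen()->geometry().center();
    }
    static void warpToCentre() { QCursor::setPos(screenCentre()); }

    bool   dragging_ = false;
    double value_ = 0.0;
};

int main(int argc, char** argv)
{
    QApplication app(argc, argv);
    WarpDial dial;
    dial.resize(80, 80);
    dial.show();
    return app.exec();
}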

I've used this technique since the mid-90s on Windows, and now on Linux.
It was harder to do on Windows than on Linux.
I know a Windows 3D app of the type Thorsten mentioned, and I remember
 when that edge-crossing thing was added.
I shook my head and laughed out loud to my friend, who had paid money for it
 and was showing me the new version. It seemed so goofy to me that I made 
 this instead and added it to my own 3D apps!

I call it "Continuous borderless mouse movement" and my control class 
 names are like "RollEdit" RollCanvas" etc. because you can roll, roll, roll, 
 without ever lifting the mouse button and re-positioning the mouse.
Coupled with mouse acceleration and accelerator modifiers,
 you can roll a floating point edit box value with fine resolution, 
 or with a few more rolls plus acceleration, be into the thousands.
Great for example for 3D Z-axis zooming in/out quickly, continuously.

See examples in MusE, the continuous Pan and Zoom.

If I recall correctly, see also LMMS, and the audio editor Sweep.

I have yet to try it, but I believe the technique is good when you are forced 
 to deal with short sliders, because you are not forced to map one 
 value change per mouse-pixel of movement, nor to accept a mouse pointer 
 that goes way out of positional sync with the actual slider thumb 
 rectangle just to get good resolution/range stuffed into the short slider.

We discussed this in MusE recently, about our mixer, and I said if we 
 want short horizontal sliders stuffed into a thin vertical mixer strip,
 then it will be really crucial that we get these sliders right.
So I believed that this technique was really a (the only?) practical
 way to do it. Hide the mouse. 

What do you think?


About *circular* motion knobs and dials:
The awesome thing about them is, as someone mentioned, 
 if you need more resolution you simply move out to a higher radius.

But I've always believed the 'hidden borderless continuous mouse' 
 technique cannot be used here.
Because the user needs to know that they are drawing an arc, and what 
 radius they are at.
Whether that is done by showing the mouse, or hiding the mouse but 
 showing say a 'radius line', if you are near the screen edge you are stuck.
Maybe linear-motion knobs really are best used with this technique.

So I just realized now that the above mentioned border *crossing* 
 technique may help with these circular-motion knobs.
Let the mouse cross the border. Not a big problem for the user,
 they are mostly focused on the actual knob and its value.
They can move to (almost) any radius, and sure it's a little awkward
 near screen edges (like Asteroids - never did well on that one).
But hey what else can you do...

Now, a bit far-fetched but, what about showing a sort of temporary 
 

Re: [LAD] GuitarSynth

2015-04-24 Thread Tim E. Real
On April 25, 2015 02:37:34 AM Albert Graef wrote:
> Hi Gerald,
> 
> cool project, I'm looking forward to give it a try. :)
> 
> On Thu, Apr 23, 2015 at 7:24 AM, Gerald  wrote:
> > definately, but that comes with the cost of extra hardware (pickup,
> > 6chan soundcard). I would build that into GuitarSynth if I had that gear.
> > But I'm also rather interested  multipitch out of one signal. It's just
> > more convenient too
> 
> Polyphonic pitch detection is much more involved and requires more advanced
> algorithms which are computationally intensive and thus hard to perform in
> real-time.
> 
> Commercial closed-source software like Melodyne can do this, at least in
> off-line processing. AFAICT, the latest Melodyne versions also do some
> real-time processing, but I haven't used Melodyne for some time and so I
> don't know how well that works.
> 
> I'm not sure either whether there are any good open-source codes for this
> available yet, maybe others can provide corresponding links. But here are
> some relevant answers from Stackoverflow and Stackexchange:
> 
> http://stackoverflow.com/questions/9613768/multiple-pitch-detection-fft-or-other/9626849#9626849
> 
> http://dsp.stackexchange.com/questions/11433/polyphonic-detection-mulit-pitch-detection-chord-recognition
> 
> Also, here's an interesting recent DAFx paper on doing polyphonic pitch
> detection using autocorrelation:
> 
> http://www.dafx14.fau.de/papers/dafx14_sebastian_kraft_polyphonic_pitch_detectio.pdf
> 
> And then there's the work of Anssi Klapuri and others at Tampere University:
> 
> http://www.cs.tut.fi/~klap/iiro/
> 
> Also, there are algorithms for doing spectrum estimation such as filter
> diagonalization methods (FDM) and the classical Prony algorithm, but due to
> their complexity these probably aren't well-suited for real-time processing
> either (the Prony algorithm also suffers from numerical instabilities
> IIRC), and you still have to do the partitioning of the overtone series
> afterwards.
> 
> There's surely more, but that's what I could find with a quick Google
> search or remember from the top of my head.
> 
> Hope this helps,
> Albert

If it provides inspiration, I was doing this in the late 90s on Windows,
 in good ol' Borland C++ Builder.

I simply grabbed an open-source FFT library, and the rest was easy:
an audio-to-MIDI polyphonic pitch converter.

It is a real riot! Super fun to try.

I was able to play polyphonic guitar chords and have them come out 
 as MIDI, for example piano, with velocity detection and anti-retriggering.

There is one drawback: latency.
For the FFT to distinguish among notes it needs a certain number of 
 samples in a block, and more samples per block for lower notes.
On guitar it was just sorta kinda usable, but fun.
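
To put rough numbers on that trade-off (a back-of-envelope sketch; it ignores
windowing and bin interpolation, which change the real requirement):

#include <cmath>
#include <cstddef>
#include <cstdio>

// Back-of-envelope: FFT length needed so that adjacent semitones around a
// given fundamental fall into different bins, and the latency that implies.
int main()
{
    const double fs = 44100.0;            // sample rate in Hz
    const double f0 = 82.41;              // low E on guitar (E2)
    const double f1 = f0 * std::pow(2.0, 1.0 / 12.0);   // next semitone (F2)

    const double binSpacingNeeded = f1 - f0;             // about 4.9 Hz
    const double minSamples = fs / binSpacingNeeded;      // about 9000 samples

    // Round up to the next power of two for a radix-2 FFT.
    std::size_t n = 1;
    while (n < static_cast<std::size_t>(std::ceil(minSamples)))
        n <<= 1;                                          // -> 16384

    std::printf("bin spacing needed: %.2f Hz\n", binSpacingNeeded);
    std::printf("FFT length:         %zu samples\n", n);
    std::printf("block latency:      %.0f ms\n", 1000.0 * n / fs);  // ~372 ms
    return 0;
}

At 44.1 kHz that works out to roughly a third of a second just to collect the
block for the lowest guitar notes, before any processing, which matches the
"just sorta kinda usable" feel.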

To reduce latency I even tried putting the guitar through a standard 
 time-domain pitch shifter (up one octave) and then into the detector.
Not bad, so so.


Question: I tried a demo product which did polyphony, with latency similar 
 to my app's, and which claimed to have a full version with 
 near-zero latency.

Is this actually possible?

And with properly timed chord notes (not high notes sounding 
 before low notes, i.e. lower latency for higher notes)?   


Related question:
Albert (and list), I am desperately searching for a pitch-shifting 
 phase-vocoder audio plugin with lower latency than FFT-based ones.

I have read about wavelets for a long time. 
They are said to be better for this than FFT.
Hard to find real working examples except in commercial software.

In PD, there is a wavelet pitch shifter (I think in PD extended),
 but it is *broken*.
I emailed the list but got no reply except to contact the author,
 which I haven't done yet.

Are wavelets good for polyphonic pitch detection too?

Can anyone shed more light on them in this context?


Thanks for the links Albert, bookmarked and will check them out.
Tim.




Re: [LAD] User eXperience in Linux Audio

2015-04-24 Thread Len Ovens

On April 24, 2015 10:04:36 AM Thorsten Wilms wrote:

With pointer-based usage, you can allow the pointer to go beyond the
edge. Some 3D applications will have the pointer appear on the other
side, as if it traveled through a portal. But with touch, you are out of
luck: you have to move the active area and allow the finger to be repositioned.


Another idea for a touch screen (a sketch of the logic follows below):

1. Touch the control with finger one.
2. Put finger two some distance away.
3. Move finger two towards the control to decrease the value, or farther 
away to increase it.
4. Lift both fingers. I am not sure if lift order would matter (it 
shouldn't).


I do not know how long it would take to "learn" this so that it becomes 
natural to use.
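
A minimal sketch of that gesture logic (plain C++, toolkit-agnostic; the
TouchPoint struct and TwoFingerControl class are invented for illustration):

#include <cmath>
#include <iostream>

// Hypothetical touch point in screen coordinates.
struct TouchPoint { double x; double y; };

// Finger one anchors the control; the *change* in distance between the
// control and finger two drives the value: closer decreases, farther increases.
class TwoFingerControl {
public:
    explicit TwoFingerControl(double value, double unitsPerPixel = 0.005)
        : value_(value), scale_(unitsPerPixel) {}

    void fingerOneDown(const TouchPoint& p) { anchor_ = p; active_ = true; }

    void fingerTwoDown(const TouchPoint& p) { lastDist_ = dist(p); }

    void fingerTwoMove(const TouchPoint& p) {
        if (!active_) return;
        const double d = dist(p);
        value_ += (d - lastDist_) * scale_;   // moving away increases the value
        lastDist_ = d;
    }

    void fingersUp() { active_ = false; }     // lift order doesn't matter here

    double value() const { return value_; }

private:
    double dist(const TouchPoint& p) const {
        return std::hypot(p.x - anchor_.x, p.y - anchor_.y);
    }
    TouchPoint anchor_ { 0.0, 0.0 };
    double value_, scale_, lastDist_ = 0.0;
    bool active_ = false;
};

int main() {
    TwoFingerControl volume(0.5);
    volume.fingerOneDown({100, 100});    // finger one on the control
    volume.fingerTwoDown({200, 100});    // finger two 100 px away
    volume.fingerTwoMove({260, 100});    // 60 px farther out: value rises
    std::cout << volume.value() << "\n"; // 0.5 + 60 * 0.005 = 0.8
    return 0;
}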


--
Len Ovens
www.ovenwerks.net



Re: [LAD] User eXperience in Linux Audio

2015-04-24 Thread Tim E. Real
On April 24, 2015 10:18:57 PM Tim E. Real wrote:
> 6: Now turn the mouse pointer back on.  Done.

Ehm, I missed one of the best parts:
6: Now return the mouse pointer to where it was when originally clicked.
7: Now turn the mouse pointer back on.  Done.

Although, I'm realizing now that when using this 'mouse hiding' technique, 
 it doesn't really matter whether we use crossing borders or the centre-screen 
 mouse, as long as it returns to where it was when clicked (step 6).

Hmm, I'm realizing now that hidden cross-border might actually be simpler 
 and better than my hidden centre-screen thing:
 
My centre-screen technique is in fact limited to half a screen height of 
 movement between warps, i.e. it DOES have screen borders that could be hit.
With hidden cross-border we roll for a full screen, after another, 
 after another...

T.
