[LAD] Pipewire

2024-08-13 Thread Fons Adriaensen
Hello all,

I spent a lot of time reading whatever docs I could find for Pipewire,
and discussing things with some users, only to get more and more
frustrated.

Below is a description of the configuration I'd want. If anyone knows
how to do this (it shouldn't be that difficult) that person will
receive my eternal admiration and gratitude.


1. Jack2 and some clients are started manually after I login,
   and will be running all the time.

2. Currently the ALSA Jack plugin is used to route audio from
   web browsers etc. to Jack. PW may take over this role but
   that is not a strict requirement.

3. PW will be started manually when required, and I don't expect
   that will happen very often. It may remain running when no longer
   needed but shouldn't interfere. It will be used to connect apps
   to Jack as in (2), or those that don't even support ALSA, or
   maybe to route audio from Jack to Bluetooth etc.

4. All Jack ports created by PW should be permanent and exist
   as soon as PW is started, so they can be manually connected
   and remain connected even when not in active use. 

5. PW should never ever access the sound card used by Jack,
   not even if accidentally started when Jack is not running.
   It must not force Jack to use dbus in order to get access
   to that card. It may manage other sound cards, but preferably
   only those explicitly listed.

6. PW must never ever interfere with Jack in any way - making
   connections, trying to change the period size, etc. Its only
   role is to be a well-behaved Jack client.

7. I do not expect anything 'automatic' to happen when things
   are plugged in or out.

8. The PW configuration should be done in such a way that
   it can't be modified by drop-in files from the system package
   manager. All configuration should be manual and explicit, and
   easy to verify without having to scan a myriad of files and/or
   directories and trying to understand how they interact. This
   is just basic security.
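Point 5 could in principle be expressed as a session-manager device rule rather than a pipewire.conf tweak. The following is a sketch only, assuming WirePlumber 0.5's SPA-JSON fragment format; the file path, key names, and example card name are assumptions (and note it still relies on a drop-in directory, which point 8 objects to):

```
# Assumed path: ~/.config/wireplumber/wireplumber.conf.d/99-restrict-alsa.conf
monitor.alsa.rules = [
  {
    # Disable every ALSA card by default ("~" marks a regex match)...
    matches = [ { device.name = "~alsa_card.*" } ]
    actions = { update-props = { device.disabled = true } }
  }
  {
    # ...then re-enable only an explicitly listed card (example name).
    matches = [ { device.name = "alsa_card.usb-Other_Card-00" } ]
    actions = { update-props = { device.disabled = false } }
  }
]
```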

  
Ciao,

-- 
FA

___
Linux-audio-dev mailing list -- linux-audio-dev@lists.linuxaudio.org
To unsubscribe send an email to linux-audio-dev-le...@lists.linuxaudio.org


Re: [LAD] Pipewire help?

2022-02-03 Thread John Murphy
Solved. It was me! I was making links with pw-link -P ... but should have
used -L which I'm sure I tried, but must have failed for some other reason.

Quoting Wim Taymans (whose time I'm sad to have wasted):

"-L makes a lingering link, that is one that stays alive after pw-link
quits and is likely what you want. -P makes a link that is not activated
(for keeping things suspended) you probably don't want that."
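That distinction can be captured in a small shell helper (a sketch; the example port names are hypothetical and taken from elsewhere in this thread):

```shell
# -L: "lingering" link, stays alive after pw-link exits (usually wanted).
# -P: passive link, created but not activated (kept suspended).
lingering_link() {
  # usage: lingering_link <output-port> <input-port>
  pw-link -L "$1" "$2"
}
# Example:
#   lingering_link jack-play:out_1 \
#     alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
```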

-- 
John.
___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
https://lists.linuxaudio.org/listinfo/linux-audio-dev


Re: [LAD] Pipewire help?

2022-02-01 Thread Jonathan E. Brickman
I had heard that the Pipewire people are working hard on video 
integration, it appears that you found it! :-)


J.E.B.

On 2/1/22 1:37 AM, John Murphy wrote:

> Hmm. I went to check something online and there was an embedded youtube
> video to see. I clicked on it and thought 'I know that tune!' :-) I had
> forgotten that I'd left jack-play (not) playing and linked to the playback.
>
> Paused the video and the music stopped. Un-pause and off it went again.
>
> All wires shown dotted in Helvum, while paused.
>
> Tried starting jack-play again with -t (jack transport aware). It said:
>
> jack-play: a48k.wav
> jack-play: seek request failed, 385225728 (That's beyond Eof)
>
> jack_transport controls work while the video is playing, but do nothing
> while the video is paused.




Re: [LAD] Pipewire help?

2022-01-31 Thread John Murphy
Hmm. I went to check something online and there was an embedded youtube
video to see. I clicked on it and thought 'I know that tune!' :-) I had
forgotten that I'd left jack-play (not) playing and linked to the playback.

Paused the video and the music stopped. Un-pause and off it went again.

All wires shown dotted in Helvum, while paused.

Tried starting jack-play again with -t (jack transport aware). It said:

jack-play: a48k.wav
jack-play: seek request failed, 385225728 (That's beyond Eof)

jack_transport controls work while the video is playing, but do nothing
while the video is paused.

-- 
John.

 


Re: [LAD] Pipewire help?

2022-01-31 Thread John Murphy
On Sun, 30 Jan 2022 12:34:41 + John Murphy wrote:

> So; I can carry on using QjackCtl and its Patchbay, or work via a meterbridge,

Even Sox plays (and gets its wires made):

$ play -n synth sine 440

The linkage looks like:

$ pw-link -l
alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
  |<- ALSA plug-in [sox]:output_FL
alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX1
  |<- ALSA plug-in [sox]:output_FR
ALSA plug-in [sox]:output_FL
  |-> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
ALSA plug-in [sox]:output_FR
  |-> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX1

Which is hardly different to the links I make manually for jack-play:

$ jack-play -u a48k.wav

$ pw-link -l
alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
  |<- jack-play:out_1
alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX1
  |<- jack-play:out_2
jack-play:out_1
  |-> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
jack-play:out_2
  |-> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX1

But I do see that the connecting links show up dotted in Helvum.

pw-play works, of course, but has no jack_transport awareness option.

Could someone with Pipewire just test that jack-play doesn't play for them
either (without going via a meterbridge) before I post it as a bug?

-- 
John.


Re: [LAD] Pipewire help?

2022-01-31 Thread Jonathan E. Brickman

> > gives the visual very nicely, and then I wrote pw-loadwires,
> > pw-savewires, and pw-dewire, to be found here:
> >
> > https://github.com/ponderworthy/the-box-of-no-return-3
> >
> > pw-loadwires and pw-savewires will save wire-sets in CSV files.
> > pw-dewire is a convenient way to remove all wires at once.  Thus far,
> > very reliable, fast, and convenient.
>
> All sounds like exactly what I need and your BNR sounds like a lot of fun.
>
> Thanks again,


You are most welcome, John!  And it is :-)

J.E.B.


Re: [LAD] Pipewire help?

2022-01-31 Thread John Murphy
On Mon, 31 Jan 2022 06:49:51 -0600 Jonathan E. Brickman wrote:

> > So; I can carry on using QjackCtl and its Patchbay, or work via a 
> > meterbridge,
> > or?  
> 
> I have been working on a Pipewire-based revision to my BNR 
> (https://lsn.ponderworthy.com) for some time; I have to have a patchbay 
> of some sort because that thing has a whole lot of connections :-)
> 
> Helvum:
> 
> https://gitlab.freedesktop.org/pipewire/helvum

Such beautiful curves! Thank you. I wondered what all the speech-
dispatchers were, until I remembered FireFox is running. Had to add
'--user' to the 'flatpak-builder --install' command:

flatpak-builder --install --user flatpak-build/ build-aux/org.pipewire.Helvum.json

Or:
error: Flatpak system operation ConfigureRemote not allowed for user
Install failed: Child process exited with code 1

> gives the visual very nicely, and then I wrote pw-loadwires, 
> pw-savewires, and pw-dewire, to be found here:
> 
> https://github.com/ponderworthy/the-box-of-no-return-3
> 
> pw-loadwires and pw-savewires will save wire-sets in CSV files. 
> pw-dewire is a convenient way to remove all wires at once.  Thus far, 
> very reliable, fast, and convenient.

All sounds like exactly what I need and your BNR sounds like a lot of fun.

Thanks again,

-- 
John.


Re: [LAD] Pipewire help?

2022-01-31 Thread Jonathan E. Brickman



> So; I can carry on using QjackCtl and its Patchbay, or work via a meterbridge,
> or?


I have been working on a Pipewire-based revision to my BNR 
(https://lsn.ponderworthy.com) for some time; I have to have a patchbay 
of some sort because that thing has a whole lot of connections :-)


Helvum:

https://gitlab.freedesktop.org/pipewire/helvum

gives the visual very nicely, and then I wrote pw-loadwires, 
pw-savewires, and pw-dewire, to be found here:


https://github.com/ponderworthy/the-box-of-no-return-3

pw-loadwires and pw-savewires will save wire-sets in CSV files. 
pw-dewire is a convenient way to remove all wires at once.  Thus far, 
very reliable, fast, and convenient.
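The save side can be sketched in a few lines of shell (this is not the actual pw-savewires code from the repo): parse `pw-link -l` output, keep only the `|->` direction so each link appears once, and print source,destination pairs. Port names below are the ones seen elsewhere in this thread.

```shell
# Sketch of a pw-savewires-style dump: turn `pw-link -l` output into
# "source,destination" CSV lines, using only the outgoing "|->" lines.
savewires() {
  awk '
    /^[^ ]/ { port = $0; next }        # unindented line: a port name
    /\|->/  { sub(/^ *\|-> */, "")     # indented "|->" line: outgoing link
              print port "," $0 }
  '
}
# Usage:   pw-link -l | savewires > wires.csv
# Restore: while IFS=, read s d; do pw-link -L "$s" "$d"; done < wires.csv
```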


J.E.B.


Re: [LAD] Pipewire help?

2022-01-30 Thread John Murphy
On Sat, 22 Jan 2022 13:57:45 -0800 (PST) Len Ovens wrote:

> I am not sure why PW, in its JACK compatibility, does not allow one of the 
> devices to be chosen as master and called system:* for compatibility with 
> all the JACK software out there... but it is what it is. I am sure someone 
> will come up with a configuring app(let) that does this better for 
> professional audio use. To be honest, I am not really sure what optimal 
> would be.
> 
And yet, it seems that 'system:*' does exist in some form and plays a role.

I've been trying to do without QjackCtl Patchbay and I've deleted the
virtual-sink.conf I made, so Pipewire is back to defaults. I prefer to
not hear audio while testing, so I use a meterbridge.

$ pw-link -i
Midi-Bridge:Midi Through:(playback_0) Midi Through Port-0
Midi-Bridge:M-16DX:(playback_0) M-16DX MIDI 1
alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX1

but:
$ meterbridge -t vu -n myvu alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
Registering as myvu
Can't find port 'alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0'

whereas:
$ meterbridge -t vu -n myvu system:playback_1
Registering as myvu
(and it creates a link)

$ pw-link -l
alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
  |<- myvu:monitor_1
myvu:monitor_1
  |-> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0

I can then play a file with:
$ jack-play -u a48k.wav
jack-play: a48k.wav

make the link and watch (and hear) it play:
$ pw-link -P jack-play:out_1 myvu:meter_1

$ pw-link -l
alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
  |<- myvu:monitor_1
myvu:monitor_1
  |-> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
myvu:meter_1
  |<- jack-play:out_1
jack-play:out_1
  |-> myvu:meter_1

Great! But then, I can't get it to work without the meterbridge
and I wonder if I need a 'bridge' as well as a Pipewire link.

This is what happens without the meterbridge:

$ jack-play -u a48k.wav
jack-play: a48k.wav

$ pw-link -P jack-play:out_1 alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0

$ pw-link -l
alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
  |<- jack-play:out_1
jack-play:out_1
  |-> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0

I think that's the same without going via the meterbridge, but no sound.

If I take meterbridge's insistence on linking to 'system:playback_1' as a hint:

$ pw-link -P jack-play:out_1 system:playback_1
failed to link ports: No such file or directory

So; I can carry on using QjackCtl and its Patchbay, or work via a meterbridge,
or?

-- 
John. Needing help again.


Re: [LAD] Pipewire help?

2022-01-24 Thread John Murphy
> qjackctl: error while loading shared libraries: libQt6Widgets.so.6:
> cannot open shared object file: No such file or directory
> 
> and yet:
> 
> $ locate libQt6Widgets.so.6
> /home/john/Qt/6.2.2/gcc_64/lib/libQt6Widgets.so.6
> /home/john/Qt/6.2.2/gcc_64/lib/libQt6Widgets.so.6.2.2
[...]

All's well now. Made a file called my-qt-qjackctl.conf with contents:

# added Tue 25 Jan 2022 for qjackctl 0.9.6 on Qt 6.2.2
/home/john/Qt/6.2.2/gcc_64/lib/

Put it in /etc/ld.so.conf.d and ran sudo /sbin/ldconfig

It works! (And no problem with lines in the Connect frame.)
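The steps above can be wrapped in a small script (a sketch; the Qt path is the one from this post, and writing under /etc/ld.so.conf.d needs root):

```shell
# Register a custom Qt library directory with the dynamic linker cache.
add_qt_to_ldconfig() {
  qt_lib=${1:-/home/john/Qt/6.2.2/gcc_64/lib/}
  printf '%s\n' "$qt_lib" |
    sudo tee /etc/ld.so.conf.d/my-qt-qjackctl.conf >/dev/null
  sudo /sbin/ldconfig
}
# Root-free alternative for a one-off run:
#   LD_LIBRARY_PATH=/home/john/Qt/6.2.2/gcc_64/lib qjackctl
```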

-- 
John.


Re: [LAD] Pipewire help?

2022-01-24 Thread John Murphy
On Sun, 23 Jan 2022 17:24:33 + Rui Nuno Capela wrote:
> On 1/23/22 13:42, Felix Homann wrote:
> > On Sun, 23 Jan 2022 at 01:42, John Murphy <rosegarde...@freeode.co.uk> wrote:
> > 
> > I don't see a connection line in QJackCtl's Connections, but it's there.
> > Maybe I'll try some variations on the audio.position settings.
> > 
> > 
> > Please, try to delete ~/.config/rncbc.org/QjackCtl.conf . That solved
> > the "connections invisible for QJackCtl" issue for me after months, see
> > https://gitlab.freedesktop.org/pipewire/pipewire/-/issues/1282#note_1074623 .
> > (I actually deleted the whole ~/.config/rncbc.org/ directory but I don't
> > think there was anything else in it than the QJackCtl.conf file)
> 
> some users out there may have something else important under 
> ~/.config/rncbc.org/..., which is the place all qstuff stores their 
> configuration and user preferences files; even the new qpwgraph [1] 
> stores its things there ;)
> 
> so please, try to remove ~/.config/rncbc.org/QjackCtl.conf and only that!

That's what I did, but I remembered I'm using the old distro version of
QjackCtl and should install the latest. Qt Everywhere, except...

I have Qt 6.2.2 at /home/john/Qt/6.2.2/gcc_64/ and added there to the
path for the build and install, which went smoothly, but running, it
complains:

qjackctl: error while loading shared libraries: libQt6Widgets.so.6:
cannot open shared object file: No such file or directory

and yet:

$ locate libQt6Widgets.so.6
/home/john/Qt/6.2.2/gcc_64/lib/libQt6Widgets.so.6
/home/john/Qt/6.2.2/gcc_64/lib/libQt6Widgets.so.6.2.2
/home/john/Qt/Tools/QtCreator/lib/Qt/lib/libQt6Widgets.so.6
/home/john/Qt/Tools/QtCreator/lib/Qt/lib/libQt6Widgets.so.6.2.2
/home/john/Qt/Tools/QtDesignStudio/lib/Qt/lib/libQt6Widgets.so.6
/home/john/Qt/Tools/QtDesignStudio/lib/Qt/lib/libQt6Widgets.so.6.2.2
/home/john/Qt/Tools/QtDesignStudio/qt6_design_studio_reduced_version/lib/libQt6Widgets.so.6
/home/john/Qt/Tools/QtDesignStudio/qt6_design_studio_reduced_version/lib/libQt6Widgets.so.6.2.2
/usr/local/Qt6/lib/libQt6Widgets.so.6
/usr/local/Qt6/lib/libQt6Widgets.so.6.2.0

I tried 'sudo /sbin/ldconfig -v' but no change.

Thanks, for your programming, and in advance for help with this.

-- 
John. [ My post to the list yesterday was blocked because "The sending
IP is listed on https://spamrl.com as a source of spam." ]


Re: [LAD] Pipewire help?

2022-01-23 Thread Felix Homann
On Sun, 23 Jan 2022 at 01:42, John Murphy <rosegarde...@freeode.co.uk> wrote:

> I don't see a connection line in QJackCtl's Connections, but it's there.
> Maybe I'll try some variations on the audio.position settings.
>

Please, try to delete ~/.config/rncbc.org/QjackCtl.conf . That solved the
"connections invisible for QJackCtl" issue for me after months, see
https://gitlab.freedesktop.org/pipewire/pipewire/-/issues/1282#note_1074623
. (I actually deleted the whole ~/.config/rncbc.org/ directory but I don't
think there was anything else in it than the QJackCtl.conf file)

Kind regards,
Felix


Re: [LAD] Pipewire help?

2022-01-22 Thread John Murphy
On Sat, 22 Jan 2022 13:57:45 -0800 (PST) Len Ovens wrote:

> On Sat, 22 Jan 2022, John Murphy wrote:
> 
> > My QJackCtl Patchbay doesn't work any more and it's obvious there are
> > new ways to get similar functionality with WirePlumber, but a little
> > example would help. I seem to want to pipe the output of pw-link -l
> > somewhere (pw-link -l | wireplumber --make_it_so).
> >
> > Need to always connect jack-play this way:
> >
> > $ pw-link -l
> > alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
> >  |<- jack-play:out_1
> > alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX1
> >  |<- jack-play:out_2
> > jack-play:out_1
> >  |-> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
> > jack-play:out_2
> >  |-> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX1  
> 
> I think you can (via PW setup) change the name of your USB to 
> system:playback_1 (and 2) and then Qjackctl's patchbay might just work.
> 
> https://gitlab.freedesktop.org/pipewire/pipewire/-/wikis/Virtual-Devices#behringer-umc404hd-speakersheadphones-virtual-sinks
> 
> Change node.name to system and audio.position = [ FL FR ] to [ 1 2 ] etc.
> 
> I am not sure why PW, in its JACK compatibility, does not allow one of the 
> devices to be chosen as master and called system:* for compatibility with 
> all the JACK software out there... but it is what it is. I am sure someone 
> will come up with a configuring app(let) that does this better for 
> professional audio use. To be honest, I am not really sure what optimal 
> would be.

Thanks Len, it works fine now. I copied /etc/pipewire/client-rt.conf to
/etc/pipewire/virtual-sink.conf and added, in the modules section:

{   name = libpipewire-module-loopback
args = {
node.name = "system"
node.description = "the system"
capture.props = {
media.class = "Audio/Sink"
audio.position = [ FL FR ]
}
playback.props = {
audio.position = [ AUX0 AUX1 ]
node.target = "alsa_output.usb-EDIROL_M-16DX-00.pro-output-0"
stream.dont-remix = true
node.passive = true
}
}
}

I don't see a connection line in QJackCtl's Connections, but it's there.
Maybe I'll try some variations on the audio.position settings.

It's well documented, although over my head, but I needed to get that
working quickly so that my Qt program, which uses jack-play via QProcess, works.

-- 
John.


Re: [LAD] Pipewire help?

2022-01-22 Thread Len Ovens

On Sat, 22 Jan 2022, John Murphy wrote:


> My QJackCtl Patchbay doesn't work any more and it's obvious there are
> new ways to get similar functionality with WirePlumber, but a little
> example would help. I seem to want to pipe the output of pw-link -l
> somewhere (pw-link -l | wireplumber --make_it_so).
>
> Need to always connect jack-play this way:
>
> $ pw-link -l
> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
>  |<- jack-play:out_1
> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX1
>  |<- jack-play:out_2
> jack-play:out_1
>  |-> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
> jack-play:out_2
>  |-> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX1


I think you can (via PW setup) change the name of your USB to 
system:playback_1 (and 2) and then Qjackctl's patchbay might just work.


https://gitlab.freedesktop.org/pipewire/pipewire/-/wikis/Virtual-Devices#behringer-umc404hd-speakersheadphones-virtual-sinks

Change node.name to system and audio.position = [ FL FR ] to [ 1 2 ] etc.

I am not sure why PW, in its JACK compatibility, does not allow one of the 
devices to be chosen as master and called system:* for compatibility with 
all the JACK software out there... but it is what it is. I am sure someone 
will come up with a configuring app(let) that does this better for 
professional audio use. To be honest, I am not really sure what optimal 
would be.


--
Len Ovens
www.ovenwerks.net


[LAD] Pipewire help?

2022-01-22 Thread John Murphy
My QJackCtl Patchbay doesn't work any more and it's obvious there are
new ways to get similar functionality with WirePlumber, but a little
example would help. I seem to want to pipe the output of pw-link -l
somewhere (pw-link -l | wireplumber --make_it_so).

Need to always connect jack-play this way:

$ pw-link -l
alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
  |<- jack-play:out_1
alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX1
  |<- jack-play:out_2
jack-play:out_1
  |-> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX0
jack-play:out_2
  |-> alsa_output.usb-EDIROL_M-16DX-00.pro-output-0:playback_AUX1
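The wanted state above can be scripted (a sketch; it uses `pw-link -L`, which makes links that persist after pw-link exits, as noted elsewhere in this thread; device and port names are the ones from this post):

```shell
# Recreate both jack-play -> M-16DX links in one go.
connect_jack_play() {
  dev=alsa_output.usb-EDIROL_M-16DX-00.pro-output-0
  pw-link -L jack-play:out_1 "$dev:playback_AUX0"
  pw-link -L jack-play:out_2 "$dev:playback_AUX1"
}
# Run after jack-play has registered its ports.
```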

Thanks.

-- 
It gets harder to learn new things as we get old.


Re: [LAD] pipewire

2022-01-21 Thread John Murphy
On Fri, 21 Jan 2022 09:24:19 -0500 Kevin Cole wrote:

> On Fri, Jan 21, 2022 at 4:20 AM John Murphy wrote:
> 
> > I ended up using the 'PipeWire & WirePlumber & blueman-git PPA for Ubuntu
> > (>= 18.04)' after many attempts at the meson build.
> 
> 
> That's what got me going as well -- or at least part of that. In an earlier
> post in this thread, I summarized my steps. But I didn't do the WirePlumber
> thing. So I went searching, based on your comment and found instructions at:
> 
> https://pipewire-debian.github.io/pipewire-debian/
> 
> Haven't tried those instructions yet. I'm checking them out now. Was that
> where you found the "how-to"?  

Yes. The same as I tested with last time, but jack_transport didn't work
for me back then. I expect you have it working by now :)

---

I meant Timeshift not Timeline, in my previous post...

-- 
John.


Re: [LAD] pipewire

2022-01-21 Thread Kevin Cole
On Fri, Jan 21, 2022 at 4:20 AM John Murphy wrote:

> I ended up using the 'PipeWire & WirePlumber & blueman-git PPA for Ubuntu
> (>= 18.04)' after many attempts at the meson build.


That's what got me going as well -- or at least part of that. In an earlier
post in this thread, I summarized my steps. But I didn't do the WirePlumber
thing. So I went searching, based on your comment and found instructions at:

https://pipewire-debian.github.io/pipewire-debian/

Haven't tried those instructions yet. I'm checking them out now. Was that
where you found the "how-to"?


Re: [LAD] pipewire

2022-01-21 Thread John Murphy
Working wonderfully well now, for me, as far as I've tested on a
hardware limited PC (Intel NUC -> Topping TP30). Amazing to be able to
run VLC into Alsa, PulseAudio or Jack, without changing anything else.
I even tried the OpenBSD output. Firefox audio always just works now.

Great to think that the quite tricky configuration of cloop/ploop etc.,
will soon be 'a thing of the past'.

Testing on virtual machines was no good although the Intel card worked.

I ended up using the 'PipeWire & WirePlumber & blueman-git PPA for Ubuntu
(>= 18.04)' after many attempts at the meson build. May be just down to a
CMake problem. Quoting meson-log.txt below.

I've found that Mint's Timeline works well and I'll be creating a snapshot,
before updating via the ppa.

My sincere thanks to all who have contributed to PipeWire (and all
Linux developers, of course). It's so good these days...

-- 
John. 


-

Part of meson-log.txt about CMake:

CMake binary for 1 is not cached
CMake binary missing from cross or native file, or env var undefined.
Trying a default CMake fallback at cmake
Found CMake: /usr/bin/cmake (3.16.3)
Extracting basic cmake information
Try CMake generator: auto
Calling CMake (['/usr/bin/cmake']) in 
/home/john/pipewire-0.3.43/builddir/meson-private/cmake_ldacBT-enc with:
  - "--trace"
  - "--trace-expand"
  - "--no-warn-unused-cli"
  - "--trace-redirect=cmake_trace.txt"
  - 
"-DCMAKE_TOOLCHAIN_FILE=/home/john/pipewire-0.3.43/builddir/meson-private/cmake_ldacBT-enc/CMakeMesonToolchainFile.cmake"
  - "."
  -- Module search paths:['/', '/opt', '/usr', '/usr/local']
  -- CMake root: /usr/share/cmake-3.16
  -- CMake architectures:['x86_64-linux-gnu']
  -- CMake lib search paths: ['lib', 'lib32', 'lib64', 'libx32', 'share', 
'lib/x86_64-linux-gnu']
Preliminary CMake check failed. Aborting.
Run-time dependency ldacbt-enc found: NO (tried pkgconfig and cmake)
Pkg-config binary for 1 is cached.
Determining dependency 'ldacBT-abr' with pkg-config executable 
'/usr/bin/pkg-config'
env[PKG_CONFIG_PATH]: 
Called `/usr/bin/pkg-config --modversion ldacBT-abr` -> 1

CMake binary for 1 is cached.
Preliminary CMake check failed. Aborting.
Run-time dependency ldacbt-abr found: NO (tried pkgconfig and cmake)
Pkg-config binary for 1 is cached.
Determining dependency 'libfreeaptx' with pkg-config executable 
'/usr/bin/pkg-config'


Re: [LAD] pipewire

2022-01-20 Thread Wim Taymans
On Thu, 20 Jan 2022 at 12:16, Fons Adriaensen  wrote:
>
> Hello, Wim,
>
> > Sorry, git for now. I just started to implement the last bits to make a
> > session manager optional.
>
> OK, I'll wait until this is available via Arch (don't want to mix up
> two potential problems, build/install and configure...)
>

I'll probably make a release next week.

>
> > All alsa devices are wrapped in an adapter. This contains:
> >
> >-> channelmix -> resample -> convert -> alsa-device
> >
> > channelmix and resample are disabled when the graph sample rate
> > and  device channels all match the adapter ports, you would configure
> > the same number of channels on the output as the alsa-device channels
> > and set the graph rate to something the hw supports.
>
> In the example config, node.param.Portconfig.format.channels is
> hardcoded. Is there a way to obtain the number of channels the
> device supports (to make sure the values match) ? ALSA does
> provide this info once the device is opened...
>
I've added an option to disable the channelmixer and volume updates to it now.

I've also added an option to automatically configure the device with
the max amount of channels probed from the device.

> What will happen if the configured number of channels does not
> match ? What sort of channelmix will I get ?
>

It depends on the channel layout, first channels that match will be
copied, then there are some simple heuristics to make channels out of
the other ones (front center from stereo, copy front to rear channels)
or mixing channels into other ones (rear channels into stereo). This
is nothing fancy but good enough for default consumer use.

> > The channelmix is mostly to support making a 5.1 sink that can downmix
> > to dolby or some other tricks.
>
> This can be a very devious thing. I remember an occasion some
> years ago (when I was in Parma) when some students were doing
> measurements in a rented anechoic room during an entire week.
> Later, when the measurements were processed, they discovered
> that all of them were useless because their (Windows) system
> had been trying to be clever and had applied gain changes and
> channel mixing without them being aware. Now work out the cost
> of renting an anechoic room for 60 hours. Plus, if they hadn't
> been students and 'free labour', the consultancy fees for the
> same period.
>
> For any serious work, there are things that need to be
> disabled without any chance of them ever be re-enabled by
> accident. The required result when something does not match
> is to fail and report, not to try and be clever and 'fix'
> things behind the user's back.
>
> The idea of having the daemon do the 'plumbing' and the
> session manager to define 'policies' is a very good one.
> But to take that to its logical consequence, the defaults
> for any optional processing (channel gains and mixing,
> resampling,...) should be off and disabled. If any other
> defaults make sense for the 'average user', they should
> be defaults defined by the session manager, not by the
> plumbing daemon.
>
I agree, I'm adding options to disable the extra automatic
things by default.

>
> There is one feature that would be very desirable and
> for which I would even be prepared to write an ad-hoc
> session manager if that is the only place it can be
> done: if a sound card becomes unavailable while in use,
> substitute a dummy device with the same sample rate,
> period, and number of channels, so the entire processing
> graph remains intact and running. Then, when the device
> becomes available again, allow the user to reconnect
> to it (this must NOT be automatic). This is to minimise
> 'down time' when someone accidentally pulls a cable
> during a concert or recording (a classical orchestra
> is orders of magnitude more expensive than an anechoic
> room).

Very doable with a little Lua script in WirePlumber...

>
> Ciao,
>
> --
> FA
>


Re: [LAD] pipewire

2022-01-20 Thread Fons Adriaensen
Hello, Wim,

> Sorry, git for now. I just started to implement the last bits to make a
> session manager optional.

OK, I'll wait until this is available via Arch (don't want to mix up
two potential problems, build/install and configure...)
 

> All alsa devices are wrapped in an adapter. This contains:
> 
>-> channelmix -> resample -> convert -> alsa-device
> 
> channelmix and resample are disabled when the graph sample rate
> and  device channels all match the adapter ports, you would configure
> the same number of channels on the output as the alsa-device channels
> and set the graph rate to something the hw supports.

In the example config, node.param.Portconfig.format.channels is
hardcoded. Is there a way to obtain the number of channels the
device supports (to make sure the values match) ? ALSA does
provide this info once the device is opened...

What will happen if the configured number of channels does not
match ? What sort of channelmix will I get ? 

> The channelmix is mostly to support making a 5.1 sink that can downmix
> to dolby or some other tricks.

This can be a very devious thing. I remember an occasion some
years ago (when I was in Parma) when some students were doing
measurements in a rented anechoic room during an entire week.
Later, when the measurements were processed, they discovered
that all of them were useless because their (Windows) system
had been trying to be clever and had applied gain changes and
channel mixing without them being aware. Now work out the cost
of renting an anechoic room for 60 hours. Plus, if they hadn't
been students and 'free labour', the consultancy fees for the
same period.

For any serious work, there are things that need to be
disabled without any chance of them ever be re-enabled by
accident. The required result when something does not match
is to fail and report, not to try and be clever and 'fix'
things behind the user's back.

The idea of having the daemon do the 'plumbing' and the 
session manager to define 'policies' is a very good one. 
But to take that to its logical consequence, the defaults
for any optional processing (channel gains and mixing,
resampling,...) should be off and disabled. If any other
defaults make sense for the 'average user', they should
be defaults defined by the session manager, not by the
plumbing daemon. 


There is one feature that would be very desirable and
for which I would even be prepared to write an ad-hoc
session manager if that is the only place it can be
done: if a sound card becomes unavailable while in use,
substitute a dummy device with the same sample rate,
period, and number of channels, so the entire processing
graph remains intact and running. Then, when the device
becomes available again, allow the user to reconnect
to it (this must NOT be automatic). This is to minimise
'down time' when someone accidentally pulls a cable
during a concert or recording (a classical orchestra
is orders of magnitude more expensive than an anechoic
room).
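
To make the idea concrete, here is a rough sketch in plain Python (a hypothetical model, not any real PipeWire or JACK API) of the substitution behaviour described above: when the device vanishes, a silent dummy with identical parameters keeps the graph running, and reconnection is an explicit user action.

```python
# Hypothetical sketch, not PipeWire code: swap in a silent dummy device
# with the same rate/period/channels so the processing graph stays intact.

class Device:
    def __init__(self, rate, period, channels):
        self.rate, self.period, self.channels = rate, period, channels

    def read_period(self):
        # A real device would deliver hardware samples here.
        return [[0.0] * self.period for _ in range(self.channels)]

class DummyDevice(Device):
    """Silent stand-in cloning the lost device's parameters."""
    def __init__(self, template):
        super().__init__(template.rate, template.period, template.channels)

class Graph:
    def __init__(self, device):
        self.device = device

    def on_device_lost(self):
        # Keep the graph intact: same rate, period and channel count,
        # so every client keeps running (on silence).
        self.device = DummyDevice(self.device)

    def reconnect(self, device):
        # Explicit user action only -- never automatic.
        self.device = device

g = Graph(Device(rate=48000, period=256, channels=64))
g.on_device_lost()
assert (g.device.rate, g.device.period, g.device.channels) == (48000, 256, 64)
```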

Ciao,

-- 
FA

___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
https://lists.linuxaudio.org/listinfo/linux-audio-dev


Re: [LAD] pipewire

2022-01-19 Thread Wim Taymans
On Wed, 19 Jan 2022 at 14:04, Fons Adriaensen  wrote:
>
> On Tue, Jan 18, 2022 at 07:16:39PM +0100, Wim Taymans wrote:
>
> > As a bare minimum you would need pipewire (the daemon) and
> > pipewire-jack (the libjack.so client implementation). With a custom
> > config file you can make this work exactly like jack (see below).
>
> Thanks, will try this, but many questions remain (see below).
>
> > All the system integration (dbus, systemd and the automatic stuff)
> > happens in the session manager. You don't need to run this.
>
> Aha, that is good news.
>
> > You'll need the pipewire git version
>
> Will things work with the current Arch packages and do I
> need git just because it has the minimal.conf file ?
> If yes I'd prefer to use the Arch packages for now.

Sorry, git for now. I just started to implement the last bits to make a
session manager optional.

>
> Questions:
>
> * A lot of lines in the minimal.conf are commented. Can one
>   assume that these correspond to the defaults ? If not, what
>   are the defaults for all these parameters ?
>
They are defaults, yes.

>   What concerns me here is things like
>
> #channelmix.normalize  = true

All alsa devices are wrapped in an adapter. This contains:

   -> channelmix -> resample -> convert -> alsa-device

channelmix and resample are disabled when the graph sample rate
and device channels all match the adapter ports; you would configure
the same number of channels on the output as the alsa-device channels
and set the graph rate to something the hw supports.
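
As an illustration of "configure the same number of channels on the output as the alsa-device channels", a fragment in the spirit of the minimal.conf mentioned later in the thread (key names are taken from this thread; the exact structure depends on the PipeWire version, so treat this as a sketch, not verbatim config):

```
context.objects = [
    { factory = adapter
        args = {
            node.name       = "alsa-sink"
            api.alsa.path   = "hw:3,0"
            audio.channels  = 64        # must equal what the hw provides
            node.param.PortConfig = {
                mode   = dsp
                format = { format = F32P, rate = 48000, channels = 64 }
            }
        }
    }
]
# graph rate: pick one the hw supports so the resampler stays disabled
default.clock.rate = 48000
```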

The channelmix is mostly to support making a 5.1 sink that can downmix
to dolby or some other tricks.

The resampler is used when slaving devices or when the graph rate
doesn't match the device rate (some webcams only do 16 kHz). The
minimal config disables it as well.

>
>   I certainly do not want any normalisation, so do I need
>   to set this to false explicitly ?
>
If you set the channels in node.param.PortConfig and audio.channels
to the same value, there will be no channelmixing and so no
normalization.
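
To illustrate the kind of silent gain change being discussed (illustration only: this is not PipeWire's actual channelmix code, and the coefficients are made up), here is a naive 5.1-to-stereo downmix where "normalize" rescales the mix so it cannot clip, quietly altering absolute levels:

```python
# Illustration only, not PipeWire's algorithm: a naive 5.1 -> stereo
# downmix whose normalization silently changes the gain of every channel.

def downmix_51_to_stereo(frame, normalize=True):
    """frame = [FL, FR, FC, LFE, RL, RR] -> [L, R], coefficients made up."""
    fl, fr, fc, lfe, rl, rr = frame
    c = 0.7071  # hypothetical center/rear coefficient
    left  = fl + c * fc + c * rl
    right = fr + c * fc + c * rr
    if normalize:
        # Scale so full-scale input cannot clip -- a silent gain change.
        g = 1.0 / (1.0 + 2 * c)
        left, right = g * left, g * right
    return [left, right]

# A full-scale front-left-only signal no longer comes out at full scale:
l, r = downmix_51_to_stereo([1.0, 0, 0, 0, 0, 0])
assert l < 1.0 and r == 0.0
```

This is exactly why, for measurement work, such processing must be verifiably off rather than merely defaulted off.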

>
> * The sample rate (48000) is in many places, most of them
>   commented out. What is the relation between all of these ?
>   Why, for example, is 'default.clock.rate' commented ?

default.clock.rate = 48000 is the default. If you change the graph
clock rate, the devices will follow it, so you can just leave
them blank (the rate in node.param.PortConfig is ignored... need
to fix that...)

>
> * 'quantum', 'period-size' and 'period-num' are commented
>   out everywhere (except in 'vm.override'). So where is
>   the period size defined ?
>
The commented out values are the defaults
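
For reference, the quantum plays the role of JACK's period size, so the one-period latency follows directly from the quantum and the graph rate (quick arithmetic sketch):

```python
# One graph period (quantum) at a given sample rate, in milliseconds.

def period_latency_ms(quantum, rate):
    return 1000.0 * quantum / rate

# At the 48 kHz default graph rate discussed in this thread:
assert abs(period_latency_ms(256, 48000) - 5.333) < 0.001
assert period_latency_ms(48, 48000) == 1.0
```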

> * If things like sample rate, period size, etc. are set
>   to some fixed values in the config, can they still be
>   modified by e.g. pw-metadata ? I hope not...
>
Yes, pw-metadata (but not accessible from jack apps) overrides the
config settings and limits. I'll add a switch to disable this.

Wim

>
> Ciao,
>
> --
> FA
>


Re: [LAD] pipewire

2022-01-19 Thread Fons Adriaensen
On Tue, Jan 18, 2022 at 07:16:39PM +0100, Wim Taymans wrote:

> As a bare minimum you would need pipewire (the daemon) and
> pipewire-jack (the libjack.so client implementation). With a custom
> config file you can make this work exactly like jack (see below).

Thanks, will try this, but many questions remain (see below).
 
> All the system integration (dbus, systemd and the automatic stuff)
> happens in the session manager. You don't need to run this.

Aha, that is good news. 
 
> You'll need the pipewire git version

Will things work with the current Arch packages and do I
need git just because it has the minimal.conf file ?
If yes I'd prefer to use the Arch packages for now.

Questions:

* A lot of lines in the minimal.conf are commented. Can one
  assume that these correspond to the defaults ? If not, what
  are the defaults for all these parameters ? 

  What concerns me here is things like 

#channelmix.normalize  = true
  
  I certainly do not want any normalisation, so do I need
  to set this to false explicitly ?


* The sample rate (48000) is in many places, most of them
  commented out. What is the relation between all of these ?
  Why, for example, is 'default.clock.rate' commented ?

* 'quantum', 'period-size' and 'period-num' are commented
  out everywhere (except in 'vm.override'). So where is
  the period size defined ?

* If things like sample rate, period size, etc. are set
  to some fixed values in the config, can they still be
  modified by e.g. pw-metadata ? I hope not...


Ciao,

-- 
FA



Re: [LAD] pipewire

2022-01-19 Thread Lorenzo Sutton

Hi Wim,

Thanks for the very detailed info!

On 18/01/22 19:24, Wim Taymans wrote:

On Mon, 17 Jan 2022 at 16:03, Lorenzo Sutton  wrote:






My problem with that set-up is that it seemed that something like Ardour
would need to be explicitly run via pw-jack so e.g.

pw-jack ardour



Your distro probably also has a package that puts the pipewire
libjack.so in the LD_LIBRARY_PATH and then you don't have to type pw-jack
anymore.

OK, I think it's an AUR package on Arch and derivatives.

I'm wondering: if an application is typically able to work with both 
alsa/pulseaudio and jack (Ardour, Pure Data, Yoshimi, MuseScore come 
to mind), how would this work?
Some of these I hardly ever run with alsa/pulseaudio, but for example I 
do sometimes use MuseScore with pulseaudio, or even Pure Data and 
Yoshimi. With the current set-up, if I am running jack, then 
alsa/pulseaudio will just fail, which in this case is good because it 
'forces' me to use jack (in the application) in case something 
different was set up.
In that scenario, what would pipewire prioritize? Would there be a way to 
tell pipewire 'hey, now I'd like to be in jack mode as much as 
possible'? :-)
SMPlayer has a really simple and neat way of setting this up: in the 
output driver for audio you can write e.g. 'jack,pulse' (that's the 
setting I have), and it will try to use those in that order, essentially 
failing over to pulseaudio if jack isn't available.





But then setting the samplerate (I have projects at different
samplerates) wasn't trivial.


switch to fixed sample rate (on the fly):

   pw-metadata -n settings 0 clock.force-rate <rate>

switch back to dynamic control

   pw-metadata -n settings 0 clock.force-rate 0

Same for buffersize (quantum):

   pw-metadata -n settings 0 clock.force-quantum <quantum>

and back to dynamic:

   pw-metadata -n settings 0 clock.force-quantum 0


Cool. Would this be done before running a jack application, e.g. Ardour?

Lorenzo.


Re: [LAD] pipewire

2022-01-18 Thread Will Godfrey
On Tue, 18 Jan 2022 19:16:39 +0100
Wim Taymans  wrote:

>Hi Fons,
>
>As a bare minimum you would need pipewire (the daemon) and
>pipewire-jack (the libjack.so client implementation). With a custom
>config file you can make this work exactly like jack (see below).

Thanks for this info :)

-- 
Will J Godfrey
https://willgodfrey.bandcamp.com/
http://yoshimi.github.io
Say you have a poem and I have a tune.
Exchange them and we can both have a poem, a tune, and a song.


Re: [LAD] pipewire

2022-01-18 Thread Wim Taymans
On Tue, 18 Jan 2022 at 17:15, Will Godfrey  wrote:

> >and of course systemd. I do not think it will run without.

It does run fine without systemd.

>
> If it *requires* systemd then that is a non-starter for me :(
>
It doesn't require systemd.

You can compile with systemd support and then you have the capability
of having systemd socket activate the server and all components. If
you don't compile it in, you'll have to start things some other way.

Wim


Re: [LAD] pipewire

2022-01-18 Thread Wim Taymans
On Mon, 17 Jan 2022 at 16:03, Lorenzo Sutton  wrote:
>


> My problem with that set-up is that it seemed that something like Ardour
> would need to be explicitly run via pw-jack so e.g.
>
> pw-jack ardour
>

Your distro probably also has a package that puts the pipewire
libjack.so in the LD_LIBRARY_PATH and then you don't have to type pw-jack
anymore.

> But then setting the samplerate (I have projects at different
> samplerates) wasn't trivial.

switch to fixed sample rate (on the fly):

  pw-metadata -n settings 0 clock.force-rate <rate>

switch back to dynamic control

  pw-metadata -n settings 0 clock.force-rate 0

Same for buffersize (quantum):

  pw-metadata -n settings 0 clock.force-quantum <quantum>

and back to dynamic:

  pw-metadata -n settings 0 clock.force-quantum 0

Wim


Re: [LAD] pipewire

2022-01-18 Thread Wim Taymans
Hi Fons,

As a bare minimum you would need pipewire (the daemon) and
pipewire-jack (the libjack.so client implementation). With a custom
config file you can make this work exactly like jack (see below).

The way PipeWire normally works is that when starting the daemon,
nothing is in the graph. Devices are usually loaded into the graph
(using udev for ALSA, bluez5 for bluetooth, avahi for network devices,
...) by a session manager. We have a basic pipewire-media-session and
a more complete WirePlumber session manager. All the system
integration (dbus, systemd and the automatic stuff) happens in the
session manager. You don't need to run this.

For pulseaudio compatibility, you can also install and run a separate
server (pipewire-pulse). This basically just translates pulse protocol
to pipewire protocol and makes pipewire clients for streams. You don't
need to run this either.

You'll need the pipewire git version to run a minimal setup with only
pipewire using this config:
https://gitlab.freedesktop.org/pipewire/pipewire/-/blob/master/src/daemon/minimal.conf.in
You can change the source and sink ALSA devices, set up the channels
and format, and then run pipewire -c <config-file>.
It should be enough to run all JACK apps to test things.

pw-top is interesting, pw-profiler works like the jack2 profiler,
pw-dump for a JSON dump of the object tree, pw-link (like jack_lsp,
jack_connect), etc...

Wim


On Mon, 17 Jan 2022 at 14:56, Fons Adriaensen  wrote:
>
> Hello all,
>
> I'd like to test pipewire as a replacement for Jack (on Arch),
> and have been reading most (I think) of the available docs.
>
> What is clear is that I will need to install the pipewire
> and pipewire-jack packages.
>
> And then ?
>
> How do I tell pipewire to use e.g. hw:3,0 and make all of
> its 64 channels appear as capture/playback ports in qjackctl ?
>
> Note: I do not have anything PulseAudio (like pavucontrol)
> installed and don't want to either. If that would be a
> requirement then I'll just forget about using pipewire.
>
> TIA,
>
> --
> FA
>


Re: [LAD] pipewire

2022-01-18 Thread Len Ovens

On Tue, 18 Jan 2022, Will Godfrey wrote:


On Tue, 18 Jan 2022 08:08:56 -0800 (PST)

Len Ovens  wrote:
Pipewire does use all the system bits that pulseaudio does, such as dbus 
and of course systemd. I do not think it will run without them.


If it *requires* systemd then that is a non-starter for me :(


I have not tried running it without and so have no hard knowledge on that. 
But do remember the source. It comes from the RH ecosystem.


However, if you are running without systemd and presumably without pulse, 
why would you not just use jack? With the right device, jack is very 
stable. Pipewire is trying to emulate jack and in general not improve on 
jack.


--
Len Ovens
www.ovenwerks.net


Re: [LAD] pipewire

2022-01-18 Thread Will Godfrey
On Tue, 18 Jan 2022 08:08:56 -0800 (PST)

Len Ovens  wrote:
>Pipewire does use all the system bits that pulseaudio does, such as dbus 
>and of course systemd. I do not think it will run without them.

If it *requires* systemd then that is a non-starter for me :(

-- 
Will J Godfrey
https://willgodfrey.bandcamp.com/
http://yoshimi.github.io
Say you have a poem and I have a tune.
Exchange them and we can both have a poem, a tune, and a song.


Re: [LAD] pipewire

2022-01-18 Thread Len Ovens

On Mon, 17 Jan 2022, Fons Adriaensen wrote:


I'd like to test pipewire as a replacement for Jack (on Arch),



How do I tell pipewire to use e.g. hw:3,0 and make all of
its 64 channels appear as capture/playback ports in qjackctl ?

Note: I do not have anything PulseAudio (like pavucontrol)
installed and don't want to either. If that would be a
requirement then I'll just forget about using pipewire.


It depends on the reason for not using "anything" pulseaudio. Pipewire is 
a replacement for both jack and pulseaudio. So the "JACK" graph will show 
all devices, none of which will be named system:* and some of which will 
go through some sort of SRC. I think it is possible to designate one 
device as master with direct sync and specified latency, but the reality 
is that if you wish your one device to be separate from any internal, 
hdmi, webcam, etc., that will not happen. It is possible to select a 
device profile of "Off" for these devices, but that would mean any 
desktop application will automatically use your multi-channel device "as 
best it can" (i.e. connect its output to all device outs). With PW it is 
not possible to run two audio servers separately, one for desktop and one 
for audio production. (Well, you can still run JACK separately... and I 
guess for now it is possible to run pulse separately too, but that will 
go away.) Pipewire does use all the system bits that pulseaudio does, 
such as dbus and of course systemd. I do not think it will run without 
them.


Of course, because it is a replacement for pulseaudio, even though it may 
not use any of the pulseaudio code, its interface to desktop 
applications is much the same as pulseaudio's. Hopefully in a 
better way.


One other thing to be aware of: PW does not load any other backend besides 
ALSA. I think it does have an auto dummy device though.


--
Len Ovens
www.ovenwerks.net


Re: [LAD] pipewire

2022-01-17 Thread John Murphy
Evening all.
A timely thread, for me. I've just moved to a faster PC and, with an eye
to getting pipewire working, I've installed VirtualBox (6.1.30) and made
a VM of the same distro (Mint 20.3) as on the host. I'll clone it so it's
easy to 'redo from start' when I mess it up.

I've read differences of opinion about it.

-- 
John. [Mint 20.3 Cinnamon, looks and feels excellent.]


Re: [LAD] pipewire

2022-01-17 Thread David Runge
On 2022-01-17 14:56:30 (+0100), Fons Adriaensen wrote:
> Hello all,
> 
> I'd like to test pipewire as a replacement for Jack (on Arch),
> and have been reading most (I think) of the available docs.
> 
> What is clear is that I will need to install the pipewire
> and pipewire-jack packages.
> 
> And then ?

With the pipewire currently in [testing] it would even be the only thing
you need to do, as it replaces jack2 (or jack).
A good place to look for answers in regards to pipewire would be its
official wiki [1] and documentation [2].

Please note that before pipewire-jack >= 1:0.3.43-4 is in [extra] you
will need to use pw-jack to run applications with pipewire's jack
implementation or override your ld.so.conf (see wiki on that [3]).

It should be moved this evening though and I'll try to update the wiki
afterwards to reflect this change.

> How do I tell pipewire to use e.g. hw:3,0 and make all of
> its 64 channels appear as capture/playback ports in qjackctl ?

I *think* you need to use the "Pro Audio" configuration for your
specific device to make that work.

> Note: I do not have anything PulseAudio (like pavucontrol)
> installed and don't want to either. If that would be a
> requirement then I'll just forget about using pipewire.

Pavucontrol would be one way of doing so. It's also possible via pw-cli
[4] (Wim Taymans gave that example on IRC - #pipewire on oftc.net is a
good resource too!).

Best,
David

[1] https://gitlab.freedesktop.org/pipewire/pipewire/-/wikis/home
[2] https://docs.pipewire.org
[3] https://wiki.archlinux.org/title/PipeWire#JACK_clients
[4] 
https://gitlab.freedesktop.org/pipewire/pipewire/-/wikis/Migrate-PulseAudio#set-card-profile

-- 
https://sleepmap.de




Re: [LAD] pipewire

2022-01-17 Thread Kevin Cole
I've heard of success stories -- most notably with Fedora -- and keep
circling back to it. I'm using Pop!_OS 20.04 -- which is close 'nuf to
Ubuntu 20.04 -- together with the KX Studio repositories. Cadence et
al. have been my friends...

So, without knowing at all what I'm doing, I tried pipewire a few
months ago and everything became an unusable mess. I uninstalled it
all and tried again recently. While I'm not sure how to use it all,
Catia now sees a whole bunch more than it used to, and various audio
apps that I used to fuss with a bit seem to be working automatically.

I found a couple of answers to "Replacing Pulseaudio with Pipewire in
Ubuntu 20.04"
https://askubuntu.com/a/1339897/2059
https://askubuntu.com/a/1365822/2059
to be particularly helpful in getting all the wee bits working right
-- as near as I can tell.

For what it's worth, summarized here, the latest iteration of my
experiments in installing:

sudo add-apt-repository ppa:pipewire-debian/pipewire-upstream
sudo apt full-upgrade
sudo apt install libspa-0.2-bluetooth
sudo apt install pipewire-audio-client-libraries
systemctl --user daemon-reload
systemctl --user --now disable pulseaudio.service pulseaudio.socket
systemctl --user mask pulseaudio
systemctl --user --now enable pipewire-media-session.service
systemctl --user --now enable pipewire pipewire-pulse
sudo rm /etc/pipewire/pipewire.conf
sudo apt reinstall pipewire pipewire-pulse


Re: [LAD] pipewire

2022-01-17 Thread Will Godfrey
On Mon, 17 Jan 2022 16:03:03 +0100
Lorenzo Sutton  wrote:

>Hi,
>
>Thanks for opening this thread, I find this topic very interesting and 
>been discussing it with some people :-)
>
>If it might be of help, I'm on Manjaro, which is an Arch derivative, so 
>probably similar and I followed the Arch guide, and tried the 
>'substitution' - TL:DR: I eventually reverted back to pulseaudio+jack, 
>for now.
>
>On 17/01/22 14:56, Fons Adriaensen wrote:
>
>[...]
>
>> I'd like to test pipewire as a replacement for Jack (on Arch),
>> and have been reading most (I think) of the available docs.
>> 
>> What is clear is that I will need to install the pipewire
>> and pipewire-jack packages.  
>
>My problem with that set-up is that it seemed that something like Ardour 
>would need to be explicitly run via pw-jack so e.g.
>
>pw-jack ardour
>
>But then setting the samplerate (I have projects at different 
>samplerates) wasn't trivial.
>
>If I understand correctly eventually pipewire will be a drop-in and the 
>pw-jack shouldn't be needed.
>
>The other thing I wasn't able to figure out was how to use it as I 
>previously would use qjackctl
>
>> 
>> And then ?
>> 
>> How do I tell pipewire to use e.g. hw:3,0 and make all of
>> its 64 channels appear as capture/playback ports in qjackctl ?  
>
>This was also unclear for me. I use 3 audio interfaces mainly and have 
>dedicated qjackctl 'profiles', and that works quite well for me, so 
>wasn't sure how this is handled in pipewire.
>
>If you'd be willing to share any results in this thread it would be 
>really useful.
>
>My current workflow is to launch jack when needed with the correct 
>device / samplerate configuration when needed, only _if_ needed open a 
>pulseaudio sink (e.g. browser audio needed while using jack). But I 
>understand that might be a very 'personal' approach to it all :-)
>
>One interesting (yet still anecdotal?) aspect is that potentially 
>pipewire manages to provide better latency?
>
>Lorenzo

I'm also on the 'wait and see' list. For me, Jack is just plug and play. I'd
want pipewire to be the same. As for latency, this seems to have more to do
my USB soundcard than anything else.

-- 
Will J Godfrey
https://willgodfrey.bandcamp.com/
http://yoshimi.github.io
Say you have a poem and I have a tune.
Exchange them and we can both have a poem, a tune, and a song.


Re: [LAD] pipewire

2022-01-17 Thread Lorenzo Sutton

Hi,

Thanks for opening this thread, I find this topic very interesting and 
been discussing it with some people :-)


If it might be of help, I'm on Manjaro, which is an Arch derivative, so 
probably similar and I followed the Arch guide, and tried the 
'substitution' - TL:DR: I eventually reverted back to pulseaudio+jack, 
for now.


On 17/01/22 14:56, Fons Adriaensen wrote:

[...]


I'd like to test pipewire as a replacement for Jack (on Arch),
and have been reading most (I think) of the available docs.

What is clear is that I will need to install the pipewire
and pipewire-jack packages.


My problem with that set-up is that it seemed that something like Ardour 
would need to be explicitly run via pw-jack so e.g.


pw-jack ardour

But then setting the samplerate (I have projects at different 
samplerates) wasn't trivial.


If I understand correctly eventually pipewire will be a drop-in and the 
pw-jack shouldn't be needed.


The other thing I wasn't able to figure out was how to use it as I 
previously would use qjackctl




And then ?

How do I tell pipewire to use e.g. hw:3,0 and make all of
its 64 channels appear as capture/playback ports in qjackctl ?


This was also unclear for me. I use 3 audio interfaces mainly and have 
dedicated qjackctl 'profiles', and that works quite well for me, so 
wasn't sure how this is handled in pipewire.


If you'd be willing to share any results in this thread it would be 
really useful.


My current workflow is to launch jack when needed with the correct 
device / samplerate configuration when needed, only _if_ needed open a 
pulseaudio sink (e.g. browser audio needed while using jack). But I 
understand that might be a very 'personal' approach to it all :-)


One interesting (yet still anecdotal?) aspect is that potentially 
pipewire manages to provide better latency?


Lorenzo


[LAD] pipewire

2022-01-17 Thread Fons Adriaensen
Hello all,

I'd like to test pipewire as a replacement for Jack (on Arch),
and have been reading most (I think) of the available docs.

What is clear is that I will need to install the pipewire
and pipewire-jack packages.

And then ?

How do I tell pipewire to use e.g. hw:3,0 and make all of
its 64 channels appear as capture/playback ports in qjackctl ?

Note: I do not have anything PulseAudio (like pavucontrol)
installed and don't want to either. If that would be a
requirement then I'll just forget about using pipewire.

TIA,

-- 
FA



Re: [LAD] PipeWire

2018-08-20 Thread Wim Taymans
On Mon, 20 Aug 2018 at 01:41, Robin Gareus  wrote:
>
> On 02/19/2018 09:39 AM, Wim Taymans wrote:
> [...]
> > I would very much like to hear your ideas, comments, flames, thoughts on 
> > this
> > idea. I think I'm at a stage where I can present this to a bigger audience 
> > and
> > have enough experience with the matter to have meaningful discussions.
>
> Hi Wim,

Hi Robin,

Thanks for taking time to reply.

>
> I think the general lack of enthusiasm about pipewire here is because it
> does not solve any issues for linux-audio and at best does not
> introduce new ones.
>
> In the past years the most prominent question that I have received is
>
>  * How can I use all of my USB Mics with Ardour on Linux?
>  * How do I uniquely identify my many MIDI devices?
>  * Why does my audio device not have proper port-names?
>  * Why can't I re-connect my device and resume work?
>
> These questions are mostly from Mac or Windows users moving to Linux ...
> and many of them moving back to MacOS.
>
> If you try to come up with a new system (think pipewire), please copy as
> many concepts as possible from Mac's CoreAudio.
>

I heard this before. Device management on linux is still pretty bad. I have not
seriously looked at how to solve any of this yet..

>
> While it is not impossible to combine multiple devices, it is not
> straightforward to set this up. Managing devices uniquely and handling
> temporarily missing devices is not possible on GNU/Linux AFAIK.

One of the ideas with PipeWire is to move much of the logic to set up devices
and filters to another process. We would like to have the desktops come up
with policies for what to connect when and where and implement those.

This would also make it possible to do more configuration in the desktop control
panels, like rank devices (to select a master device), setup filters
for surround sound,
bass boost, echo cancellation (the things you can configure in Windows
and MacOs).

I know there a problems with uniquely identifying devices in Linux that may not
make this 100% perfect but we should be able to get to the same state as MacOs
or Windows.

The logic for combining devices exists and works well (zita-a2j/j2a) I would
like to have built-in support for this in PipeWire as soon as 2
devices interact.
MacOS has a panel for combining devices, we need something like that too.

>
> Both pulseaudio and jack had the correct idea to present audio as a
> service to applications. The server is concerned with device(s) and
> device settings. However, both fail to abstract multiple devices, map
> their ports uniquely or allow multiple apps to concurrently use those
> devices for different purposes.

Are you talking about JACK? PulseAudio pretty much has this right, no?

>
> The main issue with pulse is that it is a poll API. Also pulseaudio's
> per device, per port-latency is incorrect (if set at all).

What's wrong with a poll API? To me PulseAudio has more of an event-based
API... Not sure what you mean by the latency being reported
incorrectly: latency is dynamic and you can query it; it pretty much gives
you access to the read and write pointers of the device...

> JACK on the
> other hand is too limited: single device, fixed buffersize. jackd also
> periodically wakes up the CPU and uses power (even if no client is
> connected).

These are the main points for objecting to JACK as a generic desktop
replacement for audio and PulseAudio takes the complete opposite approach.

To me, the ideal solution would be to keep the JACK design and remove the
above mentioned limitations.

>
> Browsing around in the pipewire source I see several potential design
> issues.
>
> In particular data format conversions: the nice part about JACK is that it
> uses float as its only native format. Also port-memory is shared between
> applications with zero-copy.

I 100% agree, arbitrary format conversions are not practical. In PipeWire, there
are 2 scenarios:

1) exclusive access to a device. You can negotiate any format and buffer layout
directly with the device. Very handy for compressed formats, or to get the
maximum performance (games). Of course only one app can use the device
but this can be allowed in certain cases.

2) non-exclusive access. A dsp module (exclusively) connects to the device that
converts to and from the canonical format (float32 mono) and the device format.
Clients then either connect with the canonical format (jack clients) or can use
the stream API (like CoreAudio's AudioQueue) to play or record data with
conversions being handled automatically. So only conversions at the entry and
exit points of the graph, everything in between is float32.
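
A minimal sketch of that entry-point conversion (plain Python for illustration, not PipeWire code): device-format samples are converted once, at the edge of the graph, into the canonical per-channel float32 representation.

```python
# Illustration of an entry-point conversion: interleaved signed 16-bit PCM
# (a common ALSA device format) -> one float32-style plane per channel.

import struct

def s16_interleaved_to_f32_planar(raw, channels):
    """Convert interleaved s16 PCM bytes to one list of floats per channel."""
    n = len(raw) // 2
    samples = struct.unpack("<%dh" % n, raw)
    planes = [[] for _ in range(channels)]
    for i, s in enumerate(samples):
        planes[i % channels].append(s / 32768.0)  # scale to [-1.0, 1.0)
    return planes

# Two stereo frames: (L=16384, R=-32768), (L=0, R=32767)
raw = struct.pack("<4h", 16384, -32768, 0, 32767)
left, right = s16_interleaved_to_f32_planar(raw, 2)
assert left == [0.5, 0.0]
assert right[0] == -1.0
```

Everything between such entry and exit points can then stay in the canonical format with no further conversions.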

Port memory in PipeWire is also shared between applications and is pretty much
a requirement to do anything related to video. Where JACK has 1 buffer per port
allocated in the shared memory, PipeWire can have multiple buffers per (output)
port that are all shared between the connected peer ports. The reason for
multiple buffers per port is

Re: [LAD] PipeWire

2018-08-19 Thread Robin Gareus
On 02/19/2018 09:39 AM, Wim Taymans wrote:
[...]
> I would very much like to hear your ideas, comments, flames, thoughts on this
> idea. I think I'm at a stage where I can present this to a bigger audience and
> have enough experience with the matter to have meaningful discussions.

Hi Wim,

I think the general lack of enthusiasm about pipewire here is because it
does not solve any issues for linux-audio and at best does not
introduce new ones.

In the past years the most prominent question that I have received is

 * How can I use all of my USB Mics with Ardour on Linux?
 * How do I uniquely identify my many MIDI devices?
 * Why does my audio device not have proper port-names?
 * Why can't I re-connect my device and resume work?

These questions are mostly from Mac or Windows users moving to Linux ...
and many of them moving back to MacOS.

While it is not impossible to combine multiple devices, it is not
straightforward to set this up. Managing devices uniquely and handling
temporarily missing devices is not possible on GNU/Linux AFAIK.

If you try to come up with a new system (think pipewire), please copy as
many concepts as possible from Mac's CoreAudio.

Both pulseaudio and jack had the correct idea to present audio as a
service to applications. The server is concerned with device(s) and
device settings. However, both fail to abstract multiple devices, map
their ports uniquely or allow multiple apps to concurrently use those
devices for different purposes.

The main issue with pulse is that it is a poll API. Also pulseaudio's
per device, per port-latency is incorrect (if set at all). JACK on the
other hand is too limited: single device, fixed buffersize. jackd also
periodically wakes up the CPU and uses power (even if no client is
connected).

Browsing around in the pipewire source I see several potential design
issues.

In particular data format conversions: The nice part about JACK is that
it uses float as its only native format. Also port-memory is shared between
applications with zero-copy.

In pipewire a port can be any data-type, including vorbis, and worse,
MIDI is a bolted-on sub-type of an audio port.

JACK-MIDI has in the past been criticized most because MIDI was a
dedicated type instead of JACK providing generic event-ports.

Another conceptual issue that I see with pipewire is that it pushes sync
downstream (like gstreamer does), instead of sources being pulled
upstream. This in particular will make it hard to compensate for
latencies and align outputs.

Implementation wise there are plenty of other issues remaining to be
discussed, e.g. context-switches, resampling, process-graph,.. but those
are not important at this point in time.

Cheers!
robin



___
Linux-audio-dev mailing list
Linux-audio-dev@lists.linuxaudio.org
https://lists.linuxaudio.org/listinfo/linux-audio-dev


Re: [LAD] PipeWire, and "a more generic seeking and timing framework"

2018-02-19 Thread Jonathan E. Brickman
Many thanks, Paul, for this and much more.  My lame excuse is I have
had a chunk of my head buried in a singular problem for a long time and
those librarians are very tired :-)
Given reality-check, then, maybe the institution of multiple JACK
subgraphs, with time-decoupling by Pulse-style transport, is a way to
get everything done!
J.E.B.
On Mon, 2018-02-19 at 18:04 -0500, Paul Davis wrote:
> JACK is already much closer to the hardware than the networking
> stack.
> 
> At the conclusion of the jack process callback, it writes samples
> *directly into the memory mapped buffer being used by the audio
> hardware*. The process callback is  preemptively (and with realtime
> scheduling) triggered directly from the interrupt handler of the
> audio interface.
> 
> JACK does not use a round-robin approach to its clients. It creates a
> data (flow) graph based on their interconnections and executes them
> (serially or in parallel) in the order dictated by the graph. 
> 
> 
> 
> On Mon, Feb 19, 2018 at 5:57 PM, Jonathan Brickman wrote:
> > Not really sure the subgraph is so good -- one of the things JACK
> > gives us is the extremely solid knowledge of what it just did, is
> > doing now, and will do next period.  If I run Pulse with JACK, it's
> > JACK controlling the hardware and Pulse feeding into it, not the
> > other way around, because Pulse is not tightly synchronized,
> > whereas JACK is.  But if you can make it work as well, more power
> > to you.
> > 
> > Concerning seeking and timing, though, I have had to wonder.  My
> > impression of JACK for a long time (and more learned ladies and
> > gentlemen, please correct) is that it uses a basically round-robin
> > approach to its clients, with variation.  I have had to wonder,
> > especially given my need for this, how practical a model might be
> > possible, using preemptive multitasking or even Ethernet-style
> > collision avoidance through entropic data, at current CPU speeds. 
> > It's chopped into frames, right?  Couldn't audio and MIDI data be
> > mapped into networking frames and then thrown around using the
> > kernel networking stack?  The timestamps are there...the
> > connectivity is there...have to do interesting translations... :-)  
> > Could be done at the IP level or even lower I would think.  The
> > lower you go, the more power you get, because you're closer to the
> > kernel at every step.
> > 
> > 
-- 
Jonathan E. Brickman   j...@ponderworthy.com   (785)233-9977
Hear us at http://ponderworthy.com -- CDs and MP3s now available!
Music of compassion; fire, and life!!!


Re: [LAD] PipeWire, and "a more generic seeking and timing framework"

2018-02-19 Thread Paul Davis
JACK is already much closer to the hardware than the networking stack.

At the conclusion of the jack process callback, it writes samples *directly
into the memory mapped buffer being used by the audio hardware*. The
process callback is  preemptively (and with realtime scheduling) triggered
directly from the interrupt handler of the audio interface.

JACK does not use a round-robin approach to its clients. It creates a data
(flow) graph based on their interconnections and executes them (serially or
in parallel) in the order dictated by the graph.


On Mon, Feb 19, 2018 at 5:57 PM, Jonathan Brickman 
wrote:

> Not really sure the subgraph is so good -- one of the things JACK gives us
> is the extremely solid knowledge of what it just did, is doing now, and
> will do next period.  If I run Pulse with JACK, it's JACK controlling the
> hardware and Pulse feeding into it, not the other way around, because Pulse
> is not tightly synchronized, whereas JACK is.  But if you can make it work
> as well, more power to you.
>
> Concerning seeking and timing, though, I have had to wonder.  My
> impression of JACK for a long time (and more learned ladies and gentlemen,
> please correct) is that it uses a basically round-robin approach to its
> clients, with variation.  I have had to wonder, especially given my need
> for this, how practical a
> model might be possible, using preemptive multitasking or even
> Ethernet-style collision avoidance through entropic data, at current CPU
> speeds.  It's chopped into frames, right?  Couldn't audio and MIDI data be
> mapped into networking frames and then thrown around using the kernel
> networking stack?  The timestamps are there...the connectivity is
> there...have to do interesting translations... :-)  Could be done at the IP
> level or even lower I would think.  The lower you go, the more power you
> get, because you're closer to the kernel at every step.
>
> --
> *Jonathan E. Brickman   j...@ponderworthy.com   (785)233-9977*
> *Hear us at http://ponderworthy.com -- CDs and MP3s now available!*
> *Music of compassion; fire, and life!!!*


[LAD] PipeWire, and "a more generic seeking and timing framework"

2018-02-19 Thread Jonathan Brickman
Not really sure the subgraph is so good -- one of the things JACK gives us
is the extremely solid knowledge of what it just did, is doing now, and
will do next period.  If I run Pulse with JACK, it's JACK controlling the
hardware and Pulse feeding into it, not the other way around, because Pulse
is not tightly synchronized, whereas JACK is.  But if you can make it work
as well, more power to you.

Concerning seeking and timing, though, I have had to wonder.  My impression
of JACK for a long time (and more learned ladies and gentlemen, please
correct) is that it uses a basically round-robin approach to its clients,
with variation.  I have had to wonder, especially given my need for this
, how practical a model might be
possible, using preemptive multitasking or even Ethernet-style collision
avoidance through entropic data, at current CPU speeds.  It's chopped into
frames, right?  Couldn't audio and MIDI data be mapped into networking
frames and then thrown around using the kernel networking stack?  The
timestamps are there...the connectivity is there...have to do interesting
translations... :-)  Could be done at the IP level or even lower I would
think.  The lower you go, the more power you get, because you're closer to
the kernel at every step.

-- 
*Jonathan E. Brickman   j...@ponderworthy.com   (785)233-9977*
*Hear us at http://ponderworthy.com -- CDs and MP3s now available!*
*Music of compassion; fire, and life!!!*


Re: [LAD] PipeWire

2018-02-19 Thread Jonathan Brickman
Greetings, Wim.  Amazing project you have there.  I hope you succeed.  Len
has covered lots of excellent thoughts.  Here are a few more, clearly
intersecting.

First of all, it's a great idea.  I'd love to see one layer which could do
all of JACK and pulse.  But the pitfalls are many :-)  It's worthwhile to
remember that the ALSA people tried a lot of it; the code bits and
configuration settings are still there waiting to be used. It's just that
Pulse and JACK are doing it, and more, so much more reliably.

Second, the newer JACK+Pulse setup with Cadence controlling it is amazing,
a joy and a simplicity.  Kudos extremus (sorry, I am linguistically
challenged).  It does cost a bit in JACK DSP (5% on the big BNR hard server
when I tried it), but it works very reliably.

And third, I could certainly imagine one layer with three different kinds
of ports:  MIDI (using the JACK MIDI API), Pro Audio (using the JACK audio
API), and Desktop Audio (using the Pulse API).  All desktop audio ports
behave like Pulse, and are controlled using the Pulse control APIs, and by
default their data is mixed into a default Desktop Audio hardware output.
At the control system level (using JACK for all), Pulse ports look like
JACK ports and can be rerouted, but the underlying layer treats them
differently, decouples them from the rigid round-robin of JACK.  This does
not make for a simple system, because there has to be both kinds of ports
for the hardware audio, and I'm sure there are a lot more complications
which others will think of, and which will emerge as soon as users start
trying it!

J.E.B.

On Mon, Feb 19, 2018 at 2:39 AM, Wim Taymans wrote:

> Hi everyone,
>
> I'm Wim Taymans and I'm working on a new project called PipeWire you might
> have heard about [1]. I have given some general presentations about it
> during
> its various stages of development, some of which are online [2].
>
> PipeWire started as a way to share arbitrary multimedia, which imposes
> vastly different requirements regarding format support, device and memory
> management than JACK. It wasn't until I started experimenting with audio
> processing that the design started to gravitate to JACK. And then some of
> JACK's features became a requirement for PipeWire.
>
> The end goal of PipeWire is to interconnect applications and devices
> through
> a shared graph in a secure and efficient way. Some of the first
> applications
> will be wayland screen sharing and camera sharing with access control for
> sandboxed applications. It would be great if we could also use this to
> connect
> audio apps and devices, possibly unifying the pulseaudio/JACK audio stack.
>
> Because the general design is, what I think, now very similar to JACK, many
> people have been asking me if I'm collaborating with the linux pro-audio
> community on this in any way at all. I have not but I really want to change
> that. In this mail I hope to start a conversation about what I'm doing and
> I
> hope to get some help and experience from the broader professional audio
> developers community on how we can make this into something useful for
> everybody.
>
> I've been looking hard at all the things that are out there, including
> Wayland, JACK, LV2, CRAS, GStreamer, MFT, OMX,.. and have been trying to
> combine the best ideas of these projects into PipeWire. A new plugin API
> was
> designed for hard realtime processing of any media type. PipeWire is LGPL
> licensed and depends only on a standard c library. It's currently targeting
> Linux.
>
> At the core of the PipeWire design is a graph of processing nodes with
> arbitrary
> input/output ports. Before processing begins, ports need to be configured
> with a
> format and a set of buffers for the data. Buffer data and metadata
> generally
> lives in memfd shared memory but can also be dmabuf or anything that can be
> passed as an fd between processes. There is a lot of flexibility in doing
> this
> setup, reusing much of the GStreamer experience there is. This all happens
> on
> the main thread, infrequently, not very important for the actual execution
> of
> the graph.
>
> In the realtime thread (PipeWire currently has 1 main thread and 1
> realtime data
> thread), events from various sources can start push/pull operations in the
> graph. For the purpose of this mail, the audio sink uses a timerfd to wake
> up
> when the alsa buffer fill level is below a threshold. This causes the sink
> to
> fetch a buffer from its input port queue and copy it to the alsa
> ringbuffer. It
> then issues a pull to fetch more data from all linked peer nodes for which
> there
> is nothing queued. These peers will then eventually push another buffer in
> the
> sink queue to be picked up in the next pull cycle of the sink. This is
> somewhat
> similar to the JACK async scheduling model. In the generic case, PipeWire
> has to
> walk upstream in the graph until it finds a node that can produce
> something (see
> below how this can be optimized).
>
> Sch

Re: [LAD] PipeWire

2018-02-19 Thread Len Ovens

On Mon, 19 Feb 2018, Wim Taymans wrote:


PipeWire started as a way to share arbitrary multimedia, which imposes vastly
different requirements regarding format support, device and memory management
than JACK. It wasn't until I started experimenting with audio processing that
the design started to gravitate to JACK. And then some of JACK's features became
a requirement for PipeWire.

The end goal of PipeWire is to interconnect applications and devices through
a shared graph in a secure and efficient way. Some of the first applications
will be wayland screen sharing and camera sharing with access control for
sandboxed applications. It would be great if we could also use this to connect
audio apps and devices, possibly unifying the pulseaudio/JACK audio stack.


By unifying I think you mean both things in one server, rather than making
jack work like pulse or pulse work like jack. I have been using
jackdbus as my audio server/backend with pulse as a desktop compatibility
layer for 3 to 4 years now with reasonable success. Jackdbus takes care of
all physical audio devices (I have no bluetooth audio devices) with my
multitrack audio device (an older ice1712 based delta66) as jack's master
and any other devices as clients via zita-ajbridge (with SRC). In general,
I don't use devices connected through SRC for recording, but many beginner
users have bought "pro USB mics" to start recording and so SRC is "a
thing".


I run pulse without the alsa, udev and jackdbus-detect modules but do load
jack-sink/source via script as needed. I use my own script because it
allows me to name my pulse ports so that pulse sees a device name rather
than just jackd. I do not know the internals of pulseaudio, but have found
that pulse will sync to any real device it happens to have access to even
though no stream is using that device. This ends up meaning that data is
transferred to jackd on that device's time schedule rather than jack's, with
the result of xruns in jackd and even crashes when jackd is put in freerun
mode. By running pulse with no alsa modules and no udev module (which auto
loads alsa modules when a new device is detected), both of these problems
are solved.


The one problem I have left is that pulse then has to follow jackd's
latency model. This is probably because jackd-sink/source are in the
sample category rather than well thought out and finished. As jack's
latency goes down (it can be changed while jackd is running), jack's cpu
load goes up as expected, but it stays in reasonable limits. However,
pulse is forced to follow along, and pulse uses more than double the cpu
that jack does. Along with this some desktop applications start to fail
noticeably. Skype is a good example of this because it does actually see
some use in the professional audio world in broadcast applications where
skype is sometimes used for live remote contribution (think phone-in talk
show or even news). In such a case, the local studio may be running totally
in jack using something like idjc with skype linked in using pulse
bridging. (Thankfully asterisk can deal with jack directly and already
expects low latency operation, so normal phone calls just work.) Low latency
jack operation is important in an announcer application, as monitoring is
often done with headphones, where a delay of one's own voice may be
annoying. So jack needs to run at 5ms or so while skype seems to think
30ms is quite normal (and uses echo cancelation so the talker can't hear
their own delayed voice).


What this points out is that there are two different requirements that 
sometimes need to be met at the same time. Pipewire has the advantage of 
knowing about both uses and being able to deal with them somewhat more 
gracefully if it chooses to. Desktop needs its own buffering, it seems.


Certainly most people who use jack much would have liked to see jack 
become standard with a pulse like wrapper for desktop. The development 
energy just wasn't there.



Because the general design is, what I think, now very similar to JACK, many
people have been asking me if I'm collaborating with the linux pro-audio
community on this in any way at all. I have not but I really want to change


It does not really matter if pipewire is similar to jack in operation.
Jack allows things that some applications require, and there are users who
do not have pulse on their system at all. So even if pipewire did not
allow jack clients to directly connect, jack is still around, still in use,
and will be for some time. (Do not be disappointed when some people choose
to remove pipewire in their new installs and replace it with jackd1; they
may be vocal, but they are a small number of people.)



that. In this mail I hope to start a conversation about what I'm doing and I
hope to get some help and experience from the broader professional audio
developers community on how we can make this into something useful for
everybody.


While I have done some development using the jack API, 

[LAD] PipeWire

2018-02-19 Thread Wim Taymans
Hi everyone,

I'm Wim Taymans and I'm working on a new project called PipeWire you might
have heard about [1]. I have given some general presentations about it during
its various stages of development, some of which are online [2].

PipeWire started as a way to share arbitrary multimedia, which imposes vastly
different requirements regarding format support, device and memory management
than JACK. It wasn't until I started experimenting with audio processing that
the design started to gravitate to JACK. And then some of JACK's features became
a requirement for PipeWire.

The end goal of PipeWire is to interconnect applications and devices through
a shared graph in a secure and efficient way. Some of the first applications
will be wayland screen sharing and camera sharing with access control for
sandboxed applications. It would be great if we could also use this to connect
audio apps and devices, possibly unifying the pulseaudio/JACK audio stack.

Because the general design is, what I think, now very similar to JACK, many
people have been asking me if I'm collaborating with the linux pro-audio
community on this in any way at all. I have not but I really want to change
that. In this mail I hope to start a conversation about what I'm doing and I
hope to get some help and experience from the broader professional audio
developers community on how we can make this into something useful for
everybody.

I've been looking hard at all the things that are out there, including
Wayland, JACK, LV2, CRAS, GStreamer, MFT, OMX,.. and have been trying to
combine the best ideas of these projects into PipeWire. A new plugin API was
designed for hard realtime processing of any media type. PipeWire is LGPL
licensed and depends only on a standard c library. It's currently targeting
Linux.

At the core of the PipeWire design is a graph of processing nodes with arbitrary
input/output ports. Before processing begins, ports need to be configured with a
format and a set of buffers for the data. Buffer data and metadata generally
lives in memfd shared memory but can also be dmabuf or anything that can be
passed as an fd between processes. There is a lot of flexibility in doing this
setup, reusing much of the GStreamer experience there. This all happens on
the main thread, infrequently, and is not very important for the actual
execution of the graph.

In the realtime thread (PipeWire currently has 1 main thread and 1 realtime data
thread), events from various sources can start push/pull operations in the
graph. For the purpose of this mail, the audio sink uses a timerfd to wake up
when the alsa buffer fill level is below a threshold. This causes the sink to
fetch a buffer from its input port queue and copy it to the alsa ringbuffer. It
then issues a pull to fetch more data from all linked peer nodes for which there
is nothing queued. These peers will then eventually push another buffer in the
sink queue to be picked up in the next pull cycle of the sink. This is somewhat
similar to the JACK async scheduling model. In the generic case, PipeWire has to
walk upstream in the graph until it finds a node that can produce something (see
below how this can be optimized).

Scheduling of nodes is, contrary to JACK's (and LADSPA's and LV2's) single
'process' method, done with 2 methods: process_input and process_output. This is
done to support more complex plugins that need to decouple input from output and
to also support a pull model for plugins. For internal clients, we directly call
the methods; for external clients we use an eventfd and a shared ringbuffer to
send the right process command to the client.

When the external client has finished processing or needs to pull, it signals
PipeWire, which then wakes up the next clients if needed. This is different from
JACK, where a client directly wakes up the peers to avoid a server context
switch. JACK can do this because the graph and all client semaphores are shared.
PipeWire can't in general, for a couple of reasons: 1) you need to bring mixing
of arbitrary formats to the clients, 2) sandboxed clients should not be trusted
with this information and responsibility. In some cases it would probably be
possible to improve that in the future (see below).

This kind of scheduling works well for generic desktop style audio and video.
Apps can send buffers of the size of their liking. Bigger buffers mean higher
latency but less frequent wakeups. The sink wakeup frequency is determined by
the smallest buffer size that needs to be mixed. There is an upper limit for the
largest amount of data that is mixed in one go to avoid having to do rewinds in
alsa and still have reasonable latency when doing volume changes or adding new
streams etc.

The idea is to make a separate part of the graph dedicated to pro-audio. This
part of the graph runs with mono 32bit float sample buffers of a fixed size and
samplerate. The nodes running in this part of the graph also need to have a
fixed input-output pattern. In this part of the grap