(really need to start using reply-all, sorry for the double message, brian. 
the mailing list wasn’t included).

hello sir,

thank you for an incredibly detailed, thoughtful and quick response.

first off: it’s not usb fam! lol, usb. this sucker is PCI, so we’re doing DMA! 
:)


> On Dec 10, 2022, at 6:34 PM, brianw <[email protected]> wrote:
> 
> 
> 
> On Dec 10, 2022, at 2:07 PM, Gagan Sidhu via Coreaudio-api 
> <[email protected]> wrote:
>> how does one implement an SPDIF toggle?
> 
> What is a "toggle"? Are you talking about an Enable, or a Mute, or some other 
> sort of switch? I've not heard the term used within the CoreAudio universe, 
> so it's a difficult question to answer without more information.
> 
> ... and then, what is an "SPDIF toggle"?
> 
> Perhaps you should step up a conceptual level and describe what you are 
> trying to accomplish. Are you creating support for SPDIF? Are you trying to 
> gain access to pre-existing support for SPDIF? Are you even sure that you 
> need SPDIF, per se, and not merely a general CoreAudio stream?
> 
objective: i am trying to wrap this big alsa driver, written by an elusive 
german fellow named Clemens Ladisch, for a family of sound cards that is still 
cutting-edge even 14 years later, so it runs on APPUL desktops.

all of his code is here: 
https://elixir.bootlin.com/linux/v4.14.298/source/sound/pci/oxygen/

i originally started with SamplePCIAudioDevice, which is outdated since 
(correct me if i’m wrong) it was originally developed back when APPUL machines 
were PPC: 

https://opensource.apple.com/source/IOAudioFamily/IOAudioFamily-183.4.2/Examples/Templates/SamplePCIAudioDriver/


> 
>> we could use an IOAudioToggleControl, sure,  but i noticed 
>> kIOAudioSelectorControlSelectionValueSPDIF is in an enum.
> 
> I am not familiar with the use-case for IOAudioToggleControl, so I'd probably 
> have to see an example (code or user interface) before I could attempt to 
> comment.
> 
> Looking at the headers that define 
> kIOAudioSelectorControlSelectionValueSPDIF, it looks to me like an Audio 
> Selector Control is a way to choose a particular type of connector on a 
> physical audio interface connection. For example, the typical MacBook has 
> internal speakers and a headphone jack, but only one audio output that can be 
> directed to one or the other of these two destinations. There is thus an 
> Audio Selector Control that allows an application to query or set the Device 
> connection type to kIOAudioSelectorControlSelectionValueInternalSpeaker or 
> kIOAudioSelectorControlSelectionValueHeadphones. Different physical hardware 
> interfaces will have different valid options available.
> 
> kIOAudioSelectorControlSelectionValueSPDIF is an enum because it is just one 
> of eight possible selections for an Audio Selector Control. Looking at the 
> header comments, three options are output-specific, three are input-specific, 
> and two are common to both output and input. SPDIF is one of the types that 
> happens to be valid for both input and output. One enum is for 'none', but we 
> won't count that. Thus, for physical inputs, there are five valid options, 
> and a given audio interface device will have some number of these. Same for 
> physical outputs - five valid options in total, two in common with input and 
> three unique to output - with a given audio interface device allowing one or 
> more of these.
> 
> It's probably worth pointing out a few things. You can't force physical 
> hardware into SPDIF mode unless its electronics are actually designed to 
> support it. This setting won't really be appropriate unless your code is 
> talking directly to a CoreAudio physical device. Granted, most of your code 
> might be dealing with a CoreAudio stream before it reaches the hardware, and 
> so it's worth pointing out that SPDIF isn't a concern until the audio stream 
> reaches a physical interface.
> 
> Finally, there's almost surely going to be a hardware-specific driver that 
> determines whether SPDIF is available as an option or not, otherwise 
> selecting it will probably return an error. There is probably some mapping 
> between USB Audio Class and CoreAudio where the macOS driver for all UAC 
> devices can automatically translate - however, I'm not sure whether UAC 
> actually has this, so it's probably an avenue where more research is needed 
> (the USB to CoreAudio aspect, that is). Non-USB audio interfaces will surely 
> need a custom driver to facilitate support for S/PDIF.
> 
> I did not immediately see a way to query a device for a list of Audio 
> Selector Control types that are available on a specific input or output 
> stream, but knowing CoreAudio there is probably a way.
> 
> 
>> and if my usage is appropriate, another question would be about handling a 
>> massive number of controls that involve monitor modes.
> 
> What exactly are you trying to do here? CoreAudio might not be appropriate 
> for monitor modes, at least not unless they represent different I/O 
> Selections. The answer probably depends upon whether changing monitor modes 
> is more of an audio processing change or a physical patching change (or both).
> 
> There seems to be some mapping between the USB Audio Class Descriptors and 
> CoreAudio controls, at least for some types of features, but I've not looked 
> too deeply into that. Often, the mapping loses too much in the translation, 
> so there are probably limits to how fancy you can get here.

i get what you’re saying. i may need to leave some features on the table for 
now and focus on core functionality.

i think at a minimum i need more than just the “master controls”, and some kind 
of SPDIF support would be nice. as far as we’re concerned, everything on the 
“low-end” can be considered done. i know for sure all of the system-level calls 
are translating, because the last time i loaded the driver it finished init 
(though i hadn’t even started on the audio stream/mixers).
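
for what it’s worth, here is the kind of thing i have in mind for the SPDIF 
selector. this is a rough, uncompiled sketch against IOAudioFamily 
(IOAudioSelectorControl.h / IOAudioTypes.h); the channel ID is a guess and 
outputSelectorChanged is a hypothetical handler i’d still have to write, 
analogous to ALSA’s .put:

```cpp
// uncompiled sketch, intended to live inside an IOAudioEngine subclass;
// outputSelectorChanged is hypothetical and would poke the oxygen registers
IOAudioSelectorControl *sel =
    IOAudioSelectorControl::createOutputSelector(
        kIOAudioSelectorControlSelectionValueSPDIF,  // initial selection
        kIOAudioControlChannelIDAll);
if (sel) {
    // advertise only the physical outputs this card actually has
    sel->addAvailableSelection(
        kIOAudioSelectorControlSelectionValueSPDIF, "S/PDIF");
    sel->addAvailableSelection(
        kIOAudioSelectorControlSelectionValueLine, "Line Out");
    // like alsa's .put: called when the HAL writes a new value
    sel->setValueChangeHandler(outputSelectorChanged, this);
    addDefaultAudioControl(sel);
    sel->release();
}
```

if something along those lines builds and the handler flips the right 
registers, i’m hoping the HAL presents it without much extra plumbing on my 
side.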

the focus is on ensuring all of the coreaudio HAL pieces are in the right 
place. i understand ALSA and CoreAudio are very different, so the placement may 
differ for, say, the mixers (which ALSA initialises in oxygen_probe, before 
initialisation completes): 
https://elixir.bootlin.com/linux/v4.14.298/source/sound/pci/oxygen/oxygen_lib.c#L691
> 
> 
>> right now we have a Level and Toggle control, and yes we can use the more 
>> generic IOAudioControl::create, but it seems to me there hasn’t been anyone 
>> asking questions of this nature.
>> 
>> the closest was Yves in 2011: 
>> https://lists.apple.com/archives/coreaudio-api/2011/Mar/msg00004.html
>> 
>> i found willoughby’s answer very interesting in terms of generalisation. for 
>> example, ALSA’s snd_kcontrol_new requires an ‘info’ and ‘get’ function.
>> 
>> yet for coreaudio there is no equivalent. it is almost as if, as long as we 
>> can specify the hardware control for setting things (like alsa’s put), that 
>> “coreaudio will take care of the rest”.
> 
> It's incredibly difficult to design generic support that includes every 
> possible variation and yet automates the mapping between different systems. 
> i.e. The more capable and flexible a design is, the less it can be 
> automatically mapped to some other paradigm. Apple has done quite a good job 
> of designing CoreAudio so that USB Audio Class Devices can be fully 
> supported, but mapping to other systems (ALSA?) may reveal a mismatch between 
> the available building blocks in each system.
> 
> If I were working on this, I'd purchase some of the most complex USB Audio 
> Class Devices available on the market - that are known to be (fully) 
> supported by macOS - and then take a detailed look at both the USB 
> Descriptors and the CoreAudio objects that those descriptors are translated 
> to. Apple provides a couple of tools like HAL Lab that allow visibility into 
> the CoreAudio details for specific devices, but you can also write code to 
> walk the device and object trees to see how things are put together.
> 
> Once you get familiar with a complicated device, this should give you an idea 
> of how things work in CoreAudio, and then via the header files and comments 
> you can perhaps expand on the specifics that you're trying to support. I've 
> had good luck writing custom code to query and display as much data as 
> possible about my particular setups.


a question on this note: are you aware of any PCI sound cards ever released for 
OS X 10.5 or later? i doubt i’ll be able to find any driver source code, but i 
wanted to ask you.

even with creative’s most recent USB offerings (the soundblaster X3, now the 
X4), i am not inclined to believe they’re doing full driver-level programming. 
instead, i’m betting an on-board CPU handles everything, allowing them to ship 
a driver that simply handles mixing and volume controls. the reason i say this 
is that the device is not supported by ALSA, yet users report they can still do 
some non-trivial things without any driver 
(https://www.reddit.com/r/SoundBlasterOfficial/comments/qgsmr7/sound_blaster_x3_without_drivers/ 
as an example).

in short, this was an ambitious undertaking, but i believe there are real 
benefits for REAL APPUL users, because right now the management is about to 
bury the platform in the name of braindead casuals.

> On Dec 10, 2022, at 6:34 PM, brianw <[email protected]> wrote:
> 
> [...]
> 
> Brian Willoughby
> 
> 
>> On Sat, Dec 10, 2022 at 1:51 PM Gagan Sidhu via Coreaudio-api 
>> <[email protected]> wrote:
>>> [...] assistance in wrapping this ALSA code to coreaudio (i’m using osx < 
>>> 11)
>>> 
>>> https://github.com/i3roly/CMI8788
>>> 
>>> Thanks,
>>> Gagan

 _______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      ([email protected])
Help/Unsubscribe/Update your Subscription:
https://lists.apple.com/mailman/options/coreaudio-api/archive%40mail-archive.com

This email sent to [email protected]
