On Tue, 7 May 2024 19:53:45 +0200
Ichthyostega <[email protected]> wrote:

>Hi Yoshimi-devs,
>
>...there is a topic and an issue which I have been encountering for a long
>time; Will and I have discussed it on and off, and just recently Will brought
>up a related idea, which encourages me to pick this discussion up again....
>
>> On Thu, 14 Jul 2022 19:01:00 +0200 Ichthyostega <[email protected]> wrote:
>>  
>>> For my own work, I have been fighting with a problem for quite some time,
>>> but I'm not really sure how to approach it best. If I recall correctly, I
>>> have mentioned it several times in the past. The problem is: if you build
>>> an instrument based on some distinct sonic character, it often turns out
>>> that the result is not balanced in itself over the various octaves, and
>>> sometimes it also does not "run" well, again over a span of several
>>> octaves.
>>>
>>> This problem manifests itself with some obnoxious symptoms: if you play a
>>> chord, either the high end or the low end dominates. And the balance tends
>>> to slide away when the score moves up or down.
>>>
>>> Similar problems are known from conventional instruments, and there they
>>> are simply considered symptoms of inferior craftsmanship. A well-made
>>> instrument is expected to have a smooth "run" over various octaves
>>> and a similar "presence" in the high as in the low register; it is
>>> expected to work well when played in ensemble, without requiring
>>> excessive tweaks or "stunts" from the musician.
>>>
>>> Some ideas to alleviate such problems:
>>>
>>> - possibility to define a volume adjustment based on the key,
>>> as part of the instrument itself (not a controller)
>>>
>>> - fading in/out and crossfading of layers, again based on volume,
>>> not velocity.  
>
>
>Am 15.07.22 um 19:08 schrieb Will Godfrey:
>
>> Yes. I agree it should be instrument specific, although where exactly?
>> Would this be a whole instrument setting, or per engine?  
>
>
>
>On 07.05.24 14:35, Will Godfrey wrote:
>> This is something that's been in the back of my mind for years.
>> We both know that using velocity is not exactly ideal!
>>
>> However, today I had a close look at the note event.
>> The only parameter actually carried 'within' a note is velocity.
>> See Misc/Part.h Line 231
>>
>> So I wondered what would happen if I expanded the struct... and stuck
>> a float after it, then compiled it and played one of my heavyweight tracks
>> with no noticeable change in performance.
>>
>> Now obviously just increasing the memory footprint is not going to do too
>> much, but I'm wondering if this could be a route to getting individual note
>> control of volume.
>>
>> Further to that, if this is viable, it moves us into new Yoshimi-specific
>> territory, so why restrict it to crossfading volume? Could we instead
>> make it a sort of general-purpose channel for any other control that's
>> specific to particular notes?  
>
>
>First off, I can confirm that adding this or that further check or simple
>calculation to the note-on path typically does not result in any measurable
>performance difference. Presumably there are limits; I'd expect it to
>become problematic when we have to feed through extended amounts of data, or
>when we'd have to do computations taking longer than roughly 10 µs.
>
>So it does not seem to pose any problem to have note-on make some further
>adjustments to parameters which are computed there anyway. Two other questions
>are much more tricky:
>
>- where does this information come from? Is it fed in alongside the
>   note info? Is there even a way to do so in the existing standards? I have
>   only a rough understanding of the MIDI standard and have not watched that
>   topic too closely. Right now we are using the controllers, which are
>   rather global feeds, however. On the other hand, to address the specific
>   issue of balancing an instrument, it would be sufficient to /derive/
>   the adjustment from a lookup table keyed by the MIDI note.
>
>- how can we integrate such controls into the existing GUI? This seems to
>   be a limiting factor, because the GUI is already quite well populated.
>   A limited version of such an adjustment or coupling already exists, namely
>   the velocity sensitivity, which is implemented by dedicated control knobs
>   in some places. Clearly there are limits on carrying this idea further.
>
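To make the lookup-table idea above concrete, here's a very rough sketch of what could sit on the note-on path. All names are invented for illustration; none of this comes from the actual Yoshimi sources. The designer sets a ramp between two anchor keys (in dB), everything in between is interpolated, and note-on just multiplies by the resulting factor:

```cpp
#include <array>
#include <cmath>

// Hypothetical sketch: a per-key gain table for one instrument.
// Anchor points (key -> dB adjustment) are set by the instrument
// designer; keys in between are interpolated, so the "run" over
// the octaves stays smooth.
struct KeyGainTable
{
    std::array<float, 128> gainDb{};   // one entry per MIDI note, in dB

    // Fill part of the table by linear interpolation between two anchors.
    // Precondition: loKey < hiKey (both in 0..127).
    void setRamp(int loKey, float loDb, int hiKey, float hiDb)
    {
        for (int k = loKey; k <= hiKey; ++k)
        {
            float t = float(k - loKey) / float(hiKey - loKey);
            gainDb[k] = loDb + t * (hiDb - loDb);
        }
    }

    // Called on the note-on path: convert the dB entry to a linear factor.
    float factorFor(int midiNote) const
    {
        return std::pow(10.0f, gainDb[midiNote] / 20.0f);
    }
};
```

Something like `table.setRamp(36, +3.0f, 96, -4.0f)` would lift the bass and tame the top end. One table lookup plus one multiply per note-on is well within the "no measurable difference" budget described above.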

I'm not only thinking of GUI access. I've been watching MIDI developments fairly
closely over the last few years. I quickly dismissed MPE as a stopgap that will
go the way of the Dodo, but MIDI 2 is something quite different. There are now
real keyboard controllers out that fully support it (albeit priced too high for
most people). One focus of MIDI 2 (among several others) is individual note
control. 'Property Exchange' is especially interesting.

There are more important issues we need to sort out within Yoshimi first, but I
think we should at least keep this in mind while doing other work. Making
inroads into note control would be good practice, as would sorting out
crossfade properly. Also, I counted 55 CCs that have no official MIDI
designation and could be applied to this - to say nothing of NRPNs.
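As a rough illustration of how NRPNs could carry per-note data within plain MIDI 1, here is a sketch of a handler where the NRPN MSB selects "per-key gain" and the NRPN LSB addresses the key. The MSB value 0x50 is invented, not any official assignment, and this is not how Yoshimi currently routes NRPNs:

```cpp
#include <cstdint>

// Hypothetical sketch: routing an unassigned NRPN number to a per-key
// volume offset. Address: NRPN MSB = 0x50 (invented) selects the
// function, NRPN LSB = MIDI key number; Data Entry MSB carries the
// gain, with 64 meaning "no change".
struct PerKeyGainNrpn
{
    static constexpr uint8_t NRPN_MSB_PER_KEY_GAIN = 0x50; // invented value

    uint8_t nrpnMsb = 127, nrpnLsb = 127;  // 127/127 = NRPN "null"
    float keyGain[128] = {};               // dB offset per key

    // Feed CC events in; returns true when a per-key gain was updated.
    bool handleCC(uint8_t cc, uint8_t value)
    {
        switch (cc)
        {
        case 99: nrpnMsb = value; return false;  // NRPN MSB select
        case 98: nrpnLsb = value; return false;  // NRPN LSB select
        case 6:                                  // Data Entry MSB
            if (nrpnMsb == NRPN_MSB_PER_KEY_GAIN && nrpnLsb < 128)
            {
                // Map 0..127 to roughly -12.8..+12.6 dB, 64 = 0 dB.
                keyGain[nrpnLsb] = (int(value) - 64) * 0.2f;
                return true;
            }
            return false;
        default: return false;
        }
    }
};
```

CC 98/99 for NRPN LSB/MSB and CC 6 for Data Entry are the standard MIDI 1 assignments; only the function number and the dB mapping are made up here.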

Incidentally, recent versions of ALSA do have MIDI 2 support, although I don't
know how complete it is.

-- 
Will J Godfrey {apparently now an 'elderly'}



_______________________________________________
Yoshimi-devel mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/yoshimi-devel