Everywhere in the exact sciences there has been a dualism between statistical analysis and deterministic engineering tools, ever since the major breakthrough in quantum physics at the beginning of the 20th century. Whether that is some sort of diabolical duality or, as it actually is at the higher levels of mathematics, a natural state of affairs, with on one side theoretical science that properly decorrelates what is not connected, and on the other side the more practically minded scientists who construct working theories and machinery on the basis of deterministic ("analytic" or number-based) hypotheses, depends in my opinion on the nature of the beast.

In physics, the strong and hard mathematical foundation of the main solutions to the well-known quantum mechanical equations comes primarily from physical observations: nature appears to play a lot of dice at some level, whether we like it or not! That is a real given, not a lack of high-frequency measurements or of practical knowledge about electromagnetics, waveguides and linear and non-linear electronic networks; from a century ago until this day it holds because physical laws, to incredible accuracy, appear to be based on pure statistics and on hard givens about "causality".

Electronics in the higher frequency ranges has, since the beginning, usually been designed in terms of networks (oscillators, mixers, amplifiers, cables), EM-field considerations (antennas, waveguides) and quantum mechanics at the level of transistor design. Many fields around communications have obviously progressed over the last decades, including better measurement equipment, better high-speed digital processing tools, and design software for creating (digital) transmitters and receivers.

Recently I've seen Agilent software for designing the digital transmitters and other circuitry of mobile phones and similar applications, and had some hands-on experience with Keysight Technologies oscilloscopes in the many-tens-of-gigahertz range. It is pretty interesting to actually be able to sample signals of tens of GHz into computer memory and, for instance, do eye-diagram analysis on digital signals, or play with the various statistics modules in such a device.
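Just to make "eye-based analysis" concrete, here is a minimal sketch (not taken from any scope vendor's software) of how an oversampled bit stream can be folded into an eye diagram and a simple opening statistic computed. All the numbers (oversampling factor, filter, noise level) are made-up assumptions for illustration:

    # Hypothetical sketch: fold an oversampled NRZ bit stream into an "eye".
    import numpy as np

    rng = np.random.default_rng(0)
    samples_per_symbol = 32                            # assumed oversampling
    bits = rng.integers(0, 2, 2000) * 2.0 - 1.0        # random NRZ levels +/-1

    # Crude channel model: repeat each bit, smooth it a little, add noise.
    signal = np.repeat(bits, samples_per_symbol)
    signal = np.convolve(signal, np.ones(8) / 8.0, mode="same")
    signal += 0.05 * rng.standard_normal(signal.size)

    # Fold into two-symbol-wide traces: each row is one sweep of the eye.
    trace_len = 2 * samples_per_symbol
    n_traces = signal.size // trace_len
    eye = signal[: n_traces * trace_len].reshape(n_traces, trace_len)

    # One statistic a scope might report: vertical opening at the eye centre.
    mid = trace_len // 2
    highs = eye[:, mid][eye[:, mid] > 0]
    lows = eye[:, mid][eye[:, mid] < 0]
    print("eye opening estimate:", highs.min() - lows.max())

Plotting the rows of "eye" on top of each other gives the familiar picture; the statistics modules in the real instruments of course do far more than this.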

I heard the story that some of the latest Xilinx high-speed FPGAs with their 28 Gb/s transceiver links, when connected over a backplane, create working "eye" diagrams, i.e. the communication works fine, but the measurement equipment fails to confirm this with a proper measurement. That is an interesting EE design dilemma right there: does the measurement equipment have to be better than the design at hand, or do you need a bigger and faster computer than the target computer system you're designing, and so on.

So the statistics being discussed here come mainly, I think, from the information theory used in electronics, and some people, as is normal in that field, find it fun to pull out some isolated (simpler) components, like basic statistical signal considerations, in the hope of easily designing some competing digital communication protocol. Scientific relevance: close to zero, unless maybe you get lucky.

With respect to musical signal analysis, it could be fun to theorize a bit about corner cases that have existed for a long time, like a noise source feeding a sample-and-hold circuit and making interesting tones and processing with that. Think of the S&H unit from a classic 60s modular Moog synthesizer, which can probably be clocked with a varying clock and with feedback signals; a minimal sketch of such a patch follows below. The prime objective at the time was probably more related to finding out the deep effects of sampling on signals, and to encoding small-signal corrections in analog signals for when they were going to end up on CD. My guess...
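For the curious, a rough digital sketch of that kind of patch, noise into a sample-and-hold driven by a varying clock, could look like the following. The sample rate, sweep range and uniform noise are my own assumptions, not a model of the actual Moog circuit:

    # Hypothetical sketch: noise source -> sample-and-hold with a varying clock.
    import numpy as np

    sr = 48000                           # audio sample rate (assumed)
    n = int(sr * 2.0)                    # two seconds of output
    rng = np.random.default_rng(1)
    noise = rng.uniform(-1.0, 1.0, n)    # the noise source

    # Clock rate sweeps from 2 Hz to 30 Hz over the duration ("varying clock").
    clock_hz = np.linspace(2.0, 30.0, n)
    phase = np.cumsum(clock_hz) / sr     # accumulated clock phase in cycles

    out = np.empty(n)
    held = 0.0
    last_cycle = -1
    for i in range(n):
        cycle = int(phase[i])            # phase crossing an integer = clock edge
        if cycle != last_cycle:
            held = noise[i]              # sample the noise on each clock edge
            last_cycle = cycle
        out[i] = held                    # hold the value until the next edge

'out' is the familiar stepped random signal; routing it back into the clock rate, or into a filter cutoff, gives the classic self-modulating S&H patches.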

T.V.
_______________________________________________
dupswapdrop: music-dsp mailing list
music-dsp@music.columbia.edu
https://lists.columbia.edu/mailman/listinfo/music-dsp
