Hey all

I once read about a simple and robust way to perform latency
measurements with an audio signal. 

Explained in a few words: the test signal is a sweeping sine
tone. The returned signal ring-modulates the source signal, and the
resulting signal contains two frequencies, the sum ( f_src + f_ret )
and the difference ( f_src - f_ret ). For a linear sweep the
difference frequency is proportional to the latency and can be
detected quite easily with e.g. [sigmund~].
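For what it's worth, here is a minimal sketch of the idea outside Pd, in Python/NumPy. It assumes a linear sweep with rate k (Hz/s) and simulates the return path as a pure delay; the product signal then contains a constant difference tone at k * latency, which an FFT peak search recovers. The cutoff value and sweep parameters are my own assumptions for the demo, not from the original method.

```python
import numpy as np

sr = 48000
dur = 1.0
f0, f1 = 100.0, 2000.0
k = (f1 - f0) / dur            # sweep rate in Hz/s (assumed linear sweep)
latency = 0.010                # 10 ms round-trip delay to recover

t = np.arange(int(sr * dur)) / sr
phase = 2 * np.pi * (f0 * t + 0.5 * k * t**2)
src = np.sin(phase)

# simulate the return path as a pure delay of `latency` seconds
d = int(round(latency * sr))
ret = np.concatenate([np.zeros(d), src[:-d]])

prod = src * ret               # ring modulation: src * ret

# the difference component sits at k * latency (constant for a linear
# sweep); find it with an FFT, looking only below an assumed cutoff so
# the sweeping sum component is ignored
spec = np.abs(np.fft.rfft(prod * np.hanning(len(prod))))
freqs = np.fft.rfftfreq(len(prod), 1 / sr)
band = freqs < 200.0           # Hz, assumed upper bound for the difference tone
f_diff = freqs[band][np.argmax(spec[band])]
print(f"difference tone: {f_diff:.1f} Hz")
print(f"estimated latency: {f_diff / k * 1000:.2f} ms")
```

With these numbers k = 1900 Hz/s, so a 10 ms delay should show up as a difference tone near 19 Hz, and the latency estimate is simply f_diff / k. In Pd the same peak search is what [sigmund~] would do on the ring-modulated signal.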

I have trouble finding the name of this algorithm and don't remember
the original source. I would like to read more about it and correctly
attribute the original author / inventor.

Thanks,
Roman  


_______________________________________________
Pd-list@lists.iem.at mailing list
UNSUBSCRIBE and account-management -> 
https://lists.puredata.info/listinfo/pd-list
