There appears to be an implicit assumption that peak vs. average detection
yields a 20 dB difference. I don't understand that. If the signal is CW,
like a clock, it won't make any difference which detector you are using. If
the signal has some modulation, then the peak/average detector output ratio
will depend on the modulation, right? What am I missing?
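
To make the point concrete, here is a minimal sketch of what I mean,
assuming ideal detectors and simple on/off pulse modulation (my assumption
for illustration, not anything taken from the rules):

    import math

    def detector_readings_dbuv(peak_dbuv, duty_cycle):
        # Assuming ideal detectors and on/off pulse modulation, the average
        # detector reads the peak level reduced by the duty cycle, i.e. a
        # 20*log10(duty_cycle) correction in dB.
        average_dbuv = peak_dbuv + 20 * math.log10(duty_cycle)
        return peak_dbuv, average_dbuv

    # CW (duty cycle 1.0): peak and average agree, so the detector is moot.
    print(detector_readings_dbuv(60.0, 1.0))   # (60.0, 60.0)

    # 10% duty cycle: the peak detector reads 20 dB above the average.
    print(detector_readings_dbuv(60.0, 0.1))   # (60.0, 40.0)

Under those assumptions a 20 dB peak/average spread corresponds to a 10%
duty cycle; a different modulation gives a different spread.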

----------
From: "Stuart Lopata" <stu...@timcoengr.com>
To: "emc" <emc-p...@majordomo.ieee.org>
Subject: Fw: FCC rule interpretation (add'l info)
List-Post: emc-pstc@listserv.ieee.org
Date: Thu, Aug 16, 2001, 9:50 AM
Does this imply that we can use 74 dBuV/m (at 3 meters) rather than the 54
dBuV/m limit if we take measurements employing peak detection? That would
amount to a 20 dB relaxation (74 - 54 = 20 dB), matching the assumed
peak/average difference.

I left that last part out in the previous question.
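
For what it's worth, the arithmetic the question turns on, under this
thread's assumed flat 20 dB peak/average allowance (my reading of the
discussion, not a citation of the rule), is just:

    # Figures taken from this thread, not a statement of the FCC rule:
    average_limit_dbuv_m = 54.0   # average-detector limit at 3 meters
    peak_allowance_db = 20.0      # assumed peak/average allowance

    def passes_with_peak_detector(peak_reading_dbuv_m):
        # Hypothetical check: compare a peak-detected reading against the
        # average limit relaxed by the assumed allowance (54 + 20 = 74).
        return peak_reading_dbuv_m <= average_limit_dbuv_m + peak_allowance_db

    print(passes_with_peak_detector(70.0))   # True: under 74 dBuV/m
    print(passes_with_peak_detector(80.0))   # False: over 74 dBuV/m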
