On 2016-01-07 13:35, Attila Kinali wrote:
> On Wed, 6 Jan 2016 18:30:07 -0800
> "Richard (Rick) Karlquist" <rich...@karlquist.com> wrote:
>
>> Read Gilbert's paper or Gray and Meyer's analog IC textbook and
>> you will see that the whole theory of operation of these
>> depends on keeping the signal levels in them very small,
>> especially if linearity (actually translinearity) is
>> important.  They always have current sources in the
>> emitters that contribute a lot of noise.  So you have
>> small signals and large noise.  The ICs that are
>> designed to be DC coupled have even more sources of
>> extra noise.
>
> How about using the Gilbert cell as a "digital" mixer,
> i.e. driving the currents hard from one branch to the other
> and replacing the current sources with resistors?
>
> How much would that improve the noise? Would it still be much
> worse than the diode mixer?

I think so.

I checked the MC1496 (just one sample point of a classic Gilbert-cell chip): it takes at most 25 mV peak, or about -22 dBm, at the input before it starts to compress. Looking at the VSWR curves, it is clear that it starts to misbehave there.

Compare that to the SBL-1+ double-balanced mixer (another random sample point), which has a maximum LO drive of +7 dBm: you are looking at a difference of about 30 dB (29 to be exact, though neither number is accurate to within 1 dB anyway).
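As a sanity check on those numbers, here is a minimal sketch of the peak-voltage-to-dBm conversion. It assumes a sine wave into a 50 Ω load (the load impedance is my assumption; the figures quoted above don't state it):

```python
import math

def peak_mv_to_dbm(v_peak_mv, r_ohms=50.0):
    """Convert a sine wave's peak voltage (in mV) into dBm,
    assuming the given resistive load."""
    v_rms = (v_peak_mv / 1000.0) / math.sqrt(2)   # peak -> RMS, volts
    p_watts = v_rms ** 2 / r_ohms                 # average power
    return 10 * math.log10(p_watts / 1e-3)        # relative to 1 mW

# MC1496 compression point quoted above: 25 mV peak
mc1496_dbm = peak_mv_to_dbm(25)
sbl1_lo_dbm = 7.0  # SBL-1+ maximum LO drive

print(round(mc1496_dbm, 1))              # about -22.0 dBm
print(round(sbl1_lo_dbm - mc1496_dbm))   # about 29 dB of headroom difference
```

This reproduces both figures in the comparison: 25 mV peak into 50 Ω is roughly -22 dBm, and the gap to the SBL-1+'s +7 dBm LO drive comes out to about 29 dB.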

The MC1495, a linearized variant of the MC1496 (only true to a degree; it's more complex than that), easily accepts 5 V signals, but internally you are back down to about the same levels.

So, while you can drive things harder, you can do that on both sides. If the gap were smaller I'd say it wouldn't matter much, but here the difference is relatively large.

Anyway, just wanted to put a few numbers down to illustrate the difference.

Cheers,
Magnus
_______________________________________________
time-nuts mailing list -- time-nuts@febo.com
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.