In article 
<[EMAIL PROTECTED]>,
 jlevine <[EMAIL PROTECTED]> wrote:

> Hello,
> 
> > While it's unlikely that I will soon get to build such an instrument, I
> > am quite interested in how they are built, if only to understand what
> > can happen and why.  Can you suggest some articles and/or books and/or
> > patents delving into both the theory and the practicalities of building
> > DMTD instruments?
> 
>    We (the time and frequency division of NBS/NIST) designed and built
> a dual-mixer system in 1980 (more or less). This same system is the one
> that still runs the atomic clock ensemble in Boulder. You can get the
> publications that describe this instrument from the publications database
> on our web site. Go to tf.nist.gov and click on the publications menu.
> When the menu appears, look for author Glaze. The stuff was published in
> about 1983 or so. There were several papers as I recall, with various
> combinations of the folks who built the system and the software drivers
> for it.

This is precisely the kind of pointer I was hoping for.  Thanks.


>    The system we built was totally analog, but a modern system would probably
> be fully digital. Our system had a resolution of about 0.2 ps and a
> stability of about 3-4 ps. A digital system could do better, mostly because
> the temperature-sensitive stuff could be confined to the analog front end,
> whereas we had to worry about temperature pretty much everywhere in the
> system.

That isn't bad for 1980 analog electronics.  I think that the 5120 is 
the digital realization, as discussed in other postings.  That said, the 
5120 is temperature sensitive, and one had to allow many hours for 
temperatures to stabilize, but then the resolution appeared to be about 
0.01 ps.  I assume that the improvement from 0.2 ps was due to the fancy 
matched-mixers trick, combined with use of a very low noise oscillator.


> However, the job is not trivial, since even tiny impedance mismatches can
> cause problems at this sub-picosecond resolution. You should watch especially
> for the connectors and the cables. We typically use SMA connectors and
> rigid coax. The inputs are buffered with distribution amplifiers with
> a reverse isolation that is as good as we can make it. About -165 dB, I
> think, although I have not looked at that recently. (Note that the limiting
> problems are not in digital computing power but in plain old analog
> electronics.)

As I said, I don't think I will be building such an instrument.  But 
it's just this kind of nitty-gritty detail I want to be aware of, for 
interest, and for self-protection in the lab.


>    Even so, we have a detectable sensitivity to temperature at the
> level of ps. This noise level tends to be too small to affect the
> data from cesium standards, but it could be a problem if you were trying to
> calibrate the long-period performance of a device or a transmission system
> that had a small delay, since the residual diurnal temperature sensitivity
> could come to get you.

What we were doing was measuring the temperature coefficient of 
electrical length of a temperature-stable 10 MHz distribution amplifier, 
the goal being a tempco not exceeding 1.0 ps per degree centigrade.  
Some of the tested amps achieve ~0.5 ps/degree C, in a total delay of 
~4.5 nanoseconds, or ~111 ppm per degree C, call it 100 ppm.
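
The ppm figure above follows directly from the quoted numbers; here is the 
arithmetic as a small sketch (the function name is mine, not from the 
original setup):

```python
# Convert an absolute delay tempco (ps/degC) into a fractional tempco
# (ppm/degC), given the amplifier's total delay (ns).
def tempco_ppm_per_degC(delay_tempco_ps: float, total_delay_ns: float) -> float:
    """Fractional delay tempco in ppm/degC."""
    return (delay_tempco_ps * 1e-12) / (total_delay_ns * 1e-9) * 1e6

# ~0.5 ps/degC change in a ~4.5 ns total delay:
print(round(tempco_ppm_per_degC(0.5, 4.5), 1))  # 111.1
```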

The test consisted of measuring changes in total delay at three 
temperatures: 17, 24, and 31 degrees C.  The problem is that it took at 
least an hour for the amplifier to stabilize at each temperature, so 
instrument drift is a significant source of error.  The measured "RC" 
time constant of the amplifier's delay in the chamber is 14 minutes.
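
Given delay readings at the three setpoints, the tempco is just the slope of 
a least-squares line.  A minimal sketch (my own framing, with made-up 
illustrative delay values, not the original analysis):

```python
# Ordinary least-squares slope of delay (ps) versus temperature (degC).
def fit_tempco(temps_C, delays_ps):
    """Slope (ps/degC) of a least-squares line through (temp, delay)."""
    n = len(temps_C)
    mt = sum(temps_C) / n
    md = sum(delays_ps) / n
    num = sum((t - mt) * (d - md) for t, d in zip(temps_C, delays_ps))
    den = sum((t - mt) ** 2 for t in temps_C)
    return num / den

# Hypothetical delay changes measured at the three chamber setpoints,
# relative to the 17 degC reading:
temps = [17.0, 24.0, 31.0]
delays = [0.0, 3.5, 7.0]          # ps
print(fit_tempco(temps, delays))  # 0.5
```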

My solution was to compare the amplifier under test to a mechanical 
variable delay unit (Colby Instruments PDL-100A-625PS-5.0NS), using a 
fast sampling scope (200 femtosecond rms jitter(?), averaged down to ~50 
fs) as the null detector.  
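
The averaging step above relies on the usual white-noise rule: averaging N 
uncorrelated samples reduces the rms by sqrt(N), so going from 200 fs down 
to ~50 fs implies averaging on the order of 16 samples.  A sketch of that 
arithmetic (the function is mine):

```python
import math

# Samples to average, assuming white (uncorrelated) jitter, to reduce
# a single-shot rms to a target rms: N = (rms_single / rms_target)^2.
def averages_needed(rms_single_fs: float, rms_target_fs: float) -> int:
    """Averaging count needed to reach a target rms jitter."""
    return math.ceil((rms_single_fs / rms_target_fs) ** 2)

print(averages_needed(200.0, 50.0))  # 16
```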

The specific circuit is a low-noise oscillator (Symmetricom 1050A) 
driving the first splitter, one output driving the scope sync input, the 
other driving the input of the second splitter.  One output of the 
second splitter drives the reference path, which contains the variable 
delay unit.  The other output drives the device path, which contains the 
amplifier under test.  Both device and reference path cables pass 
through the environmental chamber, with the heated lengths held equal.  
The cables are low tempco as well (~1.5 ppm per degree C).  Everything 
was 50-ohm, at least nominally, but no attempt at precision matching or 
isolation was made, and the connectors and adapters were a mix of 
whatever could be scrounged up in the lab.

This setup yielded clean data, easily sufficient for the purpose.  The 
main limits to accuracy appear to be hysteresis in the amplifiers under 
test, and the cyclic temperature variation of the environmental chamber 
itself.


> If you are in this business then you need professional help.

Heh.  I've been told this before, but the issue was not the measurement 
of time.

Anyway, the current measurements are complete.  But I expect the issue 
to arise again, and I'll be doing some homework and research in the 
meantime.


Joe Gwinn


> Judah Levine
> Time and Frequency Division
> NIST Boulder

_______________________________________________
questions mailing list
questions@lists.ntp.org
https://lists.ntp.org/mailman/listinfo/questions