To elaborate:

BPSK-1000 uses "convolutional interleaving" with a depth of 16,384 symbols.
The symbol rate is 1 kHz (1,000 symbols/sec) so it takes 16.384 seconds for
a data symbol to pass through both the transmit and receive interleave
buffers. The transmitter delay changes a lot from one symbol to the next,
but every symbol experiences the same *total* (transmitter + receiver)
delay: 16,384 symbol times or 16.384 seconds. The idea of any interleaver is
to chop up (short) fades and spread them out in time so that they can be
easily corrected by the Viterbi error correction algorithm (which deals well
with random thermal noise but not with burst errors).
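
For the curious, here is a rough sketch of a generic Forney-type
convolutional interleaver pair in Python. This is not the BPSK-1000 flight
code and the parameters are left symbolic; the point is just the delay
bookkeeping:

    class ConvInterleaver:
        def __init__(self, branches, step):
            # Branch i is a FIFO of i*step cells; a commutator visits the
            # branches in turn, one symbol each.
            self.fifos = [[0] * (i * step) for i in range(branches)]
            self.branch = 0

        def push(self, symbol):
            fifo = self.fifos[self.branch]
            self.branch = (self.branch + 1) % len(self.fifos)
            if not fifo:                # the zero-delay branch
                return symbol
            fifo.append(symbol)
            return fifo.pop(0)

    class ConvDeinterleaver(ConvInterleaver):
        def __init__(self, branches, step):
            # Complementary delays: branch i is a FIFO of (branches-1-i)*step
            # cells, so every symbol passes through (branches-1)*step cells
            # in total -- the same end-to-end delay no matter which branch
            # it lands in, even though the transmit-side share varies.
            self.fifos = [[0] * ((branches - 1 - i) * step)
                          for i in range(branches)]
            self.branch = 0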

The usual rule of thumb is that an interleaver can easily handle a complete
fade lasting up to 10% of its length, as long as you give it time to recover
between fades. That would be 1.6 seconds, which seemed plenty long for a
continuously transmitting LEO spacecraft on 2 meters.
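
In numbers, just plugging in the figures above:

    span        = 16_384      # symbols through interleaver + deinterleaver
    symbol_rate = 1_000       # symbols/sec
    easy_fade   = 0.10 * span / symbol_rate
    print(round(easy_fade, 4))   # 1.6384 -- call it 1.6 seconds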

Of course, you pay a price in delay -- there's no way around it. I have
Sirius Satellite Radio in my car, and it always cuts out 4 seconds *after* I
drive into the parking garage at work. It doesn't come back until (at least)
4 seconds *after* I drive out and it sees the satellite(s) again. The reason
is exactly the same -- an interleaver that takes care of brief fades but not
the really long ones caused by driving into a parking garage.

I chose convolutional interleaving for BPSK-1000 because it has half the
delay of block interleaving for the same fade performance.
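
Taking that factor of two at face value (just arithmetic, not a claim about
any particular block interleaver design):

    conv_delay  = 16.384           # sec, what BPSK-1000 actually pays
    block_delay = 2 * conv_delay   # ~32.8 sec for comparable fade protection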

Convolutional interleavers also operate continuously, a good match to
ARISSat-1's continuous transmitter. At AOS, your deinterleaver is still full
of noise received earlier; it takes 16.384 seconds to flush it all out and
feed "solid" data to the decoder. During that time, it ramps from pure noise
to pure signal, and at some point it starts correcting what it sees.
Depending on how strong the signal is, that may happen before the flushing
is complete. I.e., it might reconstruct some of the missing symbols sent
before your AOS.
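
A toy model of that ramp -- not the real decoder, just the bookkeeping,
assuming the flush is roughly linear over one full interleaver span:

    SPAN = 16.384      # sec, total interleave + deinterleave delay

    def signal_fraction(t_since_aos):
        """Fraction of deinterleaver output that is post-AOS signal."""
        return min(max(t_since_aos / SPAN, 0.0), 1.0)

    for t in (0.0, 4.0, 8.0, 12.0, 16.384):
        print(f"{t:6.3f} s after AOS: {signal_fraction(t):4.0%} signal")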

Similarly there is a slow ramp from solid signal down to pure noise over
16.384 seconds at LOS.

See how this helps handle fading? Even an abrupt, complete fade starts the
same, slow 16-second ramp down from signal to noise. If the fade ends only a
second or two later, the rampdown won't have progressed very far and the
decoder will still see mostly signal when the trend reverses and ramps back
up to pure signal. That takes a few extra seconds, but the error correction
can easily handle it all -- as long as the fade isn't *too* long.
Interleaving takes a signal that may be solid one moment and gone the next
and smooths it out so that the signal-to-noise ratio changes only slowly. It
literally averages the signal-to-noise ratio.
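
Here's a toy illustration of that averaging. It models the decoder's input
at any instant as a uniform sample of the previous 16.384 seconds of channel
-- a simplification, but it shows the shape of what happens to a 2-second
fade:

    SPAN_SYMS = 16_384                  # interleaver span in symbols
    SYM_RATE  = 1_000                   # symbols/sec

    # 60 seconds of channel: 1 = symbol got through, 0 = faded out
    channel = [1.0] * 60_000
    for i in range(20_000, 22_000):     # a 2-second complete fade at t = 20 s
        channel[i] = 0.0

    def decoder_sees(t_sec):
        """Fraction of good symbols in the span feeding the decoder at t."""
        end   = int(t_sec * SYM_RATE)
        start = max(0, end - SPAN_SYMS)
        window = channel[start:end]
        return sum(window) / len(window)

    print(decoder_sees(19.0))   # before the fade: 1.0
    print(decoder_sees(21.0))   # mid-fade: ~0.94 -- the decoder barely notices
    print(decoder_sees(30.0))   # ~0.88, the worst it gets, then a slow recovery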

Since even a short LEO pass is usually several minutes long, these 16-second
fill/drain intervals didn't seem like a big deal. Besides, we've already had
a similar problem since the old days of the uncoded Phase III block
telemetry format. You might have AOS in the middle of a frame and have to
wait for the next one to start before you can decode anything. Interleaving
isn't really any worse.

The problem is that I didn't count on having the transmitter turned on for
only 40-60 seconds at a time. So... if the transmissions are only 40 seconds,
and if you have to wait 16.384 seconds for the interleaver to fill, and you
can't rely on the last 16.384 seconds as the interleaver drains, that leaves
40 - 2*16.384 = 7.232 seconds of solid, noise-free "middle" to work with.
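
The budget, spelled out:

    key_down = 40.0      # sec the transmitter stays on (pessimistic case)
    span     = 16.384    # sec to fill, plus 16.384 sec you can't trust at the end
    usable   = key_down - 2 * span
    print(round(usable, 3))   # 7.232 seconds of solid "middle"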

As I recall, ARISSat-1 data frames can be up to 512 bytes long. Ignoring
HDLC flags, bit stuffing, CRC, etc., that's 4,096 bits. At a data rate of 500
bps (the FEC is rate 1/2), 512 bytes will take 4096/500 = 8.192 seconds to
transmit.
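
Same arithmetic in code form:

    frame_bytes = 512
    data_rate   = 500                      # bits/sec after the rate-1/2 FEC
    frame_time  = frame_bytes * 8 / data_rate
    print(frame_time)                      # 8.192 seconds per maximum-length frame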

8.192 seconds is longer than 7.232 seconds.

Ooops.

But wait, there's more. If the satellite sends a series of back-to-back 512
byte frames, and the transmitter comes on after one has already started,
you'll have to wait for that frame to end before you can begin decoding the
next one. Meanwhile, the clock is quickly ticking down until the transmitter
goes OFF again...
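
The worst case, in numbers (assuming maximum-length frames back to back,
which -- as noted below -- overstates things):

    usable     = 40.0 - 2 * 16.384   # 7.232 sec of solid signal, from above
    frame_time = 512 * 8 / 500       # 8.192 sec per maximum-length frame
    # Worst case you land just after a frame boundary: wait out the rest of
    # that frame, then receive a whole one.
    needed = 2 * frame_time
    print(f"need up to {needed:.3f} s, have {usable:.3f} s")   # 16.384 vs 7.232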

Double oops.

Now this probably overstates the problem a bit. Being the engineer that I
am, I made this a very conservative analysis, taking the most pessimistic
assumption at each step. After all, I was stunned when somebody streamed
BPSK-1000 over the net with a lossy MP3 encoder and it *decoded*; I never
thought that would work.

Error correction can fill in for a remarkable variety of ills. In reality,
the satellite won't send a continuous stream of 512 byte frames. In reality,
the key-down intervals may be more than 40 seconds. So I won't be too
terribly surprised if the thing actually works. But it won't perform
anything like it will when the satellite is eventually operated in its
intended 100% duty cycle mode.