I do apologize for contributing to this off-topic thread...

A technical issue for cable modems is the uplink.
For downlink, the cable modem system uses one or more "TV channels" to send its data on. There is encoding and all that, but the signal isn't unlike a TV channel.
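
To put a rough number on it, one such channel already carries a lot of data. A tiny Python sketch (the symbol rate and modulation order are my own illustrative assumptions, roughly EuroDOCSIS-like, not quoted from any spec):

# Rough downstream capacity of a single 8 MHz "TV channel" carrying data.
# Symbol rate and modulation order below are assumptions for illustration.
symbol_rate = 6.952e6          # symbols per second (assumed)
bits_per_symbol = 8            # 256-QAM carries 8 bits per symbol
raw_rate_mbit = symbol_rate * bits_per_symbol / 1e6
print(f"raw downstream rate: {raw_rate_mbit:.1f} Mbit/s")   # ~55.6 Mbit/s before FEC overhead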

The uplink is where the problem lies.

Cable modems typically use low frequencies (up to 50 MHz or so) to transmit their packets from the end-user modem to the headend. You'd need a channel bandwidth of 8 MHz or so. Where does a cable operator find a clean 8 MHz channel under 50 MHz?
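
Just to illustrate how little room there is, here is a small Python sketch that scans the return band for an 8 MHz slot clear of some of the stronger shortwave broadcast bands plus CB and 10 m (band edges are approximate, from memory, and the list is far from complete):

# Scan 5-50 MHz for an 8 MHz wide slot that avoids the listed bands.
# Even this short, approximate list rules out most of the band.
hf_broadcast = [(5.9, 6.2), (7.2, 7.45), (9.4, 9.9), (11.6, 12.1),
                (13.57, 13.87), (15.1, 15.8), (17.48, 17.9),
                (21.45, 21.85), (25.67, 26.1)]   # MHz, assumed values
cb_and_ham = [(26.96, 27.41), (28.0, 29.7)]      # 11 m CB and 10 m ham band

def collisions(lo, hi, bands):
    # return every band that overlaps the candidate slot [lo, hi]
    return [b for b in bands if not (hi <= b[0] or lo >= b[1])]

width = 8.0
start = 5.0
while start + width <= 50.0:
    hits = collisions(start, start + width, hf_broadcast + cb_and_ham)
    status = "clean" if not hits else f"overlaps {hits}"
    print(f"{start:5.1f}-{start + width:5.1f} MHz: {status}")
    start += 1.0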

Keep in mind that the ingress signal (from the end user to the headend) collects all the RF leakage from the environment. All the poor cabling together acts as one massive receiving antenna, and a single strong carrier in the channel may block it. I had a chance to see the upstream spectrum ("ingress signals") and it was not a pretty sight. I know of one local cable operator that had to change channels because in the evening, strong shortwave broadcast signals leaking in were blocking the ingress channel of the cable-TV modems.
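
This "noise funneling" is easy to underestimate: the return path adds up the ingress of every drop in the segment. A back-of-the-envelope Python sketch (the number of drops and the per-drop leakage level are made-up figures, just to show the scaling):

import math

def combined_dbm(levels_dbm):
    # power-sum a list of levels given in dBm
    return 10 * math.log10(sum(10 ** (p / 10) for p in levels_dbm))

# 500 subscriber drops, each leaking a modest -70 dBm of shortwave energy
# into the return path:
print(f"{combined_dbm([-70] * 500):.1f} dBm")   # about -43 dBm, i.e. ~27 dB worse than a single drop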

For this reason, the ingress system in the DOCSIS system I looked at was using direct-sequence spread spectrum (DSSS) to "smear" these carriers over the spectrum and reduce their impact.

Now, consider very strong carriers on the ingress. Carriers so strong that the processing gain of the DSSS system doesn't help enough. In that case, the ingress channel is still fully blocked despite the DSSS measures.
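
To make that concrete: the spreading buys you roughly 10*log10(chip rate / data rate) of processing gain, and a carrier stronger than that margin still wins. A small Python sketch (the chip rate, data rate and required SNR are assumptions for illustration, not DOCSIS figures):

import math

chip_rate = 5.12e6      # chips per second (assumed)
data_rate = 160e3       # payload symbol rate (assumed)
processing_gain_db = 10 * math.log10(chip_rate / data_rate)

required_snr_db = 10    # what the demodulator still needs after despreading (assumed)
max_carrier_over_signal_db = processing_gain_db - required_snr_db

print(f"processing gain:          {processing_gain_db:.1f} dB")   # ~15 dB
print(f"tolerable carrier excess: {max_carrier_over_signal_db:.1f} dB")
# A shortwave broadcaster leaking in 20 dB above the wanted signal still blocks the channel.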

What impacts things a lot, of course, is leaky cabling. And from what I hear, American cable-TV systems have lots of leakage issues (here in PA0, cable systems are underground and very well shielded; any leakage here is typically poor end-user cabling).

While the tech's response is very wrong, perhaps the above explains a bit of what is happening and the reason for his reaction. If it were me, I would see whether things can be solved along these lines, considering that, from what I read in QST, many US cable-TV systems have a massive, massive leakage problem.

In the Netherlands, more and more spectrum is being moved from broadcast/cable TV to mobile services. Cable-TV signals now sit on frequencies that are also used by mobile phones, with transmitters (the handsets) at short distance and hence strong signals. Suddenly the jammers are not us pesky ham-radio guys, but every person carrying a mobile phone. And the community now understands that the proper fix for this is not banishing the mobiles, but better cabling. For cable TV there are now "kabelkeur" (www.kabelkeur.nl) qualified cables that have double shielding and, more importantly, good connectors, connectors that frankly put the typical ham-radio RF connector to shame.
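
For a feel of that overlap, here is a small Python sketch using the European 8 MHz UHF channel raster and the LTE 800 handset uplink (832-862 MHz); the figures are from memory and only meant to show which top cable channels sit right under the phones:

def channel_edges(ch):
    # European UHF raster: channel 21 starts at 470 MHz, 8 MHz per channel
    lo = 470.0 + (ch - 21) * 8.0
    return lo, lo + 8.0

lte_uplink = (832.0, 862.0)   # LTE Band 20 handset transmit range (assumed, from memory)
for ch in range(60, 70):
    lo, hi = channel_edges(ch)
    if hi > lte_uplink[0] and lo < lte_uplink[1]:
        print(f"cable channel {ch} ({lo:.0f}-{hi:.0f} MHz) overlaps the handset uplink")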

And I wonder how this compares to the situation of the topic starter.
With emotions running high we won't get anywhere, but I hope the local ARRL community can step in, find out whether my hypothesis is the root issue, and work from there, considering that proper shielding is the CATV operator's issue and no one else's.

73,

Geert Jan PE1HZG