The top posting may be confusing, but "the example" here is the example of the 
more than 100 TCP destinations and dozens of DNS queries that are needed 
(unless cached) to display the front page of CNN today.
That's "one website" home page. If you look at the JavaScript resource-loading 
code, and now the "service worker" JavaScript code, the idea that it is like 
fetching a file using FTP is just wrong. Do NANOG members understand this? I 
doubt it.
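 
As a rough illustration (my sketch, not anything from the original thread), 
here is how one could count just the distinct hosts referenced in a single 
page's static HTML, using only Python's standard library. It badly undercounts 
the real fan-out, since JavaScript and service workers fetch many more 
resources at runtime, but even the static HTML makes the point:
 
    # Count the distinct hosts referenced directly in one page's HTML.
    # This undercounts the real fan-out: JavaScript and service workers
    # fetch many more resources at runtime than a static parse can see.
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import Request, urlopen

    class HostCollector(HTMLParser):
        def __init__(self, base):
            super().__init__()
            self.base = base
            self.hosts = set()

        def handle_starttag(self, tag, attrs):
            for name, value in attrs:
                if name in ("src", "href") and value:
                    host = urlparse(urljoin(self.base, value)).hostname
                    if host:
                        self.hosts.add(host)

    url = "https://www.cnn.com/"  # any large commercial front page will do
    req = Request(url, headers={"User-Agent": "Mozilla/5.0"})  # some sites insist
    html = urlopen(req).read().decode("utf-8", errors="replace")
    collector = HostCollector(url)
    collector.feed(html)
    print(len(collector.hosts), "distinct hosts in the static HTML alone")
 
Each distinct host typically costs a DNS lookup plus one or more TCP (or QUIC) 
connections, which is why "loading one page" looks nothing like one FTP file 
transfer.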
 
On Monday, September 20, 2021 5:30pm, "David P. Reed" <dpr...@deepplum.com> 
said:

I use the example all the time, but not for interviewing. What's sad is that 
the answers seem to be quoting from some set of textbooks or popular 
explanations of the Internet that really have got it all wrong, but which many 
professionals seem to believe are true.
 
The same phenomenon appears in the various subfields of the design of radio 
communications at the physical and front-end electronics level. The truly 
broken mental models repeated by "experts" are incredible, and they cover all 
of these fields. Two or three examples:
 
1. Why do the AM commercial broadcast band (540-1600 kHz) signals you receive 
in your home travel farther than VHF-band and UHF-band TV signals? How does 
this explanation relate to the fact that we can see stars a million 
light-years away using receivers that respond to 500-terahertz radio (visible 
light antennas)?
 
2. What is the "aperture" of an antenna system? Does it depend on the 
frequency of the radiation? How does this relate to the idea of the size of an 
RF photon, and the mass of an RF photon? How big must a cellphone be to 
contain the antenna needed to receive and transmit signals at the 3G phone 
frequencies? (A rough numeric sketch follows question 3 below.)
 
3. We can digitize the entire FM broadcast band into a sequence of 14-bit 
digital samples at the Nyquist sampling rate of about 40 megasamples per 
second, which covers the 20 MHz bandwidth of the FM band. Does this let a 
digital receiver tune in any FM station that can be received with an "analog 
FM radio" using the same antenna? Why or why not?
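 
For question 2, a back-of-the-envelope calculation (the numbers are mine, 
chosen for illustration): the effective aperture of an isotropic antenna is 
lambda^2 / (4*pi), set by the wavelength rather than by any physical dish 
size, which is why a pocket-sized phone works fine at cellular frequencies.
 
    # Effective aperture of an isotropic antenna: A_e = lambda^2 / (4*pi).
    # Frequency below is an assumed, representative 3G-era band.
    import math

    c = 299_792_458.0   # speed of light, m/s
    f = 1.9e9           # ~1900 MHz (assumed)

    wavelength = c / f                         # about 0.16 m
    aperture = wavelength**2 / (4 * math.pi)   # isotropic effective aperture

    print(f"wavelength: {wavelength * 100:.1f} cm")          # ~15.8 cm
    print(f"effective aperture: {aperture * 1e4:.1f} cm^2")  # ~19.8 cm^2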
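 
And for question 3, a sketch of the digital tuning the question implies: mix 
the wideband sample stream against a complex exponential at the station's 
offset, low-pass filter down to one channel, and recover the modulation from 
the phase derivative. Everything here (station offset, filter width, the 
synthetic test signal) is an assumption for illustration; a real receiver also 
has to worry about dynamic range, ADC linearity, and the analog front end.
 
    # Digital tuning within a fully sampled band (illustrative numbers).
    import numpy as np

    fs = 40e6              # sample rate covering the whole 20 MHz FM band
    n = 2**18
    t = np.arange(n) / fs

    # Synthesize one FM "station" 5 MHz into the digitized band:
    # a 1 kHz tone with 75 kHz deviation (assumed test values).
    f_station, f_tone, deviation = 5e6, 1e3, 75e3
    phase = (2 * np.pi * f_station * t
             + (deviation / f_tone) * np.sin(2 * np.pi * f_tone * t))
    band = np.cos(phase) + 0.1 * np.random.randn(n)  # plus wideband noise

    # "Tune": mix the chosen station down to 0 Hz, then low-pass filter
    # to one channel with a windowed-sinc FIR (~100 kHz cutoff).
    baseband = band * np.exp(-2j * np.pi * f_station * t)
    taps = 257
    k = np.arange(taps) - taps // 2
    cutoff = 100e3 / fs
    lp = 2 * cutoff * np.sinc(2 * cutoff * k) * np.hamming(taps)
    channel = np.convolve(baseband, lp, mode="same")

    # FM demodulation: instantaneous frequency is the phase derivative.
    audio = np.diff(np.unwrap(np.angle(channel))) * fs / (2 * np.pi)
    audio -= audio.mean()
    peak = np.argmax(np.abs(np.fft.rfft(audio))) * fs / len(audio)
    print(f"recovered tone near {peak:.0f} Hz")  # ~1000 Hz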
 
I'm sure Dick Roy understands all three of these questions, and what is going 
on. But I'm equally sure that the designers of WiFi radios, broadcast radios, 
and even the base stations of cellular data systems include few who understand 
them.
 
And literally no one at the FCC or CTIA understands how to answer these 
questions. But the problem is that they are *confident* that they know the 
answers, and that they are right.
 
The same is true about the packet layers and routing layers of the Internet. 
Very few engineers, much less lay people, realize that what they have been 
told by "experts" is like how Einstein explained how radio works to a teenaged 
kid:
 
  "Imagine a cat whose tail is in New York and his head is in Los Angeles. If 
you pinch his tail in NY, he howls in Los Angeles. Except there is no cat."
 
Though others have missed it, Einstein was not making a joke. The non-cat is 
the laws of quantum electrodynamics (or, classically, the laws of Maxwell's 
Equations). The "cat" would be all the stories people tell about how radio 
works: beams of energy (or puffs of energy), modulated by some analog 
waveform, bouncing off of hard materials, going through less dense materials, 
"hugging the ground", "far field" and "near field" effects, etc.
 
Einstein's point was that there is no cat - that is, all the metaphors and 
models aren't accurate or equivalent to how radio actually works. But the 
underlying physical phenomenon supporting radio is real, and scientists do 
understand it pretty deeply.
 
Same with how packet networks work. There are no "streams" that behave like 
water in pipes, the connection you have to a shared network has no "speed" in 
megabits per second built into it, a "website" isn't coming from one place in 
the world, and bits don't have inherent meaning.
 
There is NO CAT (not even a metaphorical one that behaves like the Internet 
actually works).
 
But in the case of the Internet, unlike radio communications, there is no deep 
mystery that requires new discoveries to understand it, because it's been built 
by humans. We don't need metaphors like "streams of water" or "sites in a 
place". We do it a disservice by making up these metaphors, which are only apt 
in a narrow context.
 
For example, congestion in a shared network is just unnecessary queuing delay 
caused by multiplexing the capacity of a particular link among different 
users. It can be cured by slowing down all the different packet sources in 
some more or less fair way. The simplest approach is just to discard from the 
queue any excess packets that make the queue longer than the link can drain. 
Then there can't be any congestion. However, telling the sources to slow down 
somehow would be an improvement, hopefully before any discards are needed.
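 
To make that concrete, a toy model (my construction, with made-up numbers): a 
link that drains a FIFO queue at a fixed rate, with tail drop past a fixed 
queue limit, keeps queuing delay bounded no matter how hard the sources push:
 
    # Toy model of a shared link: fixed drain rate, tail-drop FIFO queue.
    from collections import deque

    LINK_RATE = 10     # packets the link transmits per tick
    QUEUE_LIMIT = 50   # cap on queued packets: max delay = 50/10 = 5 ticks

    queue = deque()
    drops = 0

    def enqueue(arrivals):
        """Accept arrivals up to the cap; tail-drop the excess."""
        global drops
        for pkt in range(arrivals):
            if len(queue) < QUEUE_LIMIT:
                queue.append(pkt)
            else:
                drops += 1   # the implicit "slow down" signal to senders

    def drain_one_tick():
        for _ in range(min(LINK_RATE, len(queue))):
            queue.popleft()

    # Two senders together offer 3x the link's capacity for 20 ticks.
    for tick in range(20):
        enqueue(15 + 15)
        drain_one_tick()
        print(f"tick {tick:2d}: queue={len(queue):2d}, drops={drops}")
 
The queue saturates at the cap and delay stops growing; the drops (or, better, 
some earlier signal) are what tell the senders to slow down.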
 
There is no "back pressure", because there is no "pressure" at all in a packet 
network. There are just queues and links that empty queues of packets at a 
certain rate. Thinking about back pressure comes from thinking about sessions 
and pipes. But 90% of the Internet has no sessions and no pipes. Just as there 
is "no cat" in real radio systems.
 
On Monday, September 20, 2021 12:09am, "David Lang" <da...@lang.hm> said:

> On Mon, 20 Sep 2021, Valdis Klētnieks wrote:
> 
> > On Sun, 19 Sep 2021 18:21:56 -0700, Dave Taht said:
> >> what actually happens during a web page load,
> >
> > I'm pretty sure that nobody actually understands that anymore, in any
> > more than handwaving levels.
> 
> This is my favorite interview question; it's amazing and saddening what
> answers I get, even from supposedly senior security and networking people.
> 
> David Lang