Dave Sherman wrote:
> If I understand correctly, one of the things that makes X use more
> memory and run slower is that it is a network application, and as such,
> must run all of its requests through the local loopback on your
> workstation (X is both a client-server protocol and software suite).
>
> This being the case, couldn't someone write a "localhost-only" version
> of X that emulates the networking protocols without actually doing
> anything through the loopback device, and only runs on a local
> workstation? This would provide backwards compatibility with all X
> software, but would remove the network code and functionality that takes
> memory, processing power, and resources that could be better used
> elsewhere.
>
> Just a thought. Maybe someone can rip a few holes in my theory to prove
> me wrong.
Well, I am not (by any means) an expert. What I think is part of the problem is what I'll call "loose coupling" between the application and, for example, the video memory on the video card. This is alluded to on the Berlin site, in one of the FAQs.

For example, I can imagine a video subsystem where [controls | widgets] (like a combo box) are generated and stored "closer to" the video card. In X, they are generated in the application and sent to the video device "pixel by pixel". When that pixel-by-pixel transmission has to go over a network, it can be considerably slower than an alternative approach. As long as X is running locally, it is using very fast data transmission over the buses in the system (PCI, AGP, etc.). X could be faster even over a network if it transmitted a message saying "please display a combo box at coordinates blah, blah, with background color blah, and filled with these choices, blah, blah, blah, etc.", instead of generating the combo box in the application and then sending all the individual pixels. It is my understanding that Berlin is working on an approach more like this.

You're right if you notice that I've avoided the question of which is faster within a single computer -- I think one of the trends that will make video cards faster is putting more intelligence in the video card itself. Video cards can now generate triangles / polygons from commands rather than having them transmitted pixel by pixel. Sooner or later (maybe already), video cards will have the capability to generate [controls | widgets] from commands (all the electronic technology exists), and video display speed will increase when applications send commands instead of pixels to the video card. Berlin is not ready for prime time yet (AFAIK), but whenever I get a chance I like to mention Berlin so another way is recognized.
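To put rough numbers on the pixels-versus-commands idea above, here is a back-of-envelope sketch in Python. It compares the bytes needed to ship a small combo box as raw pixels against a short "draw me a combo box" command message. The widget dimensions and the command layout are my own made-up assumptions for illustration -- they are not the actual X or Berlin wire formats.

```python
# Hypothetical 200x30 combo box at 24-bit color (3 bytes per pixel).
# These dimensions are assumptions chosen only for illustration.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 200, 30, 3

def pixel_payload_size(width, height, bytes_per_pixel):
    """Bytes needed to send the widget as a raw pixel image."""
    return width * height * bytes_per_pixel

def command_payload_size(choices):
    """Bytes for a made-up 'draw combo box' command message:
    opcode (2) + x,y coordinates (4) + width,height (4) + background
    color (3), followed by one length-prefixed string per choice."""
    header = 2 + 4 + 4 + 3
    return header + sum(1 + len(c.encode("utf-8")) for c in choices)

choices = ["Red", "Green", "Blue"]
pixels = pixel_payload_size(WIDTH, HEIGHT, BYTES_PER_PIXEL)
command = command_payload_size(choices)

print(f"raw pixels: {pixels} bytes")   # 200 * 30 * 3 = 18000
print(f"command:    {command} bytes")  # 13 + (4 + 6 + 5) = 28
print(f"ratio:      {pixels // command}x")
```

In fairness, the real X protocol does already send drawing commands for primitives like lines, rectangles, and text; the point of the sketch is just the order-of-magnitude difference when a complex widget has to cross the wire as pixels.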
(Of course, X may turn out to evolve just like Fortran -- I learned Algol before Fortran, and then wondered why they taught me Fortran -- over the years, Fortran has evolved to include most of the (good) features of Algol and other modern languages, and more.)

Randy Kramer