I just recompiled perl for libc6 (I needed a libc6 version), but
now dpkg-shlibdeps gives me this error:
dpkg-shlibdeps ./fakeroot
Can't load '/usr/lib/perl5/i386-linux/5.004/auto/POSIX/POSIX.so' for module
POSIX: /usr/lib/perl5/i386-linux/5.004/auto/POSIX/POSIX.so: undefined symbol:
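For what it's worth, a quick way to see what the module can't resolve
(this assumes GNU binutils is installed; the path is the one from the
error above):

  $ nm -D /usr/lib/perl5/i386-linux/5.004/auto/POSIX/POSIX.so | grep ' U '
  $ ldd /usr/lib/perl5/i386-linux/5.004/auto/POSIX/POSIX.so

If POSIX.so was still built against libc5 while the new perl is libc6,
the stale symbols should show up here, and rebuilding the modules along
with perl would be the likely fix.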
Brian == Brian White [EMAIL PROTECTED] writes:
Brian Maintainers don't choose what section their package goes
Brian in -- they only recommend. It's decided by the ftp site
Brian maintainer (Guy).
Maybe they should co-ordinate with one another a little then.
Brian Besides
On Jun 29, Fernando wrote
I fear the Documentation Policy is being based on the wrong technical
assumptions.
The fact is that what slows down HTML in an old system is not a web server or
the cgi converters. It is the browser!
[-stuff snipped-]
Thanks,
Hi,
Christoph == Christoph Lameter [EMAIL PROTECTED] writes:
You can't fix the browsers, because we don't have the source for
important browsers like netscape.
Christoph You mean the Debian Project caving in and changing its
Christoph standards because some non-free product cannot be changed?
Christoph == Christoph Lameter [EMAIL PROTECTED] writes:
Christoph This won't work as we already have said again and
Christoph again. You are modifying the HTTP protocol with this
Christoph and creating a new .html.gz extension in essence. And
Christoph sometimes the web browser
On 29 Jun 1997, Karl M. Hegbloom wrote:
Christoph == Christoph Lameter [EMAIL PROTECTED] writes:
Christoph This won't work as we already have said again and
Christoph again. You are modifying the HTTP protocol with this
Christoph and creating a new .html.gz extension in essence. And
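For context, the server side of this is tiny: with Apache it is a
single mod_mime directive in srm.conf (spelling below is Apache 1.x
practice, not quoted from this thread):

  AddEncoding x-gzip .gz

With that in place the server sends a .html.gz file as text/html plus
a Content-Encoding: x-gzip header, which is plain HTTP/1.0 rather than
a protocol change; the open question is only whether every browser
honours the header.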
On Sat, 28 Jun 1997, Jim Pick wrote:
You are proposing that a web server is supposed to search
through the .html code it serves and replace all links referring to
.html.gz with .html links?
dwww does this - it's not trivial. This is definitely not the job of a
web server.
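To give a feel for it, the rewriting dwww does amounts to something
like this toy pipeline (a sketch only, with a made-up package path;
dwww's real code also handles relative links, anchors, and caching):

  $ zcat /usr/doc/foo/manual.html.gz | sed 's/\.html\.gz/.html/g'

A CGI on the server side then has to map each rewritten .html request
back onto the .html.gz file it came from, which is exactly the
non-trivial part.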
I agree
On Sun, 29 Jun 1997 11:57:49 -0400, Joey Hess wrote:
Karl M. Hegbloom:
I think it would be good to divide the /usr/doc directory into
subdirectories. It should be divided the same way as the Debian ftp site,
and packages should put their documentation into the same slot as the
one they
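The snippet cuts off there, but presumably the layout meant is
something like the following, reusing the ftp site's section names
(the package placements are only examples):

  /usr/doc/admin/cron/
  /usr/doc/devel/gcc/
  /usr/doc/mail/smail/
  /usr/doc/text/groff/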
On Sun, 29 Jun 1997, Christoph Lameter wrote:
This is a non-standard extension of the http protocol!
I support your idea of using a WWW server for documentation, but you're
saying the wrong things and making people angry with you. =)
The HTTP protocol DOESN'T rely on extensions. No HTTP
On Sun, 29 Jun 1997, Jim Pick wrote:
One complication I can think of - dselect and the ftp sites have the
concept of overrides, where Guy can change the section a package
is assigned to. This wouldn't be reflected in the /usr/doc
directory - of course, this might not really matter.
I think
Bill Mitchell:
someone else (I missed the attribution) said:
Me :-)
I completely agree. I have 434 items in /usr/doc, and that's too many.
Splitting it up by package section is a very good idea.
I'd agree that a directory with over 400 items in it is probably
excessively unwieldy.
On Sun, 29 Jun 1997, Christoph Lameter wrote:
This is a non-standard extension of the http protocol!
This is a pretty silly argument. The web server has complete control over
how a compressed document is presented. It can send the document as
Content-Type: text/html or as Content-Type:
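Spelling out the two options (header names per HTTP/1.0; the exact
values are my illustration, not quoted from Bruce's mail): either

  HTTP/1.0 200 OK
  Content-Type: text/html
  Content-Encoding: x-gzip

so a capable browser decompresses and renders transparently, or an
opaque type such as application/octet-stream, which leaves the user to
uncompress the file by hand.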
Hi,
Also, 11M may not be a typical install. I get a far higher number:
$ du -s /usr/doc
92026 /usr/doc
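(du reports 1K blocks by default, so 92026 here works out to roughly
90M of documentation.)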
Uncompressing this is very likely to annoy me.
11M was for my old 386 box (no X installed) - I'm only using about
200M total on that system. That works out to about 5%
[please cc any responses to me.]
Is anybody busy working on these? I ask because I'm fairly close to
(hopefully :) creating a working set of rules/control files/etc. for
compiling SRC Modula-3, and associated programs. If all goes well, I
should have them finished within a week or two.
I
On Sun, 29 Jun 1997, Bruce Perens wrote:
On Sun, 29 Jun 1997, Christoph Lameter wrote:
This is a non-standard extension of the http protocol!
This is a pretty silly argument. The web server has complete control over
how a compressed document is presented. It can send the document as
In your email to me, Christoph Lameter, you wrote:
Since we were talking about including a web server in the base system,
here are some thoughts.
I often maintain headless servers. I always have to attach a screen for
the initial install or if something is seriously wrong with the machine.
Sounds slick. It wouldn't be too hard to do. It would be slick to
have some more network smarts (like DHCP, and dialup to an ISP) on
the boot disks (or some variant thereof).
As for configuration via the web - check out the GPL'd Java telnet applet
I've got installed on my webserver
From: Christoph Lameter [EMAIL PROTECTED]
This is a non-customary extension to the functionality available in common
web-browsers on non-Linux platforms.
As far as I can tell, you could have written it as "we really need a web
server here, unless all web browsers are guaranteed to be able to do
No, we don't have perl on the rescue disk. However, really tiny servers
that handle CGI are probably possible.
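To back that up with a sketch: a throwaway static-page responder can be
a few lines of shell hung off inetd (everything below is illustrative;
mini-httpd and the page path are made-up names, and real CGI support
needs far more care):

  # /etc/inetd.conf
  www stream tcp nowait nobody /usr/sbin/mini-httpd mini-httpd

  #!/bin/sh
  # mini-httpd: answer every request with one fixed page.
  # Browsers of the day tolerate bare LF line endings in the headers.
  read request_line
  echo "HTTP/1.0 200 OK"
  echo "Content-Type: text/html"
  echo ""
  cat /usr/doc/index.html

Handling CGI on top of that means parsing the request path and setting
up the CGI environment variables, which is where perl would normally
come in.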
Bruce
--
Bruce Perens K6BP [EMAIL PROTECTED] 510-215-3502
Finger [EMAIL PROTECTED] for PGP public key.
PGP fingerprint = 88 6A 15 D0 65 D4 A3 A6 1F 89 6A 76 95 24 87 B3