> sarge hd, chkrootkit gives about twenty lines of
>
> /proc/1544/fd/1 : value too large for defined data type

The chkrootkit-users mailing list archives are at
<http://marc.theaimsgroup.com/?l=chkrootkit-users>, and I suggest you
update to the latest (0.44) and try it first. Grab the tarball from
chkrootkit.org, ungzip it, type "make sense".
Hello all,
I'm running woody on one hd, and have recently installed sarge on
another by upgrading everything over the net after installing from my
original woody cds. Tonight on the sarge hd, chkrootkit gives about
twenty lines of
/proc/1544/fd/1 : value too large for defined data type
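The symptom can be reproduced without chkrootkit: hold a file larger than 2 GiB open and stat it through /proc. A minimal sketch, assuming a Linux /proc and GNU coreutils; a binary built without large-file support gets EOVERFLOW ("value too large for defined data type") on exactly this kind of path:

```shell
# Create a sparse file just past the 2^31-byte boundary (almost no
# disk is used), hold it open, and stat the /proc fd entry for it.
dd if=/dev/zero of=big bs=1 count=1 seek=2147483648 2>/dev/null
exec 3<big                       # keep the file open on fd 3
stat -Lc '%n %s' /proc/$$/fd/3   # an LFS-aware stat reports the size
exec 3<&-
rm big
```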
Thanks for the reply. Some gentoo users are also reporting having this
fixed. I understand the mm-sources are based on vanilla + Andrew Morton
patches, and I saw a LKML post (referred to in my bug report) stating a
relation to a statfs64 patch. I haven't looked yet to see if my kernel
contains that patch.
Steven Romanow wrote:
> Hi Cory,
> I'm having the same problem with my gentoo install. Worked fine with
> the 2.4 kernel. Lemme know if you get a resolution.
>
> http://bugs.gentoo.org/show_bug.cgi?id=39516
>
> Thanks,
> Steve

This has been fixed for me for quite a while; I really can't remember
exactly when.
--
To UNSUBSCRIBE, email to [EMAIL PROTECTED]
with a subject of "unsubscribe". Trouble? Contact [EMAIL PROTECTED]
Hello,
I'm running Linux 2.6.0-test9-mm1, and having some trouble with df and
nfs mounts. It prints:
df: `/mnt/nfs': Value too large for defined data type
I'm running Sarge currently, and I tried updating to the unstable
version of coreutils, but that didn't solve the problem.
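One way to narrow down whether df itself or the mount is at fault is to compare another statfs-based tool on the same mount point. A minimal sketch, using / as a stand-in for the /mnt/nfs mount above:

```shell
# stat -f and df both end up in statfs()/statfs64(); if stat -f
# succeeds where df errors out, the problem is in the df binary,
# not the kernel or the mount.  Substitute your NFS mount for /.
stat -f /
df -P /
```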
> [EMAIL PROTECTED]:/mnt$ dd if=/dev/zero of=testfile bs=1024k count=2060
> 2060+0 records in
> 2060+0 records out
> [EMAIL PROTECTED]:/mnt$ ls -la
> ls: testfile: Value too large for defined data type
> total 8
> [...]
>
> 2.4.5 apparently can't handle > 2Gb files, and as someone said, this
> is most
also sprach Christoph Simon (on Tue, 03 Jul 2001 05:53:45PM -0300):
> 2.4.5 can handle it, but maybe you didn't update the tools:
>
> $ uname -a
> Linux 2.4.5 #1 Sun May 27 11:18:54 BRT 2001 i686 unknown
> $ dd if=/dev/zero of=testfile bs=1024k count=2060
> 2060+0 records in
"Martin F. Krafft" <[EMAIL PROTECTED]> writes:
> um, can't 2.4.x handle > 2 Gb?
Yes, but you also need userspace to handle > 2GB. If you're running
woody, most stuff probably supports it. If you're running potato, it
probably doesn't.
--
Alan Shutko <[EMAIL PROTECTED]> - In a variety of flavors
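The kernel/userspace split Alan describes is visible from glibc itself: getconf reports the flags a program must be compiled with to get a 64-bit off_t. A minimal sketch (output varies by platform, and is typically empty where 64-bit offsets are already the default):

```shell
# Flags needed at compile/link time for large-file support; tools
# built without them use a 32-bit off_t and fail with EOVERFLOW
# ("Value too large for defined data type") on files >= 2 GiB.
getconf LFS_CFLAGS     # e.g. -D_FILE_OFFSET_BITS=64 on 32-bit glibc
getconf LFS_LDFLAGS
getconf LFS_LIBS
```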
nope:
[EMAIL PROTECTED]:/mnt$ uname -a
Linux piper 2.4.5 #1 Mon Jul 2 18:46:48 CEST 2001 i686 unknown
[EMAIL PROTECTED]:/mnt$ dd if=/dev/zero of=testfile bs=1024k count=2060
2060+0 records in
2060+0 records out
[EMAIL PROTECTED]:/mnt$ ls -la
ls: testfile: Value too large for defined data type
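The same test can be run without writing two gigabytes of zeros: seeking past the 2^31-byte mark and writing a single byte produces a sparse file of the right size. A sketch, assuming GNU dd and a filesystem that supports sparse files:

```shell
# testfile is 2^31 + 1 bytes long but occupies almost no disk space.
dd if=/dev/zero of=testfile bs=1 count=1 seek=2147483648 2>/dev/null
ls -l testfile    # a tool built without LFS fails here with EOVERFLOW
rm testfile
```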
also sprach Robert Waldner (on Tue, 03 Jul 2001 09:49:44PM +0200):
> I can't find anything about that in $LINUX243SRC/Documentation, only
> stuff about RAM and harddisks. But maybe I'm just not looking hard
> enough...although I'd think that that would depend not only on the
> kernel but on the filesystem, too.
> Even "ls -l" bombs with the message "Value too large for
> defined data type".
>
> What can I do about this?
Well, you can remove it. And then, before restoring it from tape
again, try to get large file support in the kernel. Maybe there are
other ways, but installing a kernel with large file support is
probably the simplest.
On Tue, 03 Jul 2001 21:37:08 +0200, "Martin F. Krafft" writes:
>also sprach Robert Waldner (on Tue, 03 Jul 2001 09:23:01PM +0200):
>> Admittedly, there's not much else you can do about it if the file is
>> > 2 GB.
>
>um, can't 2.4.x handle > 2 Gb?
I can't find anything about that in $LINUX243SRC/Documentation.
also sprach Robert Waldner (on Tue, 03 Jul 2001 09:23:01PM +0200):
> Admittedly, there's not much else you can do about it if the file is
> > 2 GB.
um, can't 2.4.x handle > 2 Gb?

martin; (greetings from the heart of the sun.)
also sprach Robert Waldner (on Tue, 03 Jul 2001 09:23:01PM +0200):
> >cat /dev/null > filename
>
> aieeeh. Are you sure he simply wants the file out-of-the-way?
wow. good point. when i had the problem, i just wanted to delete the
file. oops. should have thought better. let's hope he has NO_GLOB set.
>> I can't do anything more
>> than "ls". Even "ls -l" bombs with the message "Value too large for
>> defined data type".
>>
>> What can I do about this?
>
>cat /dev/null > filename
aieeeh. Are you sure he simply wants the file out-of-the-way?
also sprach Eric N. Valor (on Tue, 03 Jul 2001 12:07:30PM -0700):
> I keep getting this message while trying to access a (presumably) very
> large file. It's a tarball restored from tape. I can't do anything more
> than "ls". Even "ls -l" bombs with the message "Value too large for
> defined data type".
>
> What can I do about this?

cat /dev/null > filename
I keep getting this message while trying to access a (presumably) very
large file. It's a tarball restored from tape. I can't do anything more
than "ls". Even "ls -l" bombs with the message "Value too large for
defined data type".
What can I do about this?
While attempting to untar a very large file (~3gb I believe) I get an error:
"Value too large for defined data type"
I have never seen this before, nor do I understand what it means. I can ls
the file, but can't "ls -l". Neither will du or anything else cope (all
give the same error).
On Wed, Feb 14, 2001 at 07:20:17PM -0600, William Jensen wrote:
:rm: cannot remove `/tmp/save.tar': Value too large for defined data type
:
:I was storing the save.tar in /tmp. I'm running stable version with kernel
:2.4.0. I have tried as root also to remove this file. How can I get rid
:of this bugger?
:
:Bill
Seth Arnold <[EMAIL PROTECTED]> writes:
> Does anyone have suggestions on how to remove this thing? :)
echo > mydocs.zip; rm mydocs.zip ?
moritz
--
/* Moritz Schulte <[EMAIL PROTECTED]>
* http://hp9001.fh-bielefeld.de/~moritz/
* PGP-Key available, encrypted Mail is welcome.
*/
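Moritz's one-liner works because the shell's output redirection opens the file with O_TRUNC, which never has to report the old size; once the file is zero bytes, even a non-LFS rm can stat and unlink it. A sketch with the same filename:

```shell
# Truncate the oversized file to zero length, then remove it.
: > mydocs.zip    # ':' writes nothing; the open with O_TRUNC does the work
rm mydocs.zip
```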
ls: mydocs.zip: Value too large for defined data type
total 0
$ rm mydocs.zip
rm: cannot remove `mydocs.zip': Value too large for defined data type
and more.
Does anyone have suggestions on how to remove this thing? :)
Thanks! :)
BTW -- CC's appreciated. Debian-user is a bit too high-traffic for me to
subscribe to it again... :)