On Fri, 30 Jul 2004, Silvan wrote:
cooked up. What I want to do is look at my disks and gather statistics about
what is eating the most space. Where the biggest files are, which
directories are the largest, etc.
What I do to see large directories is just 'du -sh *' starting at root and
hi ya paul
On Sat, 31 Jul 2004, Paul E Condon wrote:
#!/usr/bin/perl
... randal's giant script snipped ..
}
..
My one liner is:
du / | sort -k 1,1nr | most
it would be good if we could get paid $100 for each character saved,
like perl golf .. or script golf ..
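For reference, the flags in that one-liner do the heavy lifting: `-k 1,1` restricts the sort key to the first field (the size du prints, in 1K blocks on GNU du), `n` sorts it numerically, and `r` reverses so the biggest entries come first. A minimal sketch of the same idea (`/var` is just an example path):

```shell
# du's output is "<size-in-KB>\t<path>"; -k 1,1 keys the sort on the
# first field only, n compares numerically, r reverses (biggest first).
# Errors from unreadable directories are discarded.
du /var 2>/dev/null | sort -k 1,1nr | head -n 10
```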
On Saturday 31 July 2004 01:48 pm, Paul E Condon wrote:
(most is a better less. you can use less if you don't want to install
most.)
Cool! Me likey. Wish I'd found most sooner. :)
--
Michael McIntyre Silvan [EMAIL PROTECTED]
Linux fanatic, and certified Geek; registered Linux
On Saturday 31 July 2004 11:47 am, Matt Perry wrote:
What I do to see large directories is just 'du -sh *' starting at root and
then drilling down from there. If you want to see what the largest files
are, you can use this Perl script that Randal Schwartz wrote:
That looks like a keeper.
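Randal's script was snipped above; for the archives, a plain-shell way to get the same sort of "biggest files" list. This is a sketch assuming GNU find's `-printf`, and `/home` is just an example starting point:

```shell
# List the 20 largest files under a tree, size in bytes first.
# GNU find's -printf emits "<bytes>\t<path>"; sort -rn puts the
# biggest first; errors from unreadable directories are discarded.
find /home -type f -printf '%s\t%p\n' 2>/dev/null | sort -rn | head -n 20
```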
Before I reinvent the wheel, I thought I'd see if there's already something
cooked up. What I want to do is look at my disks and gather statistics about
what is eating the most space. Where the biggest files are, which
directories are the largest, etc. I'm running out of room, and I'm sure I
Silvan wrote:
I'm running out of room, and I'm sure I must have gigabytes of stupid
junk laying around, but I'm not sure where I left all of it.
- Every month or so I inspect my non-stable systems using the cruft
package.
- For packages not already managed by aptitude, repetitive applications
Alvin Oga wrote:
On Fri, 30 Jul 2004, John Summerfield wrote:
When I get desperate, as root I do things like this:
du --max-depth 2 -m | sort -n | sort
not fair ... that's the same answer i was gonna post, but you
beat me cause i was home eating and watching Jay leno
Whoever Jay is.
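On GNU du, `--max-depth 2` caps the summary at two directory levels and `-m` reports whole megabytes, so a single numeric sort does the job (the trailing `| sort` in the quoted line re-sorts lexically and is probably a slip). A minimal sketch, with `/usr` as an example:

```shell
# Two levels deep, sizes in whole megabytes, biggest directories last
# so they land at the bottom of your terminal.
du --max-depth=2 -m /usr 2>/dev/null | sort -n | tail -n 15
```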
On Fri, 30 Jul 2004, John Summerfield wrote:
When I get desperate, as root I do things like this:
du --max-depth 2 -m | sort -n | sort
though i'd just have used sort +1
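`sort +1` is the obsolete pre-POSIX field syntax: skip one field, i.e. start the key at the second field. The portable modern spelling is `-k 2`:

```shell
# Equivalent of the old "sort +1": order du output by the second
# field (the pathname) instead of the size, grouping related paths.
du --max-depth=1 /etc 2>/dev/null | sort -k 2
```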
hi ya john
- fun stuff ...
i had fat fingers hit return before i wanted to
On Friday 30 July 2004 02:52 am, John Summerfield wrote:
These days, disk is cheap in comparison with time, and you need to keep
this in mind. It might be better to replace the drive, put the old one
in a USB enclosure so you can get your important bits back. However,
some will choose to