If I am looking for quick wins, I find "du --max-depth=1 | sort -rn"
always a nice way to find the directories that have the biggest impact
on disk space. (Often a lot of small files in one part of the file
hierarchy are what fills a disk.)

Regards, Martin

Martin Visser ,CISSP
Network and Security Consultant 
Consulting & Integration
Technology Solutions Group - HP Services

3 Richardson Place 
North Ryde, Sydney NSW 2113, Australia 
Phone: +61-2-9022-1670    
Mobile: +61-411-254-513
Fax: +61-2-9022-1800     
E-mail: martin.visserAThp.com

This email (including any attachments) is intended only for the use of
the individual or entity named above and may contain information that is
confidential, proprietary or privileged. If you are not the intended
recipient, please notify HP immediately by return email and then delete
the email, destroy any printed copy and do not disclose or use the
information in it.


-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Julio Cesar Ody
Sent: Friday, 15 April 2005 12:29 PM
To: James Ballantine
Cc: slug@slug.org.au
Subject: Re: [SLUG] finding a file

A little bit lazy to figure out how to get full paths, but
clean and simple:

$ ls -RShl 

just the size and filename:

$ ls -Shl | awk '{print $5 " " $9}'

no directories, just the size and filename:

$ ls -RShl | grep -v '^d' | awk '{print $5 " " $9}'

there are probably easier ways to do it, but that's my 2 cents.
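For full paths, one option is to skip ls entirely (a sketch assuming GNU find with -printf, which not every find implementation supports):

```shell
# Print "size-in-bytes  path" for every regular file under the current
# directory, biggest first. Full paths come for free, and directories
# are excluded by -type f.
find . -type f -printf '%s %p\n' | sort -rn | head -n 15
```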


On 4/15/05, James Ballantine <[EMAIL PROTECTED]> wrote:
> Not quite what you wanted, but to get the largest files or directories
> in the current directory in order, you can use:
> 
> du -cks * |sort -nr |head -n15
> 
> This came from one of the O'Reilly UNIX books if I recall correctly.
> They suggested you alias it to 'ducks' for ease of typing.
> 
> /james
> 
> 
> Ben Donohue wrote:
> > Voytek wrote:
> >
> >> I'm trying to find a specific file within a web tree, what's the way
> >> to do it:
> >>
> >> I tried this with no luck
> >>
> >> # locate /home/domain.org.au localconf.php only to get
> >> find: localconf.php: No such file or directory
> >>
> >>
> >>
> > Further to this (and this is not an answer to the question above) but
> > I'm buggered if I can find the largest files on the hard disk and
> > list them in order.
> > I've tried various arguments but can't seem to crack it,
> > like find / -S -r (or -s) -name xxx|more
> >
> > Any ideas out there?
> > Ben
> >
> --
> SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
> Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html
> 


-- 
Julio C. Ody
http://www.livejournal.com/users/julioody/