Sharp, Craig wrote:
I have a script that uses File::Find to create a list of files within
a subdirectory. Depending on the size of the directory, the find can be
very resource intensive. I need a better way to generate a list of
files within a subdirectory so I can compare each file's timestamp
against a deletion cutoff.
TIA
Best regards,
Craig Sharp
Quicken Loans | Unix Engineer | Direct: 734.805.7839 | Fax:
734.805.5513 | [EMAIL PROTECTED]
------------------------------------------------------------------------
_______________________________________________
Perl-Unix-Users mailing list
Perl-Unix-Users@listserv.ActiveState.com
To unsubscribe: http://listserv.ActiveState.com/mailman/mysubs
That kind of depends on what you want to do. If all you want is a list
of the files in a specific directory, you can use a glob, for example:

    while (my $file = </directory/*.gif>) { ... }

which iterates over all the .gif files in the specified directory.
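Since the original question was about deleting by age, here is a minimal
sketch combining a glob with the -M file test. The "/directory" path and
the 7-day cutoff are placeholders, not values from the original post:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $cutoff_days = 7;    # placeholder cutoff

# Glob in a while condition assigns each match to $file in turn.
while (my $file = </directory/*.gif>) {
    next unless -f $file;                 # plain files only
    if (-M $file > $cutoff_days) {        # -M: age in days since last modification
        print "would delete: $file\n";    # replace with unlink($file) when ready
    }
}
```

Printing first and switching to unlink() only after checking the output
is a safer way to test a deletion script.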
If you want something a little more general, you can read a list of all
the files (including other directory names) using opendir and readdir.
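A sketch of the opendir/readdir approach, comparing each file's mtime
from stat against a cutoff; again, "/directory" and the 7-day cutoff are
assumptions standing in for the poster's actual values:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $dir         = '/directory';       # placeholder directory
my $cutoff_secs = 7 * 24 * 60 * 60;   # placeholder: 7 days in seconds
my $now         = time;

opendir(my $dh, $dir) or die "Cannot open $dir: $!";
while (defined(my $entry = readdir($dh))) {
    next if $entry eq '.' or $entry eq '..';
    my $path = "$dir/$entry";
    next unless -f $path;                  # skip subdirectories and the like
    my $mtime = (stat($path))[9];          # element 9 of stat's list is mtime
    if ($now - $mtime > $cutoff_secs) {
        print "would delete: $path\n";     # swap in unlink($path) once tested
    }
}
closedir($dh);
```

This touches each directory entry exactly once and avoids File::Find's
per-file callback overhead, which may matter for very large directories.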
You could even use opendir and readdir along with your own code to read
an entire tree of files, but if you're going to all that work, you might
as well use File::Find.
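For comparison, the File::Find version of the same idea is short; the
wanted callback runs once per file in the tree. Path and cutoff are
placeholders here as well:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

my $cutoff_days = 7;    # placeholder cutoff
my @stale;

find(
    sub {
        return unless -f $_;                 # $_ is the current basename
        push @stale, $File::Find::name       # full path to the file
            if -M $_ > $cutoff_days;
    },
    '/directory'                             # placeholder starting directory
);

print "$_\n" for @stale;
```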
Carl.