perlwannabe wrote:
I have a file I download every day; let's call it "output.txt."  The file
"output.txt" is saved in a directory named by date, for example 10012005 (for Oct.
1, 2005).  I have a year's worth of output files on my C:\ drive.  Now I have
to rename each output file so that I can copy all of the output to a single
directory.  It's a nightmare to do manually.  I really don't care what the
files are named as long as each file gets a unique name.  Imagine this:

C:\01012005
C:\01022005
C:\01032005
C:\01042005
...

Now, each of those directories has a file in it called "output.txt."  I want
to take every one of those "output.txt" files and copy them to a single directory
(call it C:\renoutput), where each output file will have a unique name.  So when I
do a "dir" of C:\RENOUTPUT it looks like:

output1.txt
output2.txt
output3.txt
output4.txt
...

So I suppose I want to do a locate, rename, move:  1) locate all
"output.txt" files on the hard drive; 2) rename each "output.txt" to
something unique; and 3) move each renamed file from its original location
to a single directory.

I have tried a few ways with no success.  Thanks for the help.


It's hard to imagine that you spent multiple days looking at such an easy problem and couldn't come up with a single line of code to show the list. To get you started, here's a solution to your problem in bash (you do have bash installed on your Windows box?).

Create a test environment:
[EMAIL PROTECTED]:~/vuilbak$ for i in 1 2 3 4 5 6 7; do mkdir $i; touch $i/output.txt; done

Create a directory for the output:
[EMAIL PROTECTED]:~/vuilbak$ mkdir result

Use the following bash one-liner (note the quotes around "$j" and the target, so a path with spaces doesn't break the cp):
[EMAIL PROTECTED]:~/vuilbak$ (( i=0 )); for j in $(find . -name "output.txt"); do cp "$j" "result/output$i.txt"; (( i++ )); done

Verify result:
[EMAIL PROTECTED]:~/vuilbak$ ls result/
output0.txt  output2.txt  output4.txt  output6.txt
output1.txt  output3.txt  output5.txt
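
Since this is the Perl beginners list, here is the same locate-and-copy as a minimal Perl sketch, using the core File::Find and File::Copy modules. The root 'C:/' and the target 'C:/renoutput' are taken from your post; scanning all of C:/ is slow, so narrow the root if you know where the date directories live.

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use File::Copy;

my $dest = 'C:/renoutput';      # target directory from the original post
mkdir $dest unless -d $dest;    # create it if it is not there yet

my $i = 1;                      # counter that makes each name unique

# find() walks the tree and calls the sub once per entry; inside the
# sub, $_ holds the bare filename and $File::Find::name the full path.
find(sub {
    return unless $_ eq 'output.txt';
    copy($File::Find::name, "$dest/output$i.txt")
        or warn "could not copy $File::Find::name: $!";
    $i++;
}, 'C:/');

Like the bash version this copies rather than moves; once you trust the result, swap copy() for File::Copy's move() to empty the date directories as you go.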

gr.
E.
