I wanted to do this without completely loading the file into an array first
and then checking it for duplicates, so I just added a couple of flags inside
my while loop.
#!/usr/bin/perl -w
use strict;

chomp (my $dupli_file = <STDIN>);
chomp (my $res_file = <STDIN>);

open (DUPLI, "./$dupli_file") or die "Failed to open $dupli_file:\n$!";

my @fil1;
my $flag;
while (<DUPLI>) {
    # seed @fil1 with the first line so the foreach below has something to compare against
    if (!$flag) { push (@fil1, $_); print "first push : $_"; $flag++; next; }

    # allow the push unless $pushit gets reset to 0 by a match below
    my $pushit = 1;
    foreach my $line (@fil1) {
        if ($_ eq $line) { $pushit = 0; last; }
    }
    if ($pushit) { push (@fil1, $_); print "pushed $_"; }
}
close (DUPLI);

print "RESULT\n";
open (RESULT, ">./$res_file") or die "Failed to open $res_file for writing:\n$!";
print RESULT @fil1;
close (RESULT);
__END__
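For what it's worth, the same membership test can be written without the
$flag/$pushit bookkeeping. A minimal sketch of just the loop body, using grep
(same @fil1 array and DUPLI handle as above):

    # copy $_ first, because inside the grep block $_ is aliased to the elements of @fil1
    my $line = $_;
    push (@fil1, $line) unless grep { $_ eq $line } @fil1;

Either way this is a linear scan of @fil1 for every input line, so the hash
lookup in the solution quoted below is the better choice for big files.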
--- Sudarshan Raghavan <[EMAIL PROTECTED]> wrote:
> On Fri, 27 Sep 2002, waytech wrote:
>
> > hi,
> >
> > I want to remove duplicate lines from one file (the original file) and
> > save the result to another file.
> >
> > the original file looks like this:
> >
> >
> > --------------------------------------------------------------------------------------------
> > [EMAIL PROTECTED]
> > [EMAIL PROTECTED]
> > [EMAIL PROTECTED]
> > [EMAIL PROTECTED]
> > [EMAIL PROTECTED]
> > [EMAIL PROTECTED]
> >
> > -------------------------------------------------------------------------------------------
> >
> > with each email address on a line. I want to remove the duplicate
> > lines (email addresses) and save the result to a new file.
> >
> > I wrote a program, but it outputs nothing. Can someone help?
> >
> > Thanks a lot.
> >
> >
> > kevin
> >
> > -------------------------------------------------------------------------------------------
> > #!/usr/bin/perl -w
> > #######################
> > #Remove the duplicate line #
> > #from a orginal file and save #
> > #the result to a file. #
> > #######################
> > print "Enter the file that has duplicate
> lines.\n";
> > $Dupli_file=<stdin>;
> > chomp;
>
> chomp by default works on $_; to chomp $Dupli_file you will have to say
> chomp ($Dupli_file);
>
> > print "The file you select is '$Dupli_file'\n";
> > open (Dupli_file,"$Dupli_file");
>
> When opening a file, always check for failure like this:
> open (Dupli_file, $Dupli_file) or die "Cannot open $Dupli_file: $!\n";
> $! will contain the error string. For more info on $!, see perldoc perlvar.
>
> > print "Please select the file you want to save the
> result to:\n";
> > $Resu_file=<stdin>;
> > chomp;
>
> chomp ($Resu_file);
>
> > print "The result file is $Resu_file\n";
> > open (Resu_file,">$Resu_file");
>
> Check for failure
>
> > while (<Dupli_file>)
> > {
> > $orin_line=$_;
> > while (<Resu_file>){
>
> You are trying to read from a filehandle that has been opened for
> writing. This will throw a warning message.
>
> This logic will not work even if you open the result file for both
> reading and writing. If the result file is empty to start with, the
> execution path will never go into the inner while loop: while
> (<Resu_file>) will fail the first time, which means nothing gets
> written into Resu_file. That makes it fail again the 2nd time, the
> 3rd time ...
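A quick way to see the warning Sudarshan mentions (the file name here is just
an example):

    open (OUT, ">demo.txt") or die "Cannot open demo.txt: $!\n";
    my $line = <OUT>;   # under -w this warns along the lines of:
                        # Filehandle OUT opened only for output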
>
> > if("$_" eq "$orin_line")
> > {
> > next;
> > }
> > print Resu_file "$orin_line";
> > };
> > }
>
> A hash is more suited for your job
> #!/usr/bin/perl -w
> use strict;
>
> chomp (my $dupli_file = <STDIN>);
> chomp (my $res_file = <STDIN>);
>
> open (DUPLI, $dupli_file) or die "Failed to open
> $dupli_file: $!\n";
> open (RESULT, ">$res_file") or die "Failed to open
> $res_file for writing: $!\n";
>
> my %res_hash;
> while (<DUPLI>) {
> chomp;
> unless ($res_hash{$_}++) {
> print RESULT "$_\n";
> }
> }
> close (DUPLI);
> close (RESULT);
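The trick in the quoted solution is that $res_hash{$_}++ returns the old count
before incrementing, so it is false (0) only the first time a line is seen;
every repeat returns a true count and gets skipped, which also preserves the
order in which the unique lines first appear.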