Re: Filtering a file

2005-12-06 Thread Sara

Here is a small piece of code, in case it is helpful.

## START #

#!f:/perl/bin/perl.exe
# set the above to your perl path.

use strict;
use warnings;
use CGI qw(:standard);
use CGI::Carp 'fatalsToBrowser';

my $q = new CGI;

open (DUPF, "dupes.txt") or die "Cannot open dupes.txt: $!"; # name of the file containing dupes to be removed.
open (OUTF, ">dupes_removed.txt") or die "Cannot open dupes_removed.txt: $!"; # new file with dupes removed.
my %saw;
print OUTF grep(!$saw{$_}++, <DUPF>); # keep only the first occurrence of each line.
close OUTF;
close DUPF;

print $q->header();
print "Duplicate Enteries Removed";

exit;

# End of Code 
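The filter relies on the %saw hash trick: grep keeps only those lines whose count was still zero before the ++ increment, so the first occurrence of each line is written out and later repeats are dropped. Outside CGI the same idea can be tried quickly from the command line; a rough sketch, assuming the input file is named dupes.txt as above:

perl -ne 'print unless $seen{$_}++' dupes.txt > dupes_removed.txt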

Thanks,

Sara.



- Original Message - 
From: "Adedayo Adeyeye" <[EMAIL PROTECTED]>
To: "'David Dorward'" <[EMAIL PROTECTED]>; "'Perl Beginners - CGI List'" 


Sent: Tuesday, December 06, 2005 12:14 PM
Subject: RE: Filtering a file



Hello David,

I'm able to open the file, read the contents and output the results of my
initial filtering to a new file.

The problem is that my new file has duplicate entries, and cleaning up
duplicates is where I'm stuck.

Kind regards

Dayo

-Original Message-
From: David Dorward,,, [mailto:[EMAIL PROTECTED] On Behalf Of David 
Dorward

Sent: Monday, December 05, 2005 2:35 PM
To: 'Perl Beginners - CGI List'
Subject: Re: Filtering a file

On Mon, Dec 05, 2005 at 02:20:33PM +0100, Adedayo Adeyeye wrote:

   How do I write a script to parse through this file and just return the
   unique names? I.e., I want the repetitions ignored.


What have you tried?  Where are you stuck? (Opening the file? Reading
the contents? The actual filtering?). Nothing in your question is CGI
related, have you got this working as a command line script but are
having trouble converting it to work under CGI? What code have you got
so far?


--
David Dorward  http://dorward.me.uk


--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
<http://learn.perl.org/> <http://learn.perl.org/first-response>

RE: Filtering a file

2005-12-05 Thread Adedayo Adeyeye
Hello David,

I'm able to open the file, read the contents and output the results of my
initial filtering to a new file.

The problem is that my new file has duplicate entries, and cleaning up
duplicates is where I'm stuck.

Kind regards

Dayo

-Original Message-
From: David Dorward,,, [mailto:[EMAIL PROTECTED] On Behalf Of David Dorward
Sent: Monday, December 05, 2005 2:35 PM
To: 'Perl Beginners - CGI List'
Subject: Re: Filtering a file

On Mon, Dec 05, 2005 at 02:20:33PM +0100, Adedayo Adeyeye wrote:
>How do I write a script to parse through this file and just return the
>unique names? I.e., I want the repetitions ignored.

What have you tried?  Where are you stuck? (Opening the file? Reading
the contents? The actual filtering?). Nothing in your question is CGI
related, have you got this working as a command line script but are
having trouble converting it to work under CGI? What code have you got
so far?


-- 
David Dorward  http://dorward.me.uk


-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
<http://learn.perl.org/> <http://learn.perl.org/first-response>




RE: Filtering a file

2005-12-05 Thread Adedayo Adeyeye
You are a man of honour, Sean.

I maintain an account with Safari Online books (http://safari.informit.com),
which I find to be an invaluable resource for access to these books.

Kind regards

Dayo

-Original Message-
From: Sean Davis [mailto:[EMAIL PROTECTED] 
Sent: Monday, December 05, 2005 5:28 PM
To: Chris Devers
Cc: Adedayo Adeyeye; Perl Beginners - CGI List
Subject: Re: Filtering a file




On 12/5/05 9:47 AM, "Chris Devers" <[EMAIL PROTECTED]> wrote:

> On Mon, 5 Dec 2005, Sean Davis wrote:
> 
>> See here:
>> 
>> http://
> 
> Please do not link to this site.
> 
> These are pirated copies of the books in question, hosted on a Ukrainian
> web server without the authorization of the publishers or authors of the
> books in question.
> 
> There are legit ways to get access to these books, including O'Reilly's
> Safari book subscription service, your favorite local or online
> bookstores, and good old public libraries.
> 
> I'd have expected someone with a .gov address to be more cognizant of
> such flagrant circumvention of copyright law... :-)

To Chris and others on the list, I apologize.  More importantly, I should
apologize to Tom Christiansen and Nathan Torkington, the authors of the book
in question.  I was much too quick to post the link from google without
reading the copyright (quoted below) available as a link from the homepage
for the site.  This was unacceptable of me to do, particularly on a list
that claims to be giving advice to beginners.  Again, my sincerest
apologies.

Sean

Copyright notice:

The electronic versions of Perl in a Nutshell, Learning Perl, Learning Perl
on Win32 Systems, Programming Perl, Advanced Perl Programming, and Perl
Cookbook are copyright © 1999 by O'Reilly & Associates, Inc.; all rights
reserved.

This CD-ROM is intended for use by one individual. As such, you may make
copies for your own personal use. However, you may not provide copies to
others, or make this reference library available to others over a LAN or
other network. You may not reprint, offer for sale, or otherwise re-use
material from this book without the explicit written permission of O'Reilly
& Associates, Inc. If you would like to discuss licensing for multiple
users, contact [EMAIL PROTECTED]

You can purchase print editions of these books directly from O'Reilly &
Associates, Inc. (see http://www.oreilly.com/order_new/) or from bookstores
that carry O'Reilly & Associates books.




-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
<http://learn.perl.org/> <http://learn.perl.org/first-response>




Re: Filtering a file

2005-12-05 Thread Sean Davis



On 12/5/05 9:47 AM, "Chris Devers" <[EMAIL PROTECTED]> wrote:

> On Mon, 5 Dec 2005, Sean Davis wrote:
> 
>> See here:
>> 
>> http://
> 
> Please do not link to this site.
> 
> These are pirated copies of the books in question, hosted on a Ukrainian
> web server without the authorization of the publishers or authors of the
> books in question.
> 
> There are legit ways to get access to these books, including O'Reilly's
> Safari book subscription service, your favorite local or online
> bookstores, and good old public libraries.
> 
> I'd have expected someone with a .gov address to be more cognizant of
> such flagrant circumvention of copyright law... :-)

To Chris and others on the list, I apologize.  More importantly, I should
apologize to Tom Christiansen and Nathan Torkington, the authors of the book
in question.  I was much too quick to post the link from google without
reading the copyright (quoted below) available as a link from the homepage
for the site.  This was unacceptable of me to do, particularly on a list
that claims to be giving advice to beginners.  Again, my sincerest
apologies.

Sean

Copyright notice:

The electronic versions of Perl in a Nutshell, Learning Perl, Learning Perl
on Win32 Systems, Programming Perl, Advanced Perl Programming, and Perl
Cookbook are copyright © 1999 by O'Reilly & Associates, Inc.; all rights
reserved.

This CD-ROM is intended for use by one individual. As such, you may make
copies for your own personal use. However, you may not provide copies to
others, or make this reference library available to others over a LAN or
other network. You may not reprint, offer for sale, or otherwise re-use
material from this book without the explicit written permission of O'Reilly
& Associates, Inc. If you would like to discuss licensing for multiple
users, contact [EMAIL PROTECTED]

You can purchase print editions of these books directly from O'Reilly &
Associates, Inc. (see http://www.oreilly.com/order_new/) or from bookstores
that carry O'Reilly & Associates books.



--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
 




Re: Filtering a file

2005-12-05 Thread David Dorward
On Mon, Dec 05, 2005 at 02:20:33PM +0100, Adedayo Adeyeye wrote:
>How do I write a script to parse through this file and just return the
>unique names? I.e., I want the repetitions ignored.

What have you tried?  Where are you stuck? (Opening the file? Reading
the contents? The actual filtering?). Nothing in your question is CGI
related, have you got this working as a command line script but are
having trouble converting it to work under CGI? What code have you got
so far?


-- 
David Dorward  http://dorward.me.uk


-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
 




Re: Filtering a file

2005-12-05 Thread Chris Devers
On Mon, 5 Dec 2005, Sean Davis wrote:

> See here:
> 
> http://

Please do not link to this site. 

These are pirated copies of the books in question, hosted on a Ukrainian 
web server without the authorization of the publishers or authors of the 
books in question. 

There are legit ways to get access to these books, including O'Reilly's 
Safari book subscription service, your favorite local or online 
bookstores, and good old public libraries.

I'd have expected someone with a .gov address to be more cognizant of 
such flagrant circumvention of copyright law... :-)


-- 
Chris Devers

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
 


Re: Filtering a file

2005-12-05 Thread Sean Davis
On 12/5/05 8:20 AM, "Adedayo Adeyeye" <[EMAIL PROTECTED]> wrote:

> I have a file that contains a listing of names like:
> 
> 
> 
> John
> 
> Paul
> 
> Kate
> Paul
> 
> Charles
> 
> Kate
> 
> 
> 
> How do I write a script to parse through this file and just return the unique
> names? I.e., I want the repetitions ignored.

See here:

http://www.unix.org.ua/orelly/perl/cookbook/ch04_07.htm

I'm assuming you know how to read a text file.
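The usual Perl approach is a %seen hash: remember each name the first time it
appears and skip later repeats. A minimal sketch, assuming the names sit one
per line in a hypothetical file called names.txt:

#!/usr/bin/perl
use strict;
use warnings;

my %seen;
open my $in, '<', 'names.txt' or die "Cannot open names.txt: $!";
while (my $line = <$in>) {
    chomp $line;                # drop the trailing newline before comparing
    next if $seen{$line}++;     # name already printed, skip the repeat
    print "$line\n";
}
close $in;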

Sean


-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
 




Filtering a file

2005-12-05 Thread Adedayo Adeyeye

I have a file that contains a listing of names like:

John
Paul
Kate
Paul
Charles
Kate

How do I write a script to parse through this file and just
return the unique names? I.e., I want the repetitions ignored.

Kind regards

Adedayo Adeyeye
Engineering, Netcom Africa Limited
South Atlantic Petroleum Tower
7/F, 7 Adeola Odeku Street
Victoria Island, Lagos, Nigeria
[EMAIL PROTECTED] | http://www.netcomafrica.com
Tel: 234.1.461.1234 | Fax: 234.1.461.1235 | Mobile: 0802.501.3758 | Skype: crownd

The information contained in this communication is confidential
and may be legally privileged. It is intended solely for the use of the
individual or entity to whom it is addressed and others authorized to receive
it. If you are not the intended recipient, you are hereby notified that any
disclosure, copying, distribution or action taken in reliance on the contents
of this information is strictly prohibited and may be unlawful. Kindly destroy
this message and notify the sender by replying to this email in such instances. We
do not accept responsibility for any changes made to this message after it was
originally sent, or for any views, opinions, conclusions or other information in
this message which do not relate to the business of this firm or are not
authorized by us. Netcom is liable neither for the proper and complete
transmission of the information contained in this communication nor for any delay
in its receipt.