On Wed, 6 Sep 2000, MacFarlane, Jarrod wrote:

> This outputs something like:
> www.reallynaughtysite.com
> www.smackmeimbad.com
> and so on....
> 
> The problem is that it has many double-ups... is there a long, confusing
> string of commands that will go through my logfile and remove all but one
> instance of every domain listed?

No, but there are some simple ones. First sort the output so that identical
names end up next to each other, then run it through uniq to remove the
duplicates (uniq only collapses adjacent duplicate lines, which is why the
sort has to come first).

e.g.

ls > foo        # write a directory listing to foo
ls >> foo       # append the same listing again, so every name appears twice
cat foo | sort | uniq   # sorted and de-duplicated: each name printed once

will give only one entry for each file, not two. So just add

 | sort | uniq

to the end of your existing command.
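
For instance, assuming the domain list comes out of something like the
made-up awk stage below (the field number and the proxy.log name are purely
illustrative stand-ins for whatever you already run):

awk '{print $3}' proxy.log | sort | uniq   # field number and filename are illustrative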

This assumes that you don't mind mucking up the order. If you want to
preserve the order, then a simple perl one-liner will do it; see below.
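
For example, this classic one-liner prints each line only the first time it
turns up (a sketch, not tested against your log; yourlogfile is a
placeholder):

perl -ne 'print unless $seen{$_}++' yourlogfile   # yourlogfile is a placeholder

The %seen hash records each line on first sight, so later repeats are
skipped and the original order is preserved.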

Rodos


