Maybe this would do it (as a shell script):

#! /bin/sh
for i in `find .`; do
        cp $i $i.old
        sed -e "s/MarketingWorks\/Collard Associates/MarketingWorks/" $i.old > $i
done

I am NOT a shell script guru, so there may well be a cleaner way to
write this, but I hope I get my idea across.  You might run it as
`script 2> /dev/null` because it will generate an error message for
every directory under ./ (the bare `find .` lists directories as well
as files, and `ls -R` prints directory names too).
With this pseudo-script, you also get backups of your HTML files
(with a .old extension) in case it screws up.  If it works, then you can
rm `find . -name '*.old'`
to recursively remove all the .old files.  I think.
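
If you'd rather skip the directory noise and the extra redirect, here's a
tighter variant - just a sketch, and I'm assuming your pages all end in
.html (adjust the -name pattern if they don't):

#! /bin/sh
# -type f restricts find to regular files, so directories never reach cp/sed
for i in `find . -type f -name '*.html'`; do
        cp $i $i.old
        sed -e "s/MarketingWorks\/Collard Associates/MarketingWorks/" $i.old > $i
done

Or, since you mentioned perl's s///, something like

find /home/httpd/marketingworks.co.uk -type f -name '*.html' \
        -exec perl -pi.old -e 's{MarketingWorks/Collard Associates}{MarketingWorks}g' {} \;

should do the substitution in place and leave .old backups for you, though
I haven't tested either of these against your tree.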

-Matt Stegman
<[EMAIL PROTECTED]>

On Thu, 2 Sep 1999, James Stewart wrote:

> Been investigating the problem I posted yesterday a bit more, and have a
> slightly better way of expressing it now ;)
> 
> What I want to do is like issuing:
> 
> grep -r 'MarketingWorks/Collard Associates' /home/httpd/marketingworks.co.uk
> 
> but instead of listing the results I want to replace it with simply
> MarketingWorks
> 
> Another way of looking at it is issuing perl's
> s/MarketingWorks\/Collard Associates/MarketingWorks/;
> 
> but recursively through the directory structure.
> 
> Can anyone suggest how to do this?
> 
> James.
> -- 
>     James Stewart     |          Britlinks         |  The Phantom Tollbooth
> [EMAIL PROTECTED] | http://www.britlinks.co.uk | http://www.tollbooth.org
> 
>      Sixpence None The Richer UK -- http://www.britlinks.co.uk/sixpence/
> 
