On Sun Oct 25 1998 15:31, Tony Nugent wrote:

[I love talking to myself:-]

> On Sat Oct 24 1998 20:06, Coran Fisher aka The Doctor wrote:
> 
> > I have a silly question on grep usage.  I'm interested in a way for
> > grep to search the content of files in directory chains. I was wondering
> > if someone could clue me in on a way to do this either with a switch I haven't
> > figured out or by piping if necessary.  I have used find piped with grep to
> > search for files so I am aware of this particular ability.
> 
> There's more than one way to do this.  An example might be useful:
> 
> % find /usr/include -follow -name \*.h | xargs grep time /dev/null
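(A note on that `/dev/null` at the end, since it puzzles people: when xargs hands grep only a single filename, grep omits the `filename:` prefix on matches; tacking on /dev/null as a dummy second file forces the prefix to always appear.  A quick demonstration with a throwaway file — the path and contents are just for illustration:)

```shell
# Create a scratch header file to grep (hypothetical content).
tmpdir=$(mktemp -d)
echo 'struct timeval tv;' > "$tmpdir/demo.h"

# With only one file argument, grep prints the bare matching line:
without=$(grep time "$tmpdir/demo.h")

# With /dev/null as a dummy second file, grep adds "filename:" prefixes:
with=$(grep time "$tmpdir/demo.h" /dev/null)

echo "$without"
echo "$with"
rm -rf "$tmpdir"
```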

I just had cause to do something similar to this, but it involved editing a
whole bunch of html files to replace all instances of a URL address with
another.  While I was at it I also wanted to add a "background" image to
the <BODY> tag.

Pretty boring job to do by hand, one file at a time.

I thought about using sed, but I'd have to do it from a shell script (or
bash function) because I'd have to filter each file, save the output
into a temporary file, then copy this over the top of the original one.
Tricky and messy.
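(For the record, the loop I was trying to avoid would have looked roughly like this — shown here run against a scratch directory so it's harmless; the filenames and content are made up.  Note too that plain sed has no /i flag, so unlike the perl version below it would miss a lowercase <body>.)

```shell
# Scratch directory with one hypothetical html file.
workdir=$(mktemp -d)
echo '<BODY>' > "$workdir/page.html"

# The temp-file shuffle: filter each file through sed into a temporary
# copy, then move the copy over the top of the original.
for f in $(find "$workdir" -name '*.html' -print); do
    sed 's/<BODY>/<BODY background=mat3.jpg>/' "$f" > "$f.tmp" &&
    mv "$f.tmp" "$f"
done

result=$(cat "$workdir/page.html")
echo "$result"
rm -rf "$workdir"
```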

But then perl came to the rescue...

% find . -name \*.html -print | \
    xargs perl -pi -e 's/<BODY>/<BODY background=mat3.jpg>/i'

Magical stuff, eh?  :)

BTW, see the man pages for perlrun (for -pi) and perlop (for the regexp) to
see how this works.

(Tet: hey mate, note the -print... happy now? :)

Cheers
Tony
 -=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-
  Tony Nugent <[EMAIL PROTECTED]>           <[EMAIL PROTECTED]>
  Computer Support Officer                       Faculty of Science
  University of Southern Queensland, Toowoomba Queensland Australia
 -=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-