On Thu, Jan 5, 2012, at 11:01 AM, "ram ram" <ram_p...@yahoo.com>
scribbled:

> Thank you very much, Jim, for your immediate response.
> My requirement is that I have a standard installation on server1, which is
> the master copy. I know the location of the installation directory, which
> contains a bunch of libraries.
> I have another installation of the same application on server2, and I also
> know the location of its installation directory.
> I have to compare all the libraries (checksums) on server2 with the
> standard installation on server1, for all the files in the installation dir.
> I have to dynamically compute the checksum of each file on server2 and
> compare it with the corresponding file on server1.
> I am not sure which protocol suits best; right now I am considering NFS.
> Your help is greatly appreciated.

(Please do not top post on this list. If you do not know what this means,
please ask or look at <http://en.wikipedia.org/wiki/Posting_style>. Thanks.)


Using NFS (or SMB) will allow you to treat the files as if they were local.
Using SCP or FTP means that you will have to fetch the files first before
they can be compared.

Why do you need to compute checksums? If you are just trying to see if two
files are the same, you can compare them. For example, the Unix cmp program
will do that. The File::Compare module provides a Perl equivalent.
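
For example, a minimal sketch with File::Compare (the paths are made up, and
they assume the server1 tree is already mounted locally over NFS):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Compare;

    # Placeholder paths: the server1 copy mounted at /mnt/server1,
    # and the local copy on server2.
    my $master = '/mnt/server1/app/lib/libfoo.so';
    my $local  = '/opt/app/lib/libfoo.so';

    my $result = compare($master, $local);   # 0 = same, 1 = differ, -1 = error
    if    ($result == 0) { print "identical\n" }
    elsif ($result == 1) { print "different\n" }
    else                 { die "comparison failed: $!" }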

Checksums will be useful if generating and saving them allow you to avoid
comparing long files more than once. But if you are not in control of the
library files on the two servers, then you cannot be sure whether a file has
changed, and you will have to read it again to recompute its checksum.
Checksums would also be useful if you
had more than two servers to compare. You could compute the checksums once
and then make multiple comparisons of the various checksums instead of
comparing the files themselves.
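
If you do go the checksum route, Digest::SHA (or Digest::MD5) from the core
distribution will compute a digest for you. A rough sketch, with a
placeholder path:

    use strict;
    use warnings;
    use Digest::SHA;

    # Compute a SHA-256 digest of one file. In practice you would run
    # this for each library and save the digests for later comparison.
    sub file_digest {
        my ($path) = @_;
        my $sha = Digest::SHA->new(256);
        $sha->addfile($path);        # reads the whole file
        return $sha->hexdigest;
    }

    print file_digest('/opt/app/lib/libfoo.so'), "\n";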

However, if all you are doing is comparing the same file on two different
servers, computing the checksums doesn't buy you a lot. You still have to
read the whole file in order to compute a valid checksum.

Whatever scheme you use, your goal should be to only read each file once.

You might want to compare file sizes first. If two files are different
sizes, then they are not exact copies, and you do not need to read the
entire files. If two files are the same size, then they might be the same or
they might be different, and you will have to compare the files (or
checksums).
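
Something along these lines would work for the size pre-check (again, the
paths are only placeholders):

    use strict;
    use warnings;

    my $master = '/mnt/server1/app/lib/libfoo.so';
    my $copy   = '/opt/app/lib/libfoo.so';

    # If the sizes differ, the files cannot be identical, so there is
    # no need to read either of them.
    if (-s $master != -s $copy) {
        print "different (sizes do not match)\n";
    }
    else {
        # Same size: fall back to a byte-by-byte comparison or a checksum.
        print "same size; full comparison still needed\n";
    }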

If you do use NFS, then the details of fetching the files will be handled by
the operating system. In that case, I would suggest the use of the
File::Find module for finding the files on the two servers.
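
Putting the pieces together, a rough and untested sketch might look like the
following. It walks the local installation directory with File::Find and
compares each file against the same relative path in the master tree mounted
over NFS; both root paths are assumptions you would need to change:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;
    use File::Compare;

    my $local_root  = '/opt/app';          # installation dir on server2
    my $master_root = '/mnt/server1/app';  # server1 tree mounted via NFS

    find(sub {
        return unless -f $_;               # skip directories and specials

        # Build the path to the corresponding file in the master tree.
        (my $rel = $File::Find::name) =~ s/^\Q$local_root\E//;
        my $master = $master_root . $rel;

        if (! -e $master) {
            print "missing on master: $rel\n";
        }
        elsif (-s $_ != -s $master || compare($_, $master) != 0) {
            print "differs: $rel\n";
        }
    }, $local_root);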

How often do the files change? The answer may affect your choice of method.


