Forrest Aldrich wrote:

> We use CVS to maintain our DNS data.  We also use a DNS regression
> suite (see http://www.quick.com.au/help/dns.html) to do pre-commit
> checks - very handy.
>
> The above toolset includes facilities for generating PTR records.
> If/when we re-configure said tool, it can cause a few hundred in-addr
> zone files to change.
>
> This typically causes the pre-commit checks to fail as the OS command
> line length is exceeded when cvs attempts to run the regression suite
> in that directory.
>
> The patch below (to cvs-1.10) modifies precommit_proc() such that if
> the pre-commit filter command begins with "xargs" or _PATH_XARGS, then
> it is run using run_popen() rather than run_exec().
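A minimal sketch of the distinction (the file names here are illustrative, not from the patch): with run_exec(), CVS appends every changed file to the filter's command line, which can exceed the OS argument-length limit; with run_popen(), CVS can write the file list to the filter's stdin, and an xargs-prefixed filter batches it into safely sized command lines.

```shell
# Simulated file list piped on stdin, as run_popen() would allow;
# xargs batches the names into command lines of bounded size.
printf '%s\n' zone1.db zone2.db zone3.db | xargs -n 100 echo checking
```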

Hmm.  I think the long-term fix for this is to use XML or CGI or the like to
always pass any data needed by the child on stdin.  There was a discussion on
this matter on one of the lists sometime in the last couple of months.

I don't like this as a long-term fix because the *info hooks should work in some
uniform manner, and, at the least, the verifymsg and loginfo hooks already accept
data on stdin.

I might not object to something like this as a temporary fix, though.  What do
other people think?

Derek

--
Derek Price                      CVS Solutions Architect ( http://CVSHome.org )
mailto:[EMAIL PROTECTED]     OpenAvenue ( http://OpenAvenue.com )
--
Information is the currency of democracy.

                        - Thomas Jefferson

_______________________________________________
Bug-cvs mailing list
[EMAIL PROTECTED]
http://mail.gnu.org/mailman/listinfo/bug-cvs