Re: How to handle data/script files in a VPATH build ?
Steffen Dettmer wrote:
> Just because of my curiosity, when writing portable packages (i.e.
> packages compiling ON many platforms), on which platform is it
> recommended to test this? GNU/Linux is great for working because it
> has all the efficient tools, but bad for testing, because it is too
> powerful and everything works :) Or is there some `mini linux' or
> alike that uses e.g. busybox find etc? Then someone could have a
> virtual machine to test.

Solaris must come very high up on the priority list. Sun's tools are
often quite different from the GNU ones. Solaris is probably 4th in
popularity after Windows, OS X and Linux, and it is the most common real
Unix system.

I'm all for people testing code on other platforms. I can make an HP-UX
box available for limited testing. Due to power consumption I can't run
the thing 24/7 for 52 weeks a year, but I can power it up for a couple
of days if someone wants to test on HP-UX.

Dave
Re: advice for pre-generating documentation
On Fri, 2010-02-12 at 12:21 +0100, Steffen Dettmer wrote:
> On Thu, Feb 11, 2010 at 5:08 PM, Gaetan Nadon wrote:
> > generated using tools such as doxygen, asciidoc, xmlto, groff, and
> > ps2pdf. I can state some reasons why generated docs are included in
> > the tarball:
>
> This is interesting and many seem to agree here, but I think this is
> wrong.

I don't like it either; the makefile gets complicated and error-prone.
The main argument is that not all platforms have the tool, so users
there would not be able to install the doc. I think there are other ways
to handle such a situation, like publishing doc-only tarballs. The docs
are not binaries and don't have to be built on each platform. If you can
avoid putting generated doc files in the tarball, you will save yourself
the trouble.

> In my team we had bad experiences and now have these rules:
> - a file is either autogenerated OR in CVS/SVN, but not both
> - a file is either generated by make OR in the srcdist
> - generated files (doxygen, libs) are in the bindist only
> If someone wants to read the compiled documentation (be it HTML or
> PDF) or just use any other compiled object (such as a lib), he should
> use a bindist or a web browser.
>
> We use doxygen in most cases only for API documentation (I think this
> is the normal case), but our developers here usually seem never to
> read the HTML doc but read the .h (.java...) files instead.
>
> Including documentation may also be wrong when it is conditional, but
> this might be a special case.
>
> With CVS snapshots or the like, let's say you check out trunk HEAD and
> run make dist: the generated documentation also might be invalid, for
> example because required Change History information may not be filled
> in yet. I think typically someone can expect valid documentation only
> from released versions.
>
> I think with doxygen it is difficult to get the dependencies right; I
> don't know if it even works, so how do you update the documentation?
> Are you re-generating it when running make dist?
> > > 1) convenience
> > > 2) the target platform building the tarball does not have the tool
> > > 3) the target platform building the tarball has the tool, but at
> > > the wrong version
>
> 4) the target may have the tool but different (compatible or
> incompatible) dependencies, leading to different files being generated
> (might be less important for documentation).
>
> > > 3) unconditionally add html generated files in EXTRA_DIST: this
> > > will cause 'make dist' to fail if the platform creating a tarball
> > > does not have the doc tool.
>
> So in each srcdist you include all the HTML, PDF, man pages and
> whatever files are generated, right? Is this the intended "automake
> way"? I just looked at some arbitrary lib and it has 8 MB of sources
> but 20 MB of doc (just HTML+PDF), so it would bloat up the srcdist a
> lot... How do you avoid including outdated documentation?
>
> > EXTRA_DIST = all the html and doxygen files listed here
>
> Does this mean listing all files individually here? In my example case
> I had to list 1422 files, wow... But how to maintain that? If anyone
> documents some struct somewhere, a new file will be generated, but how
> do you know what to add to EXTRA_DIST? Do you create/use some include
> filelist.mak or so?
>
> oki,
>
> Steffen
Re: advice for pre-generating documentation
On Fri, 2010-02-12 at 12:59 +0100, Stefano Lattarini wrote:
> At Friday 12 February 2010, Braden McDaniel wrote:
> > Actually, EXTRA_DIST can pull in a whole subdirectory.
>
> Thank you for the information, I didn't know that (or I forgot it). I
> guess it's time for me to re-read the Automake documentation.
>
> > Wildcards work there as well.
>
> Mmhhh... technically, you are right on this point too (as can be seen
> by reading any Automake-generated makefile), but I couldn't find
> anything about this "feature" in the manual... Are you sure that
> wildcards in EXTRA_DIST are supported "by design" rather than as a
> side-effect of implementation details? In the latter case, it could be
> a bad idea to exploit that "feature".

I have no idea. It's worked that way for a long time. While in general I
agree with the Automake philosophy regarding wildcards, Doxygen
integration is one place where they make a whole lot of sense. I hope
the Automake maintainers don't decide to break this. But if they do, we
still have dist-hook.

--
Braden McDaniel
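[Editor's note: a minimal sketch of the dist-hook fallback mentioned
above. The `html' output directory, `Doxyfile' name, and targets are
assumptions for illustration, not taken from the thread.]

    # Hypothetical Makefile.am fragment: ship a whole generated doc tree
    # without listing each file in EXTRA_DIST.
    EXTRA_DIST = Doxyfile

    # Generate the HTML docs in the build tree (requires doxygen).
    html-local:
    	doxygen $(srcdir)/Doxyfile

    # Copy the generated tree into the distdir without naming each file.
    dist-hook:
    	cp -pR html $(distdir)/html

Unlike a wildcard in EXTRA_DIST, this keeps working even if Automake
ever stops expanding wildcards there.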
Re: How to handle data/script files in a VPATH build ?
Hi Ralf,

thanks again for your helpful message. It is interesting how many
mistakes (non-portable constructions) can be in such a small snippet I
wrote. Thanks for spotting them.

On Wed, Feb 10, 2010 at 9:29 PM, Ralf Wildenhues wrote:
> * Steffen Dettmer wrote on Wed, Feb 10, 2010 at 11:01:34AM CET:
> > module=xyz
> > _mak=${module}.mak
> > echo "# Autogenerated by $0" > ${_mak}
> > echo "${module}files = \\" >> ${_mak}
> > find directory1 directory2 directory3 -type f \
> >     -not -path '*/CVS*' \
>
> With find, -not is not portable, but ! is (suitably escaped when used
> in an interactive shell); -path is not portable either.

Ohh, how bad... I've read a bit about find in `info find' (I have no
`info findutils' page). It does not tell much about portability, but
still a lot... (I didn't find how to write -path portably; fortunately
we can require a GNU find :-)).

Just because of my curiosity, when writing portable packages (i.e.
packages compiling ON many platforms), on which platform is it
recommended to test this? GNU/Linux is great for working because it has
all the efficient tools, but bad for testing, because it is too powerful
and everything works :) Or is there some `mini linux' or alike that uses
e.g. busybox find etc? Then someone could have a virtual machine to
test.

> >     -not -path '*/*.sw?' \
> >     -exec echo "{} \\" \; \
>
> POSIX find only allows you to use {} as a single argument IIRC; you
> can just pass arguments to echo separately here: `-exec echo "{}" \\
> \;'.

Ohh, interesting, thanks. Yes, if I look at the right place
(http://www.opengroup.org/onlinepubs/9699919799/utilities/find.html)
and understand correctly, you do remember correctly:

    Historical implementations do not modify "{}" when it appears as a
    substring of an -exec or -ok utility_name or argument string. There
    have been numerous user requests for this extension, so this volume
    of POSIX.1-2008 allows the desired behavior.
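[Editor's note: a portable rewrite of such a snippet, assembled from
Ralf's hints. The directory names and the temporary path are
illustrative; only POSIX find operators are used (`!` instead of `-not`,
`-name`/`-prune` instead of GNU `-path`).]

```shell
# Portable sketch: list regular files under two directories, skipping
# CVS directories and vim swap files, using only POSIX find operators.
mkdir -p /tmp/pfind/dir1/CVS /tmp/pfind/dir2
touch /tmp/pfind/dir1/a.c /tmp/pfind/dir1/CVS/Entries
touch /tmp/pfind/dir2/b.c /tmp/pfind/dir2/.b.c.swp

# -name CVS -prune skips whole CVS directories (replacing -path '*/CVS*');
# ! -name '*.sw?' replaces -not -path '*/*.sw?' for plain file names.
find /tmp/pfind/dir1 /tmp/pfind/dir2 \
    -name CVS -prune -o \
    -type f ! -name '*.sw?' -print \
  | LC_ALL=C sort

rm -rf /tmp/pfind
```

This prints /tmp/pfind/dir1/a.c and /tmp/pfind/dir2/b.c only; the CVS
entry and the swap file are filtered out.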
    At least one recent implementation does support this feature, but
    encountered several problems in managing memory allocation and
    dealing with multiple occurrences of "{}" in a string while it was
    being developed, so it is not yet required behavior.

> > | sort >> ${_mak}
>
> With sort, you should always normalize the locale, i.e.,
>     LC_ALL=C sort

Is having `LC_COLLATE="POSIX"' also sufficient and correct? (We have
this in /etc/profile.) But I added export LC_ALL=C to all those scripts
to go sure :)

> Well, don't look at the GNU find(1) manpage if you're looking for
> portable options only. That's what POSIX/SUSv3 is for; the findutils
> info pages are more verbose about portability, too.

How do I get the findutils info pages? Is this `info find' or is there
another one?

oki,

Steffen
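[Editor's note: on the LC_COLLATE question — LC_ALL, when set, overrides
every other LC_* variable, so an exported LC_COLLATE from /etc/profile
does not help if LC_ALL is also set anywhere in the environment.
Prefixing the one command is the robust form; a tiny sketch:]

```shell
# LC_ALL takes precedence over LC_COLLATE and LANG, so setting it
# per-command guarantees byte-order (C locale) sorting regardless of
# what the caller's environment exports.
printf '%s\n' b A a B | LC_ALL=C sort
```

In the C locale all uppercase letters sort before lowercase, so this
prints A, B, a, b; under many UTF-8 locales the same input would come
out case-interleaved.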
Re: advice for pre-generating documentation
At Friday 12 February 2010, Braden McDaniel wrote:
> Actually, EXTRA_DIST can pull in a whole subdirectory.

Thank you for the information, I didn't know that (or I forgot it). I
guess it's time for me to re-read the Automake documentation.

> Wildcards work there as well.

Mmhhh... technically, you are right on this point too (as can be seen by
reading any Automake-generated makefile), but I couldn't find anything
about this "feature" in the manual... Are you sure that wildcards in
EXTRA_DIST are supported "by design" rather than as a side-effect of
implementation details? In the latter case, it could be a bad idea to
exploit that "feature".

Regards,
Stefano
Re: advice for pre-generating documentation
On Thu, Feb 11, 2010 at 6:06 PM, Andreas Jellinghaus wrote:
> also I wonder:
> what about builddir vs. sourcedir? how do you handle that?
> does automake handle that automatically?

make does handle it (at least GNU Make; I don't know about others). If
you have in Makefile.am, let's say:

    EXTRA_DIST = sourcefile.c sourcefile.o

    test: sourcefile.c sourcefile.o
    	@echo "$^"

it will work even if sourcefile.o is generated from sourcefile.c; both
will be in the srcdist. Make will set things like $^ correctly because
VPATH is considered, which means `make test' prints e.g.:

    stef...@host:/tmp/steffen/.../build/i386-gnulinux/test # make test
    ../../../../ccomm/test/sourcefile.c sourcefile.o

-- cool, ain't it? :-)

But I would never put generated sources into EXTRA_DIST, because in the
dist in this example you would find:

    .../test/sourcefile.c
    .../test/sourcefile.o

Then, if you compile from that dist with builddir != srcdir and the deps
enforce regeneration of sourcefile.o, you end up with two sourcefile.o
files: one in srcdir and the other in builddir, which (IMHO) must never
ever happen.

oki,

Steffen
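[Editor's note: the VPATH lookup described above can be seen with a
stand-alone sketch (GNU make assumed; all paths here are made up for the
demo): a prerequisite that exists only in the source directory shows up
in $^ with its VPATH-resolved path.]

```shell
# Demo of make's VPATH lookup. sourcefile.c exists only in src/, so $^
# reports it as ../src/sourcefile.c; a file present in the build dir
# would appear with a bare name instead.
mkdir -p /tmp/vp/src /tmp/vp/build
touch /tmp/vp/src/sourcefile.c
printf 'VPATH = ../src\ntest: sourcefile.c\n\t@echo "$^"\n' \
    > /tmp/vp/build/Makefile
( cd /tmp/vp/build && make -s test )
rm -rf /tmp/vp
```

This prints `../src/sourcefile.c`, mirroring the mixed-path $^ output in
the transcript above.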
Re: advice for pre-generating documentation
On Thu, Feb 11, 2010 at 5:08 PM, Gaetan Nadon wrote:
> generated using tools such as doxygen, asciidoc, xmlto, groff, and
> ps2pdf. I can state some reasons why generated docs are included in
> the tarball:

This is interesting and many seem to agree here, but I think this is
wrong. In my team we had bad experiences and now have these rules:

- a file is either autogenerated OR in CVS/SVN, but not both
- a file is either generated by make OR in the srcdist
- generated files (doxygen, libs) are in the bindist only

If someone wants to read the compiled documentation (be it HTML or PDF)
or just use any other compiled object (such as a lib), he should use a
bindist or a web browser.

We use doxygen in most cases only for API documentation (I think this is
the normal case), but our developers here usually seem never to read the
HTML doc but read the .h (.java...) files instead.

Including documentation may also be wrong when it is conditional, but
this might be a special case.

With CVS snapshots or the like, let's say you check out trunk HEAD and
run make dist: the generated documentation also might be invalid, for
example because required Change History information may not be filled in
yet. I think typically someone can expect valid documentation only from
released versions.

I think with doxygen it is difficult to get the dependencies right; I
don't know if it even works, so how do you update the documentation? Are
you re-generating it when running make dist?

> 1) convenience
> 2) the target platform building the tarball does not have the tool
> 3) the target platform building the tarball has the tool, but at the
> wrong version

4) the target may have the tool but different (compatible or
incompatible) dependencies, leading to different files being generated
(might be less important for documentation).

> 3) unconditionally add html generated files in EXTRA_DIST: this will
> cause 'make dist' to fail if the platform creating a tarball does not
> have the doc tool.
So in each srcdist you include all the HTML, PDF, man pages and whatever
files are generated, right? Is this the intended "automake way"? I just
looked at some arbitrary lib and it has 8 MB of sources but 20 MB of doc
(just HTML+PDF), so it would bloat up the srcdist a lot... How do you
avoid including outdated documentation?

> EXTRA_DIST = all the html and doxygen files listed here

Does this mean listing all files individually here? In my example case I
had to list 1422 files, wow... But how to maintain that? If anyone
documents some struct somewhere, a new file will be generated, but how
do you know what to add to EXTRA_DIST? Do you create/use some include
filelist.mak or so?

oki,

Steffen
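[Editor's note: one answer to the filelist.mak question is a small
regeneration script kept next to Makefile.am, which would then pull the
fragment in with `include filelist.mak'. The variable name `docfiles'
and the `html/' directory below are assumptions for illustration.]

```shell
# Hypothetical sketch: regenerate a filelist.mak fragment listing every
# generated doc file, so Makefile.am can say "EXTRA_DIST = $(docfiles)"
# after including it. Run again whenever doxygen adds new files.
mkdir -p /tmp/fl/html
touch /tmp/fl/html/index.html /tmp/fl/html/structfoo.html
cd /tmp/fl
{
  printf '# Autogenerated; do not edit.\n'
  printf 'docfiles ='
  # Sort in the C locale so the fragment is stable across machines.
  find html -type f -print | LC_ALL=C sort | while read -r f; do
    printf ' \\\n\t%s' "$f"
  done
  printf '\n'
} > filelist.mak
cat filelist.mak
cd /
rm -rf /tmp/fl
```

The fragment comes out as a backslash-continued make variable
(`docfiles = \`, then one tab-indented file per line), so nobody has to
maintain the 1422 entries by hand.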