Actually, I would rather see a heck of a lot of new functions. I am
really fed up with limitations of GNU Make that could be solved very
easily with even one or two well-chosen additions.

Perhaps a warning when a makefile redefines a built-in function would be
the way to avoid throttling this project any more than it already is?

A few examples that would make my work *much* easier:
$(equal X,Y) # for use inside $(if); must work when X and Y may be blank
$(range 1,10) # 1 2 3 4 5 6 7 8 9 10
$(tolower ABCDE) # very hard, unpleasant, and slow to implement in pure make
$(info >>filename) # append info output to a file without invoking
$(shell) (also need > to create/truncate the file)
$(cdr LIST) # can perhaps be emulated with $(wordlist)/$(words) - see the
sketch below, though I am not sure
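
For what it's worth, a few of these can be approximated in pure make
today - which is exactly the collision risk Rob describes below. These
are untested sketches (the names are mine, "eq" is fragile when its
arguments contain commas, and "lc" has to be spelled out for all 26
letters):

    # eq: expands to "true" iff $1 and $2 are identical; blank-safe
    eq = $(if $(subst x$1,,x$2)$(subst x$2,,x$1),,true)

    # lc: lower-case a string, one $(subst) per letter (continue to Z)
    lc = $(subst A,a,$(subst B,b,$(subst C,c,$1)))

    # cdr: everything but the first word of a list
    cdr = $(wordlist 2,$(words $1),$1)

(Each would be used via $(call ...), e.g.
$(if $(call eq,$(A),$(B)),same,different).)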

I can't see the point of developing new versions if they can't have new
features.

Regards,

Tim

On 13 June 2011 17:01, Rob Walker <rwal...@qualcomm.com> wrote:

> Have you considered the backwards compatibility issues this patch might
> cause?
>
> The functions you've implemented for inclusion are implementable
> directly in the make language.  Therefore, there are very likely
> implementations floating around in existing makefiles, quite possibly
> with the names you've chosen but with slightly different semantics.
>
> When both make and a makefile define a function, $(call fn) always
> resolves to the built-in.
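>
> For example, a minimal sketch of the trap, using the existing built-in
> "word" to stand in for a hypothetical future built-in:
>
>     word = my own definition
>     $(info $(call word,2,a b c))    # prints "b": the built-in wins
>
> A makefile's private $(call relpath) would silently change behavior
> the day make ships a built-in of the same name.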
>
> Until make gives make programmers more control over their environment, I
> think a moratorium on new built-in functions might be a good idea.  Any
> seconds?
>
> -Rob
>
>
> On 6/12/11 2:42 PM, Ben Robinson wrote:
> > Hello,
> >
> > I thank the community for their feedback regarding my submission of
> > $(trimpath ...) and $(relpath ...).  The discussion has identified
> > many other benefits the submission would bring to GNU Make if
> > incorporated.  However, two outstanding issues remain:
> >
> > 1) The submission needs to be refactored to incorporate the coding style
> > and macros of 3.82 (it is currently based on 3.81).  I will proceed with
> > this effort as soon as I hear word that the submission is approved.
> >
> > 2) Handling of symbolic links.  The current implementation only analyzes
> > the string representation of the path, and does not touch the file
> > system.  This has two advantages: performance, and the fact that
> > resolving symbolic links can produce longer resultant paths when the
> > purpose of the link was to skip over numerous long directory names.
> > I would recommend the submission remain in its
> > current form, and if symbolic link expansion is needed, it be added to
> > another new function, such as $(resolve-symlinks...), as was previously
> > suggested.  In this way, the expansion of symbolic links can be
> > controlled by the user, when the user desires that functionality.
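> >
> > As a purely hypothetical illustration (the symlink and the proposed
> > semantics here are assumed for the example, not implemented
> > behavior):
> >
> >     # suppose /top/src is a symlink to /very/long/vendor/tree/src
> >     $(trimpath /top/src/../include)  # -> /top/include (string-only)
> >     # resolving the link first would instead traverse
> >     # /very/long/vendor/tree, yielding the much longer
> >     # /very/long/vendor/tree/include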
> >
> > I appreciate the community's feedback, and I look forward to hearing
> > approval for submission if appropriate.
> >
> > Thank you,
> >
> > Ben Robinson
> >
> >
> >
> > On Tue, May 31, 2011 at 9:37 AM, Howard Chu <h...@highlandsun.com
> > <mailto:h...@highlandsun.com>> wrote:
> >
> >     Edward Welbourne wrote:
> >
> >             Pretty weak. If a few more include paths were added to
> >             the project it would still break, regardless of your
> >             patch.
> >
> >
> >         When you have thousands of object files, shortening the
> >         name of each by several bytes when invoking ar can make
> >         room for really quite a lot more files before you run up
> >         against the command-line length limit.
> >
> >         Never underestimate the value of a modest shortening of
> >         file names - when our ar command-line got to about 230
> >         kibibytes, we had to re-work the way we invoked ar!
> >
> >
> >     Whatever you're doing, you're doing it wrong. ;)
> >
> >     If you're tossing a bunch of object files into an ar library simply
> >     for grouping purposes, and all of the object files will eventually
> >     be used in the final executable, it's faster to use ld -r to
> >     combine each group into a single object file. If you
> >     profiled your build I'm sure you would find that a large percentage
> >     of time is being wasted in ar itself.
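> >
> >     A rough sketch of that approach (object and target names here
> >     are illustrative only; the recipe line must start with a tab):
> >
> >         # partial link: merge a group of objects into a single
> >         # relocatable object instead of archiving them with ar
> >         GROUP1_OBJS := foo.o bar.o baz.o
> >         group1.o: $(GROUP1_OBJS)
> >                 ld -r -o $@ $^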
> >
> >     And if you're really running a build with thousands of object files,
> >     on a frequent basis, you're foolish not to profile it and figure out
> >     what else you're doing wrong. Giant monolithic command lines also
> >     prevent you from leveraging parallel make, so that's another reason
> >     not to do things that way...
> >
> >
> >     --
> >      -- Howard Chu
> >      CTO, Symas Corp.           http://www.symas.com
> >      Director, Highland Sun     http://highlandsun.com/hyc/
> >      Chief Architect, OpenLDAP  http://www.openldap.org/project/
> >
> >
> >
> >



-- 
You could help some brave and decent people to have access to uncensored
news by making a donation at:

http://www.thezimbabwean.co.uk/