Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-03 Thread hiro
Yeah, marketing: forcing you to buy shit you will never need.

On 2/3/10, sta...@cs.tu-berlin.de  wrote:
> * David Tweed  [2010-02-03 08:32]:
>> On Wed, Feb 3, 2010 at 4:15 AM, Noah Birnel  wrote:
>> > On Mon, Feb 01, 2010 at 04:49:52PM +0100, sta...@cs.tu-berlin.de wrote:
>> >
>> >>... a mobile phone with integrated camera,
>> >> touch screen, 'apps' for learning languages, etc. is as much suckless
>> >> as an
>> >> axe with a door bell, toilet paper and nuclear power generator.
>> >>
>> > At this point a mobile phone is a general purpose portable computer. The
>> > camera is no more out of line than the speakers hooked up to your home
>> > box.
>>
>> I partly think it's a perception shaped by marketing. You can still buy a
>> mobile phone that only has voice functions. You can also buy a more
>> general communications/entertainment node device which has a host of
>> hardware and software that's all appropriate to that usage, including
>> as one component making voice calls. The only problem is that they're
>> still marketed as "phones"
>
> Perfectly right.
>
>> (I've never subscribed to the philosophy that an entity should "do one
>> thing well" but rather that "there should not be non-orthogonal
>> capabilities in the same entity". If you're into that sort of thing, I
>> don't see any reason why you'd consider mobile photo-taking, internet
>> browsing, casual entertainment games, etc., to be non-orthogonal to
>> chatting to friends: they're all ways to entertain yourself while not
>> at home.)
>
> Yes. I have no problem with integrated devices as long as they agree with
> the above philosophy, *and* as long as integration is not the cause for
> preventing them from fulfilling their purpose. What is the benefit of a phone
> which has to be recharged every day, just because it is capable of playing
> music (not because you listen to music every day)? Zero. Synergy^{-1}
>
> In addition, I definitely prefer to be able to take my phone, camera, and
> portable computing device separately, when I need only one of them, with
> sane battery life, dimensions, etc.
>
> So, the solution is modular design with clear and simple interfaces. And
> this is possible. But we come again to the marketing division -- they would
> rather buy the license for such a design, put it in a safe in the basement,
> and rest easy knowing that nobody will come up with something that cuts their
> sales a bit, even if it would make the world better. There are many examples
> of this.
>
> --
>  stanio_
>
>



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-03 Thread hiro
> The over-equipped mobile phone fails at being a general-purpose portable
> *mobile* computer due to the battery life, for example. Having acceptable
> battery life with the same equipment would result in other
> weight/dimensions which would not fit people's understanding of mobile
> or portable ...
>
> --
>  stanio_
>
>

perhaps you should get a better phone



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-03 Thread stanio
* David Tweed  [2010-02-03 08:32]:
> On Wed, Feb 3, 2010 at 4:15 AM, Noah Birnel  wrote:
> > On Mon, Feb 01, 2010 at 04:49:52PM +0100, sta...@cs.tu-berlin.de wrote:
> >
> >>... a mobile phone with integrated camera,
> >> touch screen, 'apps' for learning languages, etc. is as much suckless as an
> >> axe with a door bell, toilet paper and nuclear power generator.
> >>
> > At this point a mobile phone is a general purpose portable computer. The
> > camera is no more out of line than the speakers hooked up to your home
> > box.
> 
> I partly think it's a perception shaped by marketing. You can still buy a
> mobile phone that only has voice functions. You can also buy a more
> general communications/entertainment node device which has a host of
> hardware and software that's all appropriate to that usage, including
> as one component making voice calls. The only problem is that they're
> still marketed as "phones" 

Perfectly right.

> (I've never subscribed to the philosophy that an entity should "do one
> thing well" but rather that "there should not be non-orthogonal
> capabilities in the same entity". If you're into that sort of thing, I
> don't see any reason why you'd consider mobile photo-taking, internet
> browsing, casual entertainment games, etc., to be non-orthogonal to
> chatting to friends: they're all ways to entertain yourself while not
> at home.)

Yes. I have no problem with integrated devices as long as they agree with
the above philosophy, *and* as long as integration is not the cause for
preventing them from fulfilling their purpose. What is the benefit of a phone
which has to be recharged every day, just because it is capable of playing
music (not because you listen to music every day)? Zero. Synergy^{-1}

In addition, I definitely prefer to be able to take my phone, camera, and
portable computing device separately, when I need only one of them, with
sane battery life, dimensions, etc.

So, the solution is modular design with clear and simple interfaces. And
this is possible. But we come again to the marketing division -- they would
rather buy the license for such a design, put it in a safe in the basement,
and rest easy knowing that nobody will come up with something that cuts their
sales a bit, even if it would make the world better. There are many examples
of this.

-- 
 stanio_



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-03 Thread stanio
* Noah Birnel  [2010-02-03 05:46]:
> On Mon, Feb 01, 2010 at 04:49:52PM +0100, sta...@cs.tu-berlin.de wrote:
>  
> >... a mobile phone with integrated camera,
> > touch screen, 'apps' for learning languages, etc. is as much suckless as an
> > axe with a door bell, toilet paper and nuclear power generator. 
> > 
> At this point a mobile phone is a general purpose portable computer. The
> camera is no more out of line than the speakers hooked up to your home
> box.

The over-equipped mobile phone fails at being a general-purpose portable
*mobile* computer due to the battery life, for example. Having acceptable
battery life with the same equipment would result in other
weight/dimensions which would not fit people's understanding of mobile
or portable ...

-- 
 stanio_



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-02 Thread David Tweed
On Wed, Feb 3, 2010 at 4:15 AM, Noah Birnel  wrote:
> On Mon, Feb 01, 2010 at 04:49:52PM +0100, sta...@cs.tu-berlin.de wrote:
>
>>... a mobile phone with integrated camera,
>> touch screen, 'apps' for learning languages, etc. is as much suckless as an
>> axe with a door bell, toilet paper and nuclear power generator.
>>
> At this point a mobile phone is a general purpose portable computer. The
> camera is no more out of line than the speakers hooked up to your home
> box.

I partly think it's a perception shaped by marketing. You can still buy a
mobile phone that only has voice functions. You can also buy a more
general communications/entertainment node device which has a host of
hardware and software that's all appropriate to that usage, including
as one component making voice calls. The only problem is that they're
still marketed as "phones", which seems to cause cognitive problems for
some people, who seem to take it as gospel that the label marketers use
must be right and therefore that the device is wrong, rather than vice
versa.

(I've never subscribed to the philosophy that an entity should "do one
thing well" but rather that "there should not be non-orthogonal
capabilities in the same entity". If you're into that sort of thing, I
don't see any reason why you'd consider mobile photo-taking, internet
browsing, casual entertainment games, etc., to be non-orthogonal to
chatting to friends: they're all ways to entertain yourself while not
at home.)

-- 
cheers, dave tweed__
computer vision researcher: david.tw...@gmail.com
"while having code so boring anyone can maintain it, use Python." --
attempted insult seen on slashdot



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-02 Thread Noah Birnel
On Mon, Feb 01, 2010 at 04:49:52PM +0100, sta...@cs.tu-berlin.de wrote:
 
>... a mobile phone with integrated camera,
> touch screen, 'apps' for learning languages, etc. is as much suckless as an
> axe with a door bell, toilet paper and nuclear power generator. 
> 
At this point a mobile phone is a general purpose portable computer. The
camera is no more out of line than the speakers hooked up to your home
box.

--Noah



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-02 Thread hiro
what the fuck is all that pcp about anyway?

On Tue, Feb 2, 2010 at 11:04 PM, Chris Palmer  wrote:
> Jacob Todd writes:
>
>> > Maybe he's that rude because the plan9-colors make him aggressive ;)
>>
>> I think it has to do with people being fucking dumb.
>
> Let's not reject the PCP hypothesis out of hand, now.
>
>
>



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-02 Thread Chris Palmer
Jacob Todd writes:

> > Maybe he's that rude because the plan9-colors make him aggressive ;)
> 
> I think it has to do with people being fucking dumb.

Let's not reject the PCP hypothesis out of hand, now.




Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-02 Thread Jacob Todd
On Tue, Feb 02, 2010 at 03:57:05PM +0100, Moritz Wilhelmy wrote:
> Maybe he's that rude because the plan9-colors make him aggressive ;)
>

I think it has to do with people being fucking dumb.

-- 
I am a man who does not exist for others.




Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-02 Thread Moritz Wilhelmy
On Tue, Feb 02, 2010 at 12:09:28AM +0100, Nicolai Waniek wrote:
> On 02/01/2010 10:25 PM, Uriel wrote:
> > If you define your personal identity based on the colors of your
> > fucking window manager I feel sorry for your pathetic worthless life.
> 
> This is not the first time that you confuse cause and result. Additionally, 
> you
> seem to not have a fucking clue about people's different perception of the
> world around them (e.g. emotional/biophysical influence on colors) or the most
> mundane knowledge of psychophysics in general.
> 
> You should definitely stop talking about this topic if you don't want to
> ridicule yourself anymore.
> 
> 

Maybe he's that rude because the plan9-colors make him aggressive ;)



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-02 Thread julien steinhauser
On Tue, Feb 02, 2010 at 09:57:04AM +0100, Antoni Grzymala wrote:
> 
> Anselm R Garbe dixit (2010-02-02, 08:05):
> 
> > On 1 February 2010 23:56, Antoni Grzymala  wrote:
> > > Well, a while ago I saw a back-to-front Trabant on the streets of
> > > Warsaw, a quick google and here you go:
> > >
> > > http://autofoto.pl/blogs/prezes/archive/2009/05/11/trabant-je-d-cy-ty-em.aspx
> > >
> > > http://piotr.biegala.pl/foto/displayimage.php?pid=342&fullsize=1
> > > http://piotr.biegala.pl/foto/displayimage.php?pid=343&fullsize=1
> > 
> > I'd consider this guy to be living on the edge, because anyone who has
> > seen lots of Trabis on the streets will brake when he sees this
> > vehicle or join left-hand traffic ;)
> 
> Hehe... Those who have seen lots of Trabbies are on the verge of
> extinction these days (yeah, I know they're *not*).
> 
> -- 
> [a]
> 
If you've ever been in Chemnitz (Saxony),
you know that Trabbies are not dead.

I've seen a lot of normal Trabbies there and a few customized ones.
I don't have photos at hand, but I remember one which had
the rear half cut off (except for the wheels) and replaced
with a huge grill that was bigger than the front part of the car.
It was still drivable and was named the "Wurstmobil" or something.

But what you linked in the photos is much funnier, thank you :)





Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-02 Thread Claudio M. Alessi

I meant werc.rc, as you can imagine.


-- 
JID: smoppy AT gmail.com
WWW: http://cma.teroristi.org



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-02 Thread pancake

Uriel wrote:
> On Tue, Feb 2, 2010 at 12:23 AM, pancake  wrote:
>> On Mon, 1 Feb 2010 22:31:38 +0100
>> Uriel  wrote:
>>> On Mon, Feb 1, 2010 at 2:20 PM, pancake  wrote:
>>>> anonymous wrote:
>>>>>> Having said that, in case of rfork vice versa from FreeBSD.
>>>>>
>>>>> Yes, I am talking about FreeBSD. With configure you can make your
>>>>> program portable between FreeBSD and Linux. Most probably other
>>>>> systems won't implement clone/rfork their own way so program will be
>>>>> portable between all systems with some kind of rfork implementation.
>>>>
>>>> in that specific case i would prefer to use __FreeBSD__ ifdef instead of a
>>>> configure stage.
>>>
>>> This is totally and completely RETARDED. #ifdefs are a disgrace and
>>> people that use them should be shot on sight.
>>
>> if you deny ifdefs for minimal portability fixes and deny configure options
>> to specify OS or way to compile this program you are denying also portability
>> and incrementing the complexity in development and structuration.
>
> This claim is patently ridiculous and wrong. As The Practice of
> Programming points out, the only proper way to write portable code is
> to restrict yourself to the shared subset of interfaces available on
> all desired platforms; this certainly *reduces* complexity, and there
> are tons of software out there that use this approach and work just
> fine on pretty much any platform imaginable.
>
> Hell, even dwm has no ifdefs or configuration step (or it didn't until
> stupid XINERAMA support was added).
>
> If due to the nature of the app one *really* needs to access
> system-specific APIs (this is much rarer than people claim, to the
> point that I had trouble finding an example) there are perfectly fine
> ways to do this without using a retarded configuration step or insane
> #ifdefs; for an example of how to do this see drawterm:
> http://code.swtch.com/drawterm/src/ (and yes, drawterm has a handful
> of ifdefs, but most of them are either to comment out code or to enable
> some compiler-specific pragmas, and the rest should be done away with
> and as far as I can tell were added by people that didn't quite know
> how to do things properly).

Congratulations for the first coherent non-troll response :)

(Yeah! I did it!)

As read in drawterm's README:
---

To build on Unix, run CONF=unix make.

To build on Solaris using Sun cc, run CONF=sun make.

---

So... there's a configure stage to define the kind of OS, which is in fact
a non-standard way to do it. Which sucks, because it makes the developer
lose time reading human-friendly texts to understand how to build it
instead of letting the system figure it out by itself.

--pancake
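[Editor's note: the CONF selection quoted above could, in principle, be guessed by the build itself. The fragment below is a hypothetical GNU-make sketch, not drawterm's actual Makefile; it defaults CONF from uname while still allowing the documented manual override.]

```make
# Hypothetical sketch (GNU make assumed): pick a default CONF from the host
# OS so a bare `make` works, while `make CONF=sun` still overrides it.
UNAME := $(shell uname -s)
ifeq ($(UNAME),SunOS)
CONF ?= sun
else
CONF ?= unix
endif

# one platform-specific object selected per build
OBJS = main.o os_$(CONF).o
```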



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-02 Thread Antoni Grzymala
Anselm R Garbe dixit (2010-02-02, 08:05):

> On 1 February 2010 23:56, Antoni Grzymala  wrote:
> > Well, a while ago I saw a back-to-front Trabant on the streets of
> > Warsaw, a quick google and here you go:
> >
> > http://autofoto.pl/blogs/prezes/archive/2009/05/11/trabant-je-d-cy-ty-em.aspx
> >
> > http://piotr.biegala.pl/foto/displayimage.php?pid=342&fullsize=1
> > http://piotr.biegala.pl/foto/displayimage.php?pid=343&fullsize=1
> 
> I'd consider this guy to be living on the edge, because anyone who has
> seen lots of Trabis on the streets will brake when he sees this
> vehicle or join left-hand traffic ;)

Hehe... Those who have seen lots of Trabbies are on the verge of
extinction these days (yeah, I know they're *not*).

-- 
[a]



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-02 Thread Anselm R Garbe
On 1 February 2010 23:56, Antoni Grzymala  wrote:
> Well, a while ago I saw a back-to-front Trabant on the streets of
> Warsaw, a quick google and here you go:
>
> http://autofoto.pl/blogs/prezes/archive/2009/05/11/trabant-je-d-cy-ty-em.aspx
>
> http://piotr.biegala.pl/foto/displayimage.php?pid=342&fullsize=1
> http://piotr.biegala.pl/foto/displayimage.php?pid=343&fullsize=1

I'd consider this guy to be living on the edge, because anyone who has
seen lots of Trabis on the streets will brake when he sees this
vehicle or join left-hand traffic ;)

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Uriel
On Tue, Feb 2, 2010 at 12:23 AM, pancake  wrote:
> On Mon, 1 Feb 2010 22:31:38 +0100
> Uriel  wrote:
>
>> On Mon, Feb 1, 2010 at 2:20 PM, pancake  wrote:
>> > anonymous wrote:
>> >>>
>> >>> Having said that, in case of rfork vice versa from FreeBSD.
>> >>>
>> >>
>> >> Yes, I am talking about FreeBSD. With configure you can make your
>> >> program portable between FreeBSD and Linux. Most probably other
>> >> systems won't implement clone/rfork their own way so program will be
>> >> portable between all systems with some kind of rfork implementation.
>> >>
>> >
>> > in that specific case i would prefer to use __FreeBSD__ ifdef instead of a
>> > configure stage.
>>
>> This is totally and completely RETARDED. #ifdefs are a disgrace and
>> people that use them should be shot on sight.
>
> if you deny ifdefs for minimal portability fixes and deny configure options
> to specify OS or way to compile this program you are denying also portability
> and incrementing the complexity in development and structuration.

This claim is patently ridiculous and wrong. As The Practice of
Programming points out the only proper way to write portable code is
to restrict yourself to the shared subset of interfaces available on
all desired platforms, this certainly *reduces* complexity, and there
are tons of software out there that use this approach and work just
fine on pretty much any platform imaginable.

Hell even dwm has no ifdefs or configuration step (or it didn't until
stupid XINERAMA support was added).

If due to the nature of the app one *really* needs to access
system-specific APIs (this is much rarer than people claim, to the
point that I had trouble finding an example) there are perfectly fine
ways to do this without using a retarded configuration step or insane
#ifdefs, for an example of how to do this see drawterm:
http://code.swtch.com/drawterm/src/ (and yes, drawterm has a handful
of ifdefs but most of them are either to comment out code or to enable
some compiler specific pragmas, and the rest should be done away with
and as far as I can tell were added by people that didn't quite know
how to do things properly).

uriel



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Uriel
On Tue, Feb 2, 2010 at 12:09 AM, Nicolai Waniek  wrote:
> On 02/01/2010 10:25 PM, Uriel wrote:
>> If you define your personal identity based on the colors of your
>> fucking window manager I feel sorry for your pathetic worthless life.
>
> This is not the first time that you confuse cause and result. Additionally, 
> you
> seem to not have a fucking clue about people's different perception of the
> world around them (e.g. emotional/biophysical influence on colors) or the most
> mundane knowledge of psychophysics in general.
>
> You should definitely stop talking about this topic if you don't want to
> ridicule yourself anymore.

And you are so dumb that you are intellectually incapable of
differentiating between the aesthetics of a tool and artistic
expression; in both cases aesthetics are very important, but in very
different ways.

If you care about art you put up real paintings on your wall, you
don't spend your life treating your fucking window manager the way
some nitwit teenage girl treats the cover of their crap cellphone.

uriel



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Uriel
On Mon, Feb 1, 2010 at 10:46 PM, Charlie Kester  wrote:
> On Mon 01 Feb 2010 at 13:30:00 PST Uriel wrote:
>>
>> On Mon, Feb 1, 2010 at 2:06 PM, anonymous  wrote:

 Having said that, in case of rfork vice versa from FreeBSD.
>>>
>>> Yes, I am talking about FreeBSD. With configure you can make your
>>> program portable between FreeBSD and Linux. Most probably other
>>> systems won't implement clone/rfork their own way so program will be
>>> portable between all systems with some kind of rfork implementation.
>>
>> This is bullshit, one of the reasons I gave up using FreeBSD long ago
>> is because so much crap software that used auto*hell would blow up
>> when trying to build it on FreeBSD, and trying to fix up auto*hell so
>> the damned thing would build was a fucking nightmare.
>
> Perhaps that was a problem "long ago" but it doesn't seem to be a
> problem *now*.  I've been using FreeBSD since release 7.0 and have never
> had a problem with configure.

How many apps have you installed *not* from the ports tree that use
auto*hell? Note that this issue has *zero* to do with FreeBSD and
everything to do with braindead auto*hell scripts.

That said, FreeBSD has been an ever-growing pile of shit since the 4.x series.

uriel



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Joseph Xu
On Tue, Feb 02, 2010 at 12:09:28AM +0100, Nicolai Waniek wrote:
> On 02/01/2010 10:25 PM, Uriel wrote:
> > If you define your personal identity based on the colors of your
> > fucking window manager I feel sorry for your pathetic worthless life.
> 
> This is not the first time that you confuse cause and result. Additionally, 
> you
> seem to not have a fucking clue about people's different perception of the
> world around them (e.g. emotional/biophysical influence on colors) or the most
> mundane knowledge of psychophysics in general.
> 
> You should definitely stop talking about this topic if you don't want to
> ridicule yourself anymore.
> 

I remember tweaking my Enlightenment theme and background every 10
minutes instead of getting work done in college, so it doesn't really
seem like a productivity-enhancing feature. Plus, configurable themes
mean extra software bloat.

Please excuse my lack of knowledge of psychophysics in general.



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Antoni Grzymala
Chris Palmer dixit (2010-02-01, 15:48):

> Anselm R Garbe writes:
> 
> > "[...] as even refueling the car required lifting the hood, filling the
> > tank with gasoline (only 24 litres[1]), then adding two-stroke oil and
> > shaking it back and forth to mix."
> 
> Never mind that bit of compile-time configuration -- look at this filth!
> 
> http://en.wikipedia.org/wiki/File:Trabant_RS02%28ThKraft%29.jpg
> 
> People who paint their cars should be stabbed to death! With a configure
> script! I love milking cows that have a single pre-determined pattern of
> black and white spots!! CAKE LACED WITH PCP FOR MY BIRTHDAY!!

Well, a while ago I saw a back-to-front Trabant on the streets of
Warsaw, a quick google and here you go:

http://autofoto.pl/blogs/prezes/archive/2009/05/11/trabant-je-d-cy-ty-em.aspx

http://piotr.biegala.pl/foto/displayimage.php?pid=342&fullsize=1
http://piotr.biegala.pl/foto/displayimage.php?pid=343&fullsize=1

-- 
[a]



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Chris Palmer
Anselm R Garbe writes:

> "[...] as even refueling the car required lifting the hood, filling the
> tank with gasoline (only 24 litres[1]), then adding two-stroke oil and
> shaking it back and forth to mix."

Never mind that bit of compile-time configuration -- look at this filth!

http://en.wikipedia.org/wiki/File:Trabant_RS02%28ThKraft%29.jpg

People who paint their cars should be stabbed to death! With a configure
script! I love milking cows that have a single pre-determined pattern of
black and white spots!! CAKE LACED WITH PCP FOR MY BIRTHDAY!!




Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Chris Palmer
Uriel writes:

> If you define your personal identity based on the colors of your fucking
> window manager I feel sorry for your pathetic worthless life.

Have you considered smoking less PCP?




Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread pancake
On Mon, 1 Feb 2010 22:31:38 +0100
Uriel  wrote:

> On Mon, Feb 1, 2010 at 2:20 PM, pancake  wrote:
> > anonymous wrote:
> >>>
> >>> Having said that, in case of rfork vice versa from FreeBSD.
> >>>
> >>
> >> Yes, I am talking about FreeBSD. With configure you can make your
> >> program portable between FreeBSD and Linux. Most probably other
> >> systems won't implement clone/rfork their own way so program will be
> >> portable between all systems with some kind of rfork implementation.
> >>
> >
> > in that specific case i would prefer to use __FreeBSD__ ifdef instead of a
> > configure stage.
> 
> This is totally and completely RETARDED. #ifdefs are a disgrace and
> people that use them should be shot on sight.

if you deny ifdefs for minimal portability fixes and also deny configure
options to specify the OS or the way to compile the program, you are denying
portability and increasing the complexity of development and structure.



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Nicolai Waniek
On 02/01/2010 10:25 PM, Uriel wrote:
> If you define your personal identity based on the colors of your
> fucking window manager I feel sorry for your pathetic worthless life.

This is not the first time that you confuse cause and result. Additionally, you
seem to not have a fucking clue about people's different perception of the
world around them (e.g. emotional/biophysical influence on colors) or the most
mundane knowledge of psychophysics in general.

You should definitely stop talking about this topic if you don't want to
ridicule yourself anymore.



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Rob
> If you define your personal identity based on the colors of your
> fucking window manager I feel sorry for your pathetic worthless life.
>
> uriel

Please keep these unconstrained insults coming, I laughed heartily at
the quoted.



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Mate Nagy
> Out of curiosity: what were the other reasons and what did you settle on
> instead (if anything)?
 Windows (and the iPhone)

M.



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread jonathan . slark
> You idiots keep missing the point: if you need to change the colors to
> improve your productivity then either the original colors were totally
> broken and the developer that picked them should get a clue and fix
> them, or your brain is broken, and you should stop using computers if
> you can't deal with sane colors.
> 
> One has to wonder by what miracle of god people managed to work for
> centuries without being able to change the color of their pens and
> papers!

An example: I find apps with a white background a problem.  A white background 
shines light from your monitor into your eyes; it's a bit like trying to 
read a book outside in direct sunshine -- the book is too bright.  Most people 
seem to prefer a white background as it's natural, but I appreciate an option 
to make it black.

BTW do you talk like this to people you meet on a day-to-day basis?  You make 
some good points but your choice of language makes you lose a lot of gravitas.



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Antoni Grzymala
Uriel dixit (2010-02-01, 22:30):

> On Mon, Feb 1, 2010 at 2:06 PM, anonymous  wrote:
> >> Having said that, in case of rfork vice versa from FreeBSD.
> >
> > Yes, I am talking about FreeBSD. With configure you can make your
> > program portable between FreeBSD and Linux. Most probably other
> > systems won't implement clone/rfork their own way so program will be
> > portable between all systems with some kind of rfork implementation.
> 
> This is bullshit, one of the reasons I gave up using FreeBSD long ago
> is because so much crap software that used auto*hell would blow up
> when trying to build it on FreeBSD, and trying to fix up auto*hell so
> the damned thing would build was a fucking nightmare.

Out of curiosity: what were the other reasons and what did you settle on
instead (if anything)?

-- 
[a]



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Charlie Kester

On Mon 01 Feb 2010 at 13:30:00 PST Uriel wrote:
> On Mon, Feb 1, 2010 at 2:06 PM, anonymous  wrote:
>>> Having said that, in case of rfork vice versa from FreeBSD.
>>
>> Yes, I am talking about FreeBSD. With configure you can make your
>> program portable between FreeBSD and Linux. Most probably other
>> systems won't implement clone/rfork their own way so program will be
>> portable between all systems with some kind of rfork implementation.
>
> This is bullshit, one of the reasons I gave up using FreeBSD long ago
> is because so much crap software that used auto*hell would blow up
> when trying to build it on FreeBSD, and trying to fix up auto*hell so
> the damned thing would build was a fucking nightmare.

Perhaps that was a problem "long ago" but it doesn't seem to be a
problem *now*.  I've been using FreeBSD since release 7.0 and have never
had a problem with configure.



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Uriel
On Mon, Feb 1, 2010 at 3:46 PM, Anselm R Garbe  wrote:
> On 1 February 2010 13:30,   wrote:
>> experts rule: Actually they don't want!  Ever seen a suckless car, or
>> mobile phone?
>
> There was the DDR Trabant, which I consider quite close to a suckless
> car: http://en.wikipedia.org/wiki/Trabant
>
> As for a mobile phone I'd say that the iphone is quite suckless in
> some respects. Afaik its UI is not very customisable and it runs only
> 1 app at a time which is a nice restriction and eliminates a whole
> bunch of problems (well and has positive side-effect on power
> consumption as well).

I hate Apple and the iPhone with a passion, but this is one thing they
got completely right: all the morons that claim they need to waste
their life fiddling around with stupid colors and shit have been
proven wrong by the millions of people that use iPhones without any
need for such crap.

uriel

> Cheers,
> Anselm
>
>



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Uriel
On Mon, Feb 1, 2010 at 3:26 PM, markus schnalke  wrote:
> [2010-02-01 13:06] Anselm R Garbe 
>> On 1 February 2010 12:52,   wrote:
>> >
>> > This is my PC and I decide what colours are used.
>>
>> To be fair Uriel isn't completely wrong. In an ideal world everyone
>> would just use the software as is and not waste time on fiddling
>> around with colors and such. But obviously a lot of people like
>> customizing things/making them different to the default. I'm not sure
>> what the reason is [...]
>
> Reasons to change the colors are that this may improve your
> productivity or comfort.

You idiots keep missing the point: if you need to change the colors to
improve your productivity then either the original colors were totally
broken and the developer that picked them should get a clue and fix
them, or your brain is broken, and you should stop using computers if
you can't deal with sane colors.

One has to wonder by what miracle of god people managed to work for
centuries without being able to change the color of their pens and
papers!

uriel

> The point is that colors are nearly completely unrelated to the
> functionality of the program. They are only cosmetic, and thus
> everything related to them should not add complexity in any way.
>
> I'd adjust the colors on my computer though, but by editing the code
> directly.
>
> But tagging rules are an example of customization of dwm that does
> not directly change its functions, but how it operates in the
> specific environment. This is similar to mailcap, termcap and the
> like.
>
> In my eyes, this is where ``configuration'' is important. (In contrast
> to colors, which are only cosmetic, and layouting algorithms which are
> basic functionality and thus should be changed in the main source
> directly, if at all.)
>
>
> meillo
>
>



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Uriel
On Mon, Feb 1, 2010 at 2:20 PM, pancake  wrote:
> anonymous wrote:
>>>
>>> Having said that, in case of rfork vice versa from FreeBSD.
>>>
>>
>> Yes, I am talking about FreeBSD. With configure you can make your
>> program portable between FreeBSD and Linux. Most probably other
>> systems won't implement clone/rfork their own way so program will be
>> portable between all systems with some kind of rfork implementation.
>>
>
> in that specific case i would prefer to use __FreeBSD__ ifdef instead of a
> configure stage.

This is totally and completely RETARDED. #ifdefs are a disgrace and
people that use them should be shot on sight.

uriel

> But for -lpthread and -pthread ..i will probably use a configure stage.
> because its something freebsd-specific for linking.
>
> --pancake
>
>



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Uriel
On Mon, Feb 1, 2010 at 2:06 PM, anonymous  wrote:
>> Having said that, in case of rfork vice versa from FreeBSD.
>
> Yes, I am talking about FreeBSD. With configure you can make your
> program portable between FreeBSD and Linux. Most probably other
> systems won't implement clone/rfork their own way so program will be
> portable between all systems with some kind of rfork implementation.

This is bullshit, one of the reasons I gave up using FreeBSD long ago
is because so much crap software that used auto*hell would blow up
when trying to build it on FreeBSD, and trying to fix up auto*hell so
the damned thing would build was a fucking nightmare.

On the other hand programs without auto*hell that were in the ports
tree 'just worked' (while the ones that used auto*hell and were in the
ports tree just wasted my fucking time running 'tests' for shit that
the packager already knew).

uriel



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Uriel
On Mon, Feb 1, 2010 at 12:16 PM, Nicolai Waniek  wrote:
> On 02/01/2010 12:02 PM, Uriel wrote:
>> People are retards that should get a life, and developers that can't
>> pick bearable colors should not pick colors (just ask for advice from
>> an artists as Rob did for acme and rio).
>
> Desktop Look&Feel Communism up ahead.
> Yours is the most retarded and human-diversification-ignoring comment
> I read in a long while now on this mailing list.

If you define your personal identity based on the colors of your
fucking window manager I feel sorry for your pathetic worthless life.

uriel



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread hiro
Hah, a trabbie sucks less?! That's pure idiocy!



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Antoni Grzymala
Anselm R Garbe dixit (2010-02-01, 15:58):

> On 1 February 2010 15:49,   wrote:
> > * Anselm R Garbe  [2010-02-01 15:48]:
> >> On 1 February 2010 13:30,   wrote:
> >> > experts rule: Actually they don't want!  Ever seen a suckless car, or
> >> > mobile phone?
> >>
> >> There was the DDR Trabant, which I consider quite close to a suckless
> >> car: http://en.wikipedia.org/wiki/Trabant
> >
> > Well, Trabi is close to suckless, I agree. I still enjoy the simplicity 
> > when I
> > have a ride with an owner of an old one occasionally. But it is not safe,
> > for instance. Safety, in turn, is generally important, but not that much an
> > issue for the everyday home-work-back trip in a large city.
> 
> Safety is relative with a car like the Trabant. Back in GDR times it
> was rather safe since its maximum speed was around 120 km/h and
> roads weren't as crowded as today, hence car accidents were a rare
> occurrence. Driving a Trabant today is surely a safety risk, but so is
> driving an original Mini Cooper or some other classic car.

That's major bullshit. Please...

-- 
[a]



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Anselm R Garbe
On 1 February 2010 15:49,   wrote:
> * Anselm R Garbe  [2010-02-01 15:48]:
>> On 1 February 2010 13:30,   wrote:
>> > experts rule: Actually they don't want!  Ever seen a suckless car, or
>> > mobile phone?
>>
>> There was the DDR Trabant, which I consider quite close to a suckless
>> car: http://en.wikipedia.org/wiki/Trabant
>
> Well, Trabi is close to suckless, I agree. I still enjoy the simplicity when I
> have a ride with an owner of an old one occasionally. But it is not safe,
> for instance. Safety, in turn, is generally important, but not that much an
> issue for the everyday home-work-back trip in a large city.

Safety is relative with a car like the Trabant. Back in GDR times it
was rather safe since its maximum speed was around 120 km/h and
roads weren't as crowded as today, hence car accidents were a rare
occurrence. Driving a Trabant today is surely a safety risk, but so is
driving an original Mini Cooper or some other classic car.

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread stanio
* Anselm R Garbe  [2010-02-01 15:48]:
> On 1 February 2010 13:30,   wrote:
> > experts rule: Actually they don't want!  Ever seen a suckless car, or
> > mobile phone?
> 
> There was the DDR Trabant, which I consider quite close to a suckless
> car: http://en.wikipedia.org/wiki/Trabant

Well, Trabi is close to suckless, I agree. I still enjoy the simplicity when I
have a ride with an owner of an old one occasionally. But it is not safe,
for instance. Safety, in turn, is generally important, but not that much an
issue for the everyday home-work-back trip in a large city.

> As for a mobile phone I'd say that the iphone is quite suckless in
> some respects. Afaik its UI is not very customisable and it runs only
> one app at a time, which is a nice restriction that eliminates a whole
> bunch of problems (and has a positive side effect on power
> consumption as well).

In the cited respects, maybe. But a mobile phone with integrated camera,
touch screen, 'apps' for learning languages, etc. is as much suckless as an
axe with a door bell, toilet paper and nuclear power generator. 

-- 
 stanio_



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Anselm R Garbe
A remark about the Trabant, quote from the Wikipedia article

"[...] as even refueling the car required lifting the hood, filling
the tank with gasoline (only 24 litres[1]), then adding two-stroke oil
and shaking it back and forth to mix."

This isn't correct. In the GDR each petrol station had petrol that was
prepared for the Trabant; there was no such thing as mixing in the
two-stroke oil and shaking, that is absolute nonsense. (It is kind of
true if you have such a car nowadays, since petrol stations stopped
selling two-stroke fuel.)

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Anselm R Garbe
On 1 February 2010 13:30,   wrote:
> experts rule: Actually they don't want!  Ever seen a suckless car, or
> mobile phone?

There was the DDR Trabant, which I consider quite close to a suckless
car: http://en.wikipedia.org/wiki/Trabant

As for a mobile phone I'd say that the iphone is quite suckless in
some respects. Afaik its UI is not very customisable and it runs only
one app at a time, which is a nice restriction that eliminates a whole
bunch of problems (and has a positive side effect on power
consumption as well).

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread markus schnalke
[2010-02-01 13:06] Anselm R Garbe 
> On 1 February 2010 12:52,   wrote:
> >
> > This is my PC and I decide what colours are used.
> 
> To be fair Uriel isn't completely wrong. In an ideal world everyone
> would just use the software as is and not waste time on fiddling
> around with colors and such. But obviously a lot of people like
> customizing things/making them different to the default. I'm not sure
> what the reason is [...]

A reason to change the colors is that doing so may improve your
productivity or comfort.


The point is that colors are nearly completely unrelated to the
functionality of the program. They are only cosmetic, and thus
everything related to them should not add complexity in any way.

I'd adjust the colors on my computer though, but by editing the code
directly.


But tagging rules are an example of customization of dwm that does
not directly change its functions, but how it operates in the
specific environment. This is similar to mailcap, termcap and the
like.

In my eyes, this is where ``configuration'' is important. (In contrast
to colors, which are only cosmetic, and layout algorithms, which are
basic functionality and thus should be changed in the main source
directly, if at all.)


meillo



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Claudio M. Alessi
On Mon, Feb 01, 2010 at 01:06:09PM +, Anselm R Garbe wrote:
> To be fair Uriel isn't completely wrong. In an ideal world everyone
> would just use the software as is and not waste time on fiddling
> around with colors and such. But obviously a lot of people like
> customizing things/making them different to the default. I'm not sure
> what the reason is, but people fiddle around with their cars, buy bigger
> wheels, bigger exhausts, make the windows black etc. Same with
> software and desktop setups.
Yes, he is completely wrong. Using software as is means using software
with the developer's tastes, not mine. That's a totally retarded concept
(almost like Uriel's previous statement). It's not only a matter of
colors, but of more important things like fonts, key bindings, and so
forth. That's where config.h wins against initrc.local, with which you
also can't configure the Plan 9 base (well, you can, but it's useless)
without having to change the shebang of werc.bin.

> So the ideal world doesn't exist; however, we think the fewer options
> there are, the better. The best tools are those that have no options,
> like nearly no one changes the look of the vacuum cleaner once bought,
> or of your microwave, or of your ironing board, or nearly no one
> repaints the case of a TV.
The ideal world does exist: we have suckless software and config.h. Well, we
also have werc.
The best tools are those which are better, period. Fewer options is not
better in itself; rather, the need for fewer options converges with better
software. It's different.
In the real world, you don't gain any real advantage by changing how the
vacuum cleaner looks, while you obviously will find much more comfortable
a font (or color) which fits your eyes well; especially if you spend much of
your time in front of a monitor. This will make you less distracted, much
more efficient and productive, and also much happier. That's our real
perfect world.

> The point is people would be able to do much more useful stuff if they
> didn't spend their time fiddling around with things that are not
> mandatory == eg not customising cars but coming up with some great
> philosophy instead ;)
Philosophy is nothing if not applied to the real world. If it doesn't fit
the real world, then it is a bad philosophy (or, as in this case, a
retarded one).


Regards,
Claudio M. Alessi

-- 
JID: smoppy AT gmail.com
WWW: http://cma.teroristi.org



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread twfb
On 11:15 Mon 01 Feb, Anselm R Garbe wrote:
> Well if you ask artists they will come up with gradients, translucency
> and other bullshit. I think the default color scheme in dwm is great.

Then you are asking the wrong "artists".

> >> I know you will say there shouldn't be any options, but even werc has 
> >> options ;)

Dwm stands out from the other suckless projects in that it's a great
piece of software with adequate settings as is, even when installed from
a binary. Dwm doesn't really have any options if you look at it from this
angle. Perhaps the default settings could be improved to make dwm even
more usable/perfect straight out of the box.

I agree with Uriel: optionless is an improvement on suckless, a small
shift in attitude that could improve the software.

-- 
TWFB  -  PGP: D7A420B3



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread stanio
* Anselm R Garbe  [2010-02-01 14:09]:
> On 1 February 2010 12:52,   wrote:
> >> People are retards that should get a life,
> >> [...]
> >> shortcuts are part of the UI which should be sane and consistent.
> >
> > This is my PC and I decide what colours are used.
> like nearly no one changes the look of the vacuum cleaner once bought,
> or of your microwave, or of your ironing board, or nearly no one
> repaints the case of a TV.

When I am looking at a terminal most of the day, it does matter whether its
colors are gentle to my eyes, or whether they get tired after 2 hours. And
if none of the standard configurations is, I appreciate being able to set it
to what I feel is gentle.

Even when buying a vacuum cleaner you have preferences. You have to go for a
specific model rather than configuring one yourself, which is a bit like
precompiled stuff you'll never be able to make clean && make the way you
like.

In software you have all the freedom to make things modular and still
maintainable and sane, that's the difference. 

In hardware you don't. Especially in 'hype' segments, where marketing
experts rule: they actually don't want it! Ever seen a suckless car or
mobile phone?


-- 
 stanio_



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread pancake

anonymous wrote:
>> Having said that, in case of rfork vice versa from FreeBSD.
>
> Yes, I am talking about FreeBSD. With configure you can make your
> program portable between FreeBSD and Linux. Most probably other
> systems won't implement clone/rfork their own way, so the program will
> be portable between all systems with some kind of rfork implementation.

In that specific case I would prefer to use a __FreeBSD__ ifdef instead
of a configure stage.

But for -lpthread vs. -pthread I will probably use a configure stage,
because it's something FreeBSD-specific for linking.


--pancake



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Anselm R Garbe
On 1 February 2010 13:06, anonymous  wrote:
>> Having said that, in case of rfork vice versa from FreeBSD.
>
> Yes, I am talking about FreeBSD. With configure you can make your
> program portable between FreeBSD and Linux. Most probably other
> systems won't implement clone/rfork their own way so program will be
> portable between all systems with some kind of rfork implementation.

Well in such a case you might want to provide an abstraction, say
pclone(), that is implemented as rfork on FreeBSD and clone on Linux.
Then you provide two Makefiles or config.mk's for inclusion: one
building pclone() using rfork, used on FreeBSD, and the other one
building it using clone() on Linux.

I can't see why you'd want configure or something similar for such
kind of stuff.
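The layout Anselm sketches might look like this (file and variable names are hypothetical; suckless-style config.mk inclusion is assumed):

```mk
# config.mk (FreeBSD variant) -- pick the rfork-based implementation:
#   PCLONESRC = pclone_rfork.c
# config.mk (Linux variant) -- pick the clone-based implementation:
PCLONESRC = pclone_clone.c

# Makefile -- the core never mentions rfork or clone, only pclone()
include config.mk

SRC = main.c ${PCLONESRC}
OBJ = ${SRC:.c=.o}

prog: ${OBJ}
	${CC} -o $@ ${OBJ} ${LDFLAGS}
```

Here a single pclone.h would declare the interface and each pclone_*.c implement it; no configure stage is needed because the user includes the config.mk matching their platform.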

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread anonymous
> Having said that, in case of rfork vice versa from FreeBSD.

Yes, I am talking about FreeBSD. With configure you can make your
program portable between FreeBSD and Linux. Most probably other
systems won't implement clone/rfork their own way, so the program will be
portable between all systems with some kind of rfork implementation.




Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Anselm R Garbe
On 1 February 2010 12:52,   wrote:
>> People are retards that should get a life, and developers that can't
>> pick bearable colors should not pick colors (just ask for advice from
>> an artists as Rob did for acme and rio). Layout algorithms are more an
>> intrinsic part of the application and should not be considered 'an
>> option' (and configuring them via a .h file is plain idiotic),
>> shortcuts are part of the UI which should be sane and consistent.
>
> This is my PC and I decide what colours are used.

To be fair Uriel isn't completely wrong. In an ideal world everyone
would just use the software as is and not waste time on fiddling
around with colors and such. But obviously a lot of people like
customizing things/making them different to the default. I'm not sure
what the reason is, but people fiddle around with their cars, buy bigger
wheels, bigger exhausts, make the windows black etc. Same with
software and desktop setups.

So the ideal world doesn't exist; however, we think the fewer options
there are, the better. The best tools are those that have no options:
nearly no one changes the look of the vacuum cleaner once bought,
or of your microwave, or of your ironing board, and nearly no one
repaints the case of a TV.

The point is people would be able to do much more useful stuff if they
didn't spend their time fiddling around with things that are not
mandatory == eg not customising cars but coming up with some great
philosophy instead ;)

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread jonathan . slark
> People are retards that should get a life, and developers that can't
> pick bearable colors should not pick colors (just ask for advice from
> an artists as Rob did for acme and rio). Layout algorithms are more an
> intrinsic part of the application and should not be considered 'an
> option' (and configuring them via a .h file is plain idiotic),
> shortcuts are part of the UI which should be sane and consistent.

This is my PC and I decide what colours are used.

Jon.



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Anselm R Garbe
On 1 February 2010 11:45, pancake  wrote:
> Anselm R Garbe wrote:

 People have different taste regarding the colors, fonts, layout
 algorithms, shortcuts etc.

>>>
>>> People are retards that should get a life, and developers that can't
>>> pick bearable colors should not pick colors (just ask for advice from
>>> an artists as Rob did for acme and rio). Layout algorithms are more an
>>> intrinsic part of the application and should not be considered 'an
>>> option' (and configuring them via a .h file is plain idiotic),
>>> shortcuts are part of the UI which should be sane and consistent.
>>>
>>
>> Well if you ask artists they will come up with gradients, translucency
>> and other bullshit. I think the default color scheme in dwm is great.
>>
>
> I hate it ;) The radioactive blue and white burns my eyes. I use black,
> gray and orange.

Well it is very similar to the color scheme of Norton Commander,
Windows 3.x, Windows 95 and Windows XP with the classic UI. I think
that blue/grey/white is the most widespread default color scheme. And
following the acme approach == asking artists will always result in
big arguments. I for example hate the acme or Plan 9 color scheme, it
looks so 1980ish, and to some extent I doubt that Rob or anyone else
asked a well-recognised UI designer for the color scheme. Obviously I
didn't ask UI designers either, but I followed the MS scheme, which did
some research in this area, at least for Windows 95 for sure.

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread pancake

Anselm R Garbe wrote:
>>> People have different taste regarding the colors, fonts, layout
>>> algorithms, shortcuts etc.
>>
>> People are retards that should get a life, and developers that can't
>> pick bearable colors should not pick colors (just ask for advice from
>> an artists as Rob did for acme and rio). Layout algorithms are more an
>> intrinsic part of the application and should not be considered 'an
>> option' (and configuring them via a .h file is plain idiotic),
>> shortcuts are part of the UI which should be sane and consistent.
>
> Well if you ask artists they will come up with gradients, translucency
> and other bullshit. I think the default color scheme in dwm is great.

I hate it ;) The radioactive blue and white burns my eyes. I use black,
gray and orange.

There are no standards about colour tastes.

> Well I disagree, there is no real difference between werc's
> initrc[.local] and dwm's config.h.

The difference is that one is at compile time and werc cannot be compiled.

So yeah, no difference at all.

--pancake



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Nicolai Waniek
On 02/01/2010 12:02 PM, Uriel wrote:
> People are retards that should get a life, and developers that can't
> pick bearable colors should not pick colors (just ask for advice from
> an artists as Rob did for acme and rio). 

Desktop Look&Feel Communism up ahead.
Yours is the most retarded and human-diversification-ignoring comment I read in
a long while now on this mailing list.



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Anselm R Garbe
On 1 February 2010 11:02, Uriel  wrote:
> On Mon, Feb 1, 2010 at 8:18 AM, Anselm R Garbe  wrote:
>> I agree to all you said, except:
>>
>> On 31 January 2010 22:00, Uriel  wrote:
>>> No, it is not OK, the gratuitous fiddling with the .h files is one of
>>> the most retarded things about dwm.
>>
>> If you know a better way, please let me know. The idea behind config.h
>> is to provide a mechanism where people can customize and extend dwm
>> without hacking into core dwm.c.
>
> Like with auto*hell, the idea is retarded, so the implementation can't not 
> suck.

It's not like configure, it simply eases source modifications/patching.

>> People have different taste regarding the colors, fonts, layout algorithms, 
>> shortcuts etc.
>
> People are retards that should get a life, and developers that can't
> pick bearable colors should not pick colors (just ask for advice from
> an artists as Rob did for acme and rio). Layout algorithms are more an
> intrinsic part of the application and should not be considered 'an
> option' (and configuring them via a .h file is plain idiotic),
> shortcuts are part of the UI which should be sane and consistent.

Well if you ask artists they will come up with gradients, translucency
and other bullshit. I think the default color scheme in dwm is great.

>> I know you will say there shouldn't be any options, but even werc has 
>> options ;)
>
> Werc has few (if any) options that are not intrinsically linked to
> *functionality*. Whether a page is a wiki or a blog is not an 'option';
> it is simply a different piece of functionality in the same app, and
> things like page titles are also an intrinsic part of the application
> (just as an app name is not an 'option' in a window manager but an
> intrinsic part of its functionality).

Well I disagree, there is no real difference between werc's
initrc[.local] and dwm's config.h.

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Uriel
On Mon, Feb 1, 2010 at 8:18 AM, Anselm R Garbe  wrote:
> I agree to all you said, except:
>
> On 31 January 2010 22:00, Uriel  wrote:
>> No, it is not OK, the gratuitous fiddling with the .h files is one of
>> the most retarded things about dwm.
>
> If you know a better way, please let me know. The idea behind config.h
> is to provide a mechanism where people can customize and extend dwm
> without hacking into core dwm.c.

Like with auto*hell, the idea is retarded, so the implementation can't not suck.

> People have different taste regarding the colors, fonts, layout algorithms, 
> shortcuts etc.

People are retards that should get a life, and developers that can't
pick bearable colors should not pick colors (just ask for advice from
an artists as Rob did for acme and rio). Layout algorithms are more an
intrinsic part of the application and should not be considered 'an
option' (and configuring them via a .h file is plain idiotic),
shortcuts are part of the UI which should be sane and consistent.

> I know you will say there shouldn't be any options, but even werc has options 
> ;)

Werc has few (if any) options that are not intrinsically linked to
*functionality*. Whether a page is a wiki or a blog is not an 'option';
it is simply a different piece of functionality in the same app, and
things like page titles are also an intrinsic part of the application
(just as an app name is not an 'option' in a window manager but an
intrinsic part of its functionality).

uriel

> Cheers,
> Anselm
>
>



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Dmitry Maluka
On Mon, Feb 01, 2010 at 09:48:37AM +, Anselm R Garbe wrote:
> IMHO such a package manager is not needed; all we need are static
> executables of each tool, which is what I try to achieve with static
> linux. The only exceptions are config files for daemons and tools;
> however, this is all achievable using git or rsync for upgrading.
> 
> So there is really no need for a package management system ;)

I was talking about a more general problem, though it's just a concept or
maybe a dream. :) In this world, developers distribute their software in
native source tarballs containing human- but not machine-readable
instructions for building, installation, uninstallation etc. Many people
prefer manual searching, downloading and installation just because they
want to deal with native software packages provided by software
developers, not intermediate maintainers. Why not automate our actions,
retaining flexibility and transparency? No package repositories, just
package metadata servers with package URLs; developers registering their
(native) packages at those servers; servers synchronizing metadata
between each other; a defined metadata format allowing automated
package management. People may still do all the stuff manually, though.

And, what was this thread about? Annoying inconsistency of system
interfaces (due to bloated and at the same time incomplete standards)
leaves us without any guarantee that something will work somewhere. Aren't
we sick of that? In a good world, software developers would follow a
systems approach when making their software usable world-wide. They would
rely upon simple well-defined interfaces. Interfaces are to be
registered at metadata servers too. Packages are tarballs to be
downloaded and untarred, followed by some actions defined by the package
metadata (usually make install, make uninstall etc.) after cd'ing into the
untarred directory. (That's one of the possible ways.) If something doesn't
work, it means a bug in the implementation of some package, i.e. behavior
violating some of the documented interfaces. That bug can simply be
discovered and fixed. Nice dream, isn't it?



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Anselm R Garbe
On 1 February 2010 09:38, Dmitry Maluka  wrote:
> On Sun, Jan 31, 2010 at 11:00:58PM +0100, Uriel wrote:
>> There are retarded standards for all kinds of crap, too bad that there
>> are thousands of standards and nobody follows them anyway.
>>
>> It is simple, the system user knows much better where shit is than the
>> developer can dream knowing, if the developer tries to guess he will
>> invariably fuck it up and waste even more of the user's time.
>>
>> If you want pre-chewed software, use whatever packaging system your OS
>> provides and let packagers deal with this, expecting the original
>> software developers to do it is extremely naive.
>
> I think a lot about a concept of an OS-independent package manager intended
> not just to automate software installation but to make software
> development and distribution more consistent. (And to get rid of the extra
> layer of software maintenance for each OS.) In this hypothetical
> concept, a package is a unit of world-wide software distribution with some
> dependencies, but (sic!) the dependencies are not just other packages - they
> are _interfaces_ provided by other packages, by the base system or
> whatever. This is a simple and evident idea, based on the assumption that
> any system relies upon well-defined interfaces provided by other
> systems. Those interfaces are documented by humans in systems
> documentation or in well-known standards. Unfortunately, this would work
> in an ideal world or at least a good one, not in this one. There are
> some good standards but they are a puny minority.

IMHO such a package manager is not needed; all we need are static
executables of each tool, which is what I try to achieve with static
linux. The only exceptions are config files for daemons and tools;
however, this is all achievable using git or rsync for upgrading.

So there is really no need for a package management system ;)

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-02-01 Thread Dmitry Maluka
On Sun, Jan 31, 2010 at 11:00:58PM +0100, Uriel wrote:
> There are retarded standards for all kinds of crap, too bad that there
> are thousands of standards and nobody follows them anyway.
>
> It is simple, the system user knows much better where shit is than the
> developer can dream knowing, if the developer tries to guess he will
> invariably fuck it up and waste even more of the user's time.
> 
> If you want pre-chewed software, use whatever packaging system your OS
> provides and let packagers deal with this, expecting the original
> software developers to do it is extremely naive.

I think a lot about a concept of an OS-independent package manager intended
not just to automate software installation but to make software
development and distribution more consistent. (And to get rid of the extra
layer of software maintenance for each OS.) In this hypothetical
concept, a package is a unit of world-wide software distribution with some
dependencies, but (sic!) the dependencies are not just other packages - they
are _interfaces_ provided by other packages, by the base system or
whatever. This is a simple and evident idea, based on the assumption that
any system relies upon well-defined interfaces provided by other
systems. Those interfaces are documented by humans in systems
documentation or in well-known standards. Unfortunately, this would work
in an ideal world or at least a good one, not in this one. There are
some good standards but they are a puny minority.



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-31 Thread Anselm R Garbe
I agree to all you said, except:

On 31 January 2010 22:00, Uriel  wrote:
> No, it is not OK, the gratuitous fiddling with the .h files is one of
> the most retarded things about dwm.

If you know a better way, please let me know. The idea behind config.h
is to provide a mechanism where people can customize and extend dwm
without hacking into core dwm.c. People have different taste regarding
the colors, fonts, layout algorithms, shortcuts etc.

I know you will say there shouldn't be any options, but even werc has options ;)

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-31 Thread Anselm R Garbe
On 1 February 2010 07:06, Anselm R Garbe  wrote:
> On 31 January 2010 22:40, anonymous  wrote:
>>> If you need mkstemp you have two options: either you restrict
>>> your program to run only on platforms that include it, or you
>>> implement your own version. In either case there is *zero* advantage
>>> in finding out at build time whether your system includes it or not.
>>
>> What if there are different APIs on different systems? An example is
>> clone/rfork. With configure you can choose one of them at compile time.
>> And even if you want to, you can't implement your own version in userspace.
>
> clone/rfork is the typical case for a program that won't be portable
> from Linux to other Unix-like platforms. So if you use clone/rfork you
> have to live with the fact that this will only work on Linux. If you
> can't then write portable code instead and use fork and don't use
> clone.

Having said that, in the case of rfork it's vice versa: it comes from FreeBSD.

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-31 Thread Anselm R Garbe
On 31 January 2010 22:40, anonymous  wrote:
>> If you need mkstemp you have two options: either you restrict
>> your program to run only on platforms that include it, or you
>> implement your own version. In either case there is *zero* advantage
>> in finding out at build time whether your system includes it or not.
>
> What if there are different APIs on different systems? An example is
> clone/rfork. With configure you can choose one of them at compile time.
> And even if you want to, you can't implement your own version in userspace.

clone/rfork is the typical case of a program that won't be portable
from Linux to other Unix-like platforms. So if you use clone/rfork you
have to live with the fact that this will only work on Linux. If you
can't, then write portable code instead: use fork, and don't use
clone.

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-31 Thread anonymous
> If you need mkstemp you have two options: either you restrict
> your program to run only on platforms that include it, or you
> implement your own version. In either case there is *zero* advantage
> in finding out at build time whether your system includes it or not.

What if there are different APIs on different systems? An example is
clone/rfork. With configure you can choose one of them at compile time.
And even if you wanted to, you can't implement your own version in userspace.




Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-31 Thread Uriel
On Sun, Jan 31, 2010 at 10:16 PM, pancake  wrote:
>
>
> On Jan 30, 2010, at 3:58 PM, Uriel  wrote:
>
>>>
>>
>> Plan 9 solves this, the standard set of mkfiles you can use are
>> described here: http://doc.cat-v.org/plan_9/4th_edition/papers/mkfiles
>>
> Will take a look. Thanks
>>
>>> The configure stage is in many situations unnecessary, but it's just a way
>>> to do different actions to 'configure' the project. This is:
>>> - detect libraries, programs
>>> - check for endianness
>>
>> If your program needs to check for endianness, it is broken, period.
>>
>
> Really?

Yes, really, any competent programmer should know how to write
portable code that works fine no matter the endianness of the host.

>>> - check cpu
>>
>> If your program needs to check for cpu, it is probably broken, and if
>> not, it will break cross compiling and it should allow building
>> versions for each architecture no matter what the current environment
>> is, ideally as independent programs, which is how the Plan 9
>> compilers are actually implemented.
>
> Think of a cross-platform debugger/profiler. You need to know the CPU and the
> OS, otherwise it is not portable.

Wrong again, doubly wrong even: a cross-platform debugger/profiler has
even more reason not to care what environment it is built under,
especially if it is sane enough to allow transparent access over the
network to work on processes running on other machines that might run
a completely different architecture.

For example, see acid: http://doc.cat-v.org/plan_9/4th_edition/papers/acid

>>
>>> - check for include files
>>
>> This is hopeless, the only proper solution is to provide somewhere for
>> the user to manually define where to find include files and libraries,
>> otherwise your program will be unportable, unless magically it can
>> predict where any system ever created in the past and future will have
>> its headers, which is impossible and is why auto*hell ends up failing
>> miserably at finding shit.
>>
>
> There are standard ways to place and locate include files. It's not that
> catastrophic at all.

There are retarded standards for all kinds of crap, too bad that there
are thousands of standards and nobody follows them anyway.

It is simple: the system user knows much better where shit is than the
developer can ever dream of knowing; if the developer tries to guess he
will invariably fuck it up and waste even more of the user's time.

If you want pre-chewed software, use whatever packaging system your OS
provides and let packagers deal with this, expecting the original
software developers to do it is extremely naive.

At most the source might include a set of system-specific makefiles
that are known to work on certain given systems and let the user pick
which one to use or customize as needed.


> I hate programs failing at compile time. A simple check to cache paths for
> includes, files, etc. can make the build cleaner and faster.
>
>>> - check OS (target,host,..) useful for crosscompiling
>>
>> Why the fuck does one need a configure step for cross-compiling?
>> Actually not having one makes cross-compiling infinitely more sane and
>> convenient.
>
> Have you ever compiled a compiler and specified different host, target
> and build profiles? Tell me how you can set those options outside the
> configure stage.

Please, do not insult my intelligence and do not make me do all your
homework: http://man.cat-v.org/plan_9/1/2c

Just because you are used to terminally braindead software doesn't
mean that software needs to be that way.


> Note that this stage doesn't necessarily need to be a ./configure script.
>
> In dwm you also have a configuration stage, but it is not a shell script.
> It's just manual hacking of mk and .h files, which is also OK, but still a
> configuration stage.

No, it is not OK, the gratuitous fiddling with the .h files is one of
the most retarded things about dwm.

And I'm done with this discussion, I have provided more than enough
references to show software doesn't need to be built in totally
retarded ways that require totally retarded 'configure' steps.

uriel

>>
>>> - check for system-related libraries to fit certain features (-lsocket in
>>> Solaris, for example)
>>
>> If you depend on system-specific features your code is not portable,
>> period, and pretending otherwise is madness.
>
> Think of a profiler or a debugger, or just any piece of code that is not
> POSIX: you need to change the build stage to adapt to the system, that is,
> to compile other sources or change the include path.
>>
>> Also note that all kinds of build-time 'configuration' exponentially
>> increase the difficulty of testing and debugging, as your program
>> basically stops being a single program; instead you have one version
>> for each set of features that happens to be 'enabled'.
>>
> Suckless software promotes patches, which is OK for small programs. But for
> big ones you usually want to simplify the compilation by setting up the
> build with some options easily defined in a single line.
>
> Which 

Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-31 Thread pancake



On Jan 30, 2010, at 3:58 PM, Uriel  wrote:





Plan 9 solves this, the standard set of mkfiles you can use are
described here: http://doc.cat-v.org/plan_9/4th_edition/papers/mkfiles


Will take a look. Thanks


The configure stage is in many situations unnecessary, but it's just
a way to do different actions to 'configure' the project. This is:
- detect libraries, programs
- check for endianness


If your program needs to check for endianness, it is broken, period.



Really?


- check cpu


If your program needs to check for cpu, it is probably broken, and if
not, it will break cross compiling; it should allow building versions
for each architecture no matter what the current environment is,
ideally as independent programs, which is how the Plan 9 compilers
are actually implemented.


Think of a cross-platform debugger/profiler. You need to know the CPU
and the OS, otherwise it is not portable.





- check for include files


This is hopeless, the only proper solution is to provide somewhere for
the user to manually define where to find include files and libraries,
otherwise your program will be unportable, unless magically it can
predict where any system ever created in the past and future will have
its headers, which is impossible and is why auto*hell ends up failing
miserably at finding shit.



There are standard ways to place and locate include files. It's not
that catastrophic at all.


I hate programs failing at compile time. A simple check to cache paths
for includes, files, etc. can make the build cleaner and faster.



- check OS (target,host,..) useful for crosscompiling


Why the fuck does one need a configure step for cross-compiling?
Actually not having one makes cross-compiling infinitely more sane and
convenient.


Have you ever compiled a compiler and specified different host,
target and build profiles? Tell me how you can set those options
outside the configure stage.


Note that this stage doesn't necessarily need to be a ./configure script.

In dwm you also have a configuration stage, but it is not a
shell script. It's just manual hacking of mk and .h files, which is
also OK, but still a configuration stage.


- check for system-related libraries to fit certain features (-lsocket in
Solaris, for example)


If you depend on system-specific features your code is not portable,
period, and pretending otherwise is madness.


Think of a profiler or a debugger, or just any piece of code that is
not POSIX: you need to change the build stage to adapt to the system,
that is, to compile other sources or change the include path.


Also note that all kinds of build-time 'configuration' exponentially
increase the difficulty of testing and debugging, as your program
basically stops being a single program; instead you have one version
for each set of features that happens to be 'enabled'.

Suckless software promotes patches, which is OK for small programs.
But for big ones you usually want to simplify the compilation by
setting up the build with some options easily defined in a single line.


Which is easily integrable in many build systems. Probably more so than
replacing files or changing environment variables, which makes errors
and problems appear if trying to compile without the correct env.



Also this encourages #ifdef 'pseudo-portability', which is *always*
the *wrong* way to do it.

Yeah, but how do you drop ifdefs without a way to change the makefiles
to use other files depending on some options?


If you have to do all this job for every project you do.. you will
probably fail at some point. This is why it is good to have a
centralized project that generates the proper checks to get all this
information in a way that works on most systems and most architectures.

If you find a problem in a certain architecture you just need to fix
it in one point, not in all your projects.

The problem is that GNU configure sucks, is slow, is dirty, is
bloated, is full of weird hidden features, and a lot of collateral
issues can easily appear which are really hard to debug.


This is completely wrong: the problem with auto*hell is not the
implementation but the concept and idea itself; the implementation
sucks so much mostly as a consequence of how idiotic and retarded the
idea is.



I still think that a configuration stage is useful, but I think that
neither make, sed nor configure is the right way to do such a step.

But until somebody does it right we will keep discussing this like
trolls.




Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-30 Thread Anselm R Garbe
On 29 January 2010 10:32, pancake  wrote:
> The problem I see in makefiles is that they don't follow strict usage
> rules and there are no 'standards' on their usage. And this is pretty
> annoying. Because with the makefile approach you end up implementing
> everything from scratch; there's no set of .mk files to do it magically
> or anything else. And of course that magical approach will only cover
> the build, install and distribution stages.

Well I have some handy templates at hand when creating a new makefile.

> The configure stage is in many situations innecessary, but its just a way
> to do different actions to 'configure' the project. This is:
> - detect libraries, programs

Question is if you need to do this programmatically. The config.mk
approach lists all dependencies and required tools for building (or if
one likes, one can add them to a README file).

> - check for endianness
> - check cpu
> - check for include files
> - check OS (target,host,..) useful for crosscompiling
> - check for system-related libraries to fit certain features (-lsocket in
> Solaris, for example)

Same, config.mk settings imho.

> If you have to do all this job for every project you do.. you will probably
> fail at some point. This is why it is good to have a centralized project that
> generates the proper checks to get all this information in a way that
> works on most systems and most architectures.

The problem is that every system is different and those tools that try
to generalize nearly never get it right.
I'm not against using pkg-config for example in Makefiles, but what
else does one really need?

> If you find a problem in a certain architecture you just need to fix it
> in one point, not in all your projects.

What kind of problem are you describing here? I agree the config.mk
approach should be designed in a way that you have one central
config.mk for each target platform that you build for. But what else
do you need? Fixing a problem in such a central config.mk file is
easier than fixing some script that tries to generate makefiles.

> The problem is that GNU configure sucks, is slow, is dirty, is bloated, is
> full of weird hidden features, and a lot of collateral issues can easily
> appear which are really hard to debug.

The problem is not that GNU configure is slow, dirty, and bloated; the
problem is the idea of generating Makefiles using some script or tool
apart from cp or cat ;)

--Anselm

PS: Amen to Uriel.



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-30 Thread Uriel
On Fri, Jan 29, 2010 at 11:32 AM, pancake  wrote:
> Anselm R Garbe wrote:
>>
>> Well I've heard these reasons before and I don't buy them. There are
>> toolchains like the BSD ones and they prove pretty much that the
>> "everything is a Makefile" approach is the most portable and
>> sustainable one. Running a configure script from 10 years ago will
>> fail immediately.
>>
>>
>
> Don't mix the configure plus makefile approach. They are different stages.
>
> The makefiles work quite well, but it's not their task to detect or set up
> options. There are many approaches from the makefiles to do this.
>
> In suckless we use the config.mk approach, which is quite good to
> set up the options, but this requires no optional dependencies or
> compilation features (or at least not so many).
>
> The problem I see in makefiles is that they don't follow strict usage
> rules and there are no 'standards' on their usage. And this is pretty
> annoying. Because with the makefile approach you end up implementing
> everything from scratch; there's no set of .mk files to do it magically
> or anything else. And of course that magical approach will only cover
> the build, install and distribution stages.

Plan 9 solves this, the standard set of mkfiles you can use are
described here: http://doc.cat-v.org/plan_9/4th_edition/papers/mkfiles


> The configure stage is in many situations unnecessary, but it's just a way
> to do different actions to 'configure' the project. This is:
> - detect libraries, programs
> - check for endianness

If your program needs to check for endianness, it is broken, period.

> - check cpu

If your program needs to check for cpu, it is probably broken, and if
not, it will break cross compiling; it should allow building versions
for each architecture no matter what the current environment is,
ideally as independent programs, which is how the Plan 9 compilers
are actually implemented.

> - check for include files

This is hopeless, the only proper solution is to provide somewhere for
the user to manually define where to find include files and libraries,
otherwise your program will be unportable, unless magically it can
predict where any system ever created in the past and future will have
its headers, which is impossible and is why auto*hell ends up failing
miserably at finding shit.

> - check OS (target,host,..) useful for crosscompiling

Why the fuck does one need a configure step for cross-compiling?
Actually not having one makes cross-compiling infinitely more sane and
convenient.

> - check for system-related libraries to fit certain features (-lsocket in
> Solaris, for example)

If you depend on system-specific features your code is not portable,
period, and pretending otherwise is madness.

Also note that all kinds of build-time 'configuration' exponentially
increase the difficulty of testing and debugging, as your program
basically stops being a single program; instead you have one version
for each set of features that happens to be 'enabled'.

Also this encourages #ifdef 'pseudo-portability', which is *always*
the *wrong* way to do it.

> If you have to do all this job for every project you do.. you will probably
> fail at some point. This is why it is good to have a centralized project that
> generates the proper checks to get all this information in a way that
> works on most systems and most architectures.
>
> If you find a problem in a certain architecture you just need to fix it
> in one point, not in all your projects.
>
> The problem is that GNU configure sucks, is slow, is dirty, is bloated, is
> full of weird hidden features, and a lot of collateral issues can easily
> appear which are really hard to debug.

This is completely wrong: the problem with auto*hell is not the
implementation but the concept and idea itself; the implementation
sucks so much mostly as a consequence of how idiotic and retarded the
idea is.

uriel

>
> I wrote ACR for fun. I just wanted to see how hard it would be to replace
> such features in shell script (yeah, acr is written in shell script). So
> the code is unreadable, but it does its job. Something that started as an
> experiment now saves me many KBs in my repositories and sources, and
> saves a lot of time in the compilation process, because the checks are
> faster than in autoconf.
>
> The generated code is almost human readable and can be easily debugged,
> so that's why I use it.
>
> I know that there are better approaches for this, but there are no books,
> no standards, and no one promoting them to replace the currently used
> build systems.
>
> When building you have to think of different compilers, different
> architectures, different platforms, different library dependencies,
> different compilation and installation paths, ways to find the
> dependencies, etc. It's all a full mess actually.
>
> I would like to see a clean solution for all those problems by just using
> a single tool (mk?), but without having to maintain many files or having
> to type useless or repet

Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-30 Thread Uriel
On Wed, Jan 27, 2010 at 1:12 PM, anonymous  wrote:
> On Wed, Jan 27, 2010 at 07:07:49AM +, David Tweed wrote:
>> On Wed, Jan 27, 2010 at 6:25 AM, Uriel  wrote:
>> > Why the fucking hell should the fucking build tool know shit about the
>> > OS it is running on?!?!?!
>> >
>> > If you need to do OS guessing, that is a clear sign that you are doing
>> > things *wrong* 99% of the time.
>>
>> [In what follows by "OS" I mean kernel plus userspace libraries that
>> provide a higher level interface to the hardware than runs in the
>> kernel.]
>>
>> It would be great if "conceptual interfaces" that are a decade or more
>> old were universally standardised (so you don't have to worry about
>> whether mkstemp() is provided, etc) so that a lot of the configuration
>> processing could go away, and maybe that's the situation for most
>> "text and filesystem applications". But there are, and will be in
>> the future, new interfaces that haven't solidified into a common form
>> yet, e.g., webcam access, haptic input devices, accelerometers/GPS,
>> cloud computing APIs, etc., for which figuring out what is provided
>> will still be necessary in meta-build/configuration systems for years
>> to come for any software that will be widely distributed.
>
> I think Uriel means that if you need mkstemp(), you should check if
> mkstemp() is there (by trying to compile some code with mkstemp(), for
> example), not what OS it is. Same with other features.

*WRONG*, that is what auto*hell does, and it is totally *retarded*.

If you need mkstemp you have two options: either you restrict
your program to run only on platforms that include it, or you
implement your own version. In either case there is *zero* advantage
in finding out at build time whether your system includes it or not.

If you want to write portable software the only sane way is to stick
to the *subset* of functionality across all the platforms you want to
support, and hide any unavoidable differences under your own library
abstractions.

Also note how libraries tend to be more retarded and ineptly designed
the more 'modern' and less portable they are; smart people stick to
the classic standard Unix APIs, which are both portable and sane,
unlike the alternatives.

uriel



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-29 Thread pancake

Anselm R Garbe wrote:

Well I've heard these reasons before and I don't buy them. There are
toolchains like the BSD ones and they prove pretty much that the
"everything is a Makefile" approach is the most portable and
sustainable one. Running a configure script from 10 years ago will
fail immediately.


Don't mix the configure plus makefile approach. They are different stages.

The makefiles work quite well, but it's not their task to detect or set up
options. There are many approaches from the makefiles to do this.

In suckless we use the config.mk approach, which is quite good to
set up the options, but this requires no optional dependencies or
compilation features (or at least not so many).

The problem I see in makefiles is that they don't follow strict usage
rules and there are no 'standards' on their usage. And this is pretty
annoying. Because with the makefile approach you end up implementing
everything from scratch; there's no set of .mk files to do it magically
or anything else. And of course that magical approach will only cover
the build, install and distribution stages.

The configure stage is in many situations unnecessary, but it's just a way
to do different actions to 'configure' the project. This is:
- detect libraries, programs
- check for endianness
- check cpu
- check for include files
- check OS (target,host,..) useful for crosscompiling
- check for system-related libraries to fit certain features (-lsocket
in Solaris, for example)


If you have to do all this job for every project you do.. you will probably
fail at some point. This is why it is good to have a centralized project
that generates the proper checks to get all this information in a way that
works on most systems and most architectures.

If you find a problem in a certain architecture you just need to fix it
in one point, not in all your projects.

The problem is that GNU configure sucks, is slow, is dirty, is bloated, is
full of weird hidden features, and a lot of collateral issues can easily
appear which are really hard to debug.

I wrote ACR for fun. I just wanted to see how hard it would be to replace
such features in shell script (yeah, acr is written in shell script). So
the code is unreadable, but it does its job. Something that started as an
experiment now saves me many KBs in my repositories and sources, and
saves a lot of time in the compilation process, because the checks are
faster than in autoconf.

The generated code is almost human readable and can be easily debugged,
so that's why I use it.

I know that there are better approaches for this, but there are no books,
no standards, and no one promoting them to replace the currently used
build systems.

When building you have to think of different compilers, different
architectures, different platforms, different library dependencies,
different compilation and installation paths, ways to find the
dependencies, etc. It's all a full mess actually.


I would like to see a clean solution for all those problems by just
using a single tool (mk?), but without having to maintain many files or
having to type useless or repetitive things along the projects.

I know that your problem vector is different, but I think reinventing
square wheels like autoconf again is not helping us any further. And I
really believe that sticking to mk or make files in large projects
saves you a lot of headaches in the long term (think years ahead, like
10 years or so).
  
Well, ACR was reinvented some years ago; I just do a few commits lately.
I'm happy with it, because it makes the users and packagers feel
comfortable while compiling the software, because it follows a 'standard'
format. That saves a lot of time for me in fixing issues in their build
environments (debian, ..) and for them, without having to mess with the
deep GNU shit.

--pancake



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-27 Thread Charlie Kester

On Wed 27 Jan 2010 at 06:48:22 PST Noah Birnel wrote:

On Wed, Jan 27, 2010 at 07:43:22AM +, Anselm R Garbe wrote:

In my observation one should stick to one platform, which is nowadays
Linux+common libraries (most of the time) when packaging some source
code. In >90% of all cases it will work fine, because the other 95% of
users use Linux as well and the  5% remainder either uses some BSD
where the likelihood is high that it'll just compile as well and some
<<1% users will use some exotic platform where we shouldn't bother at
all if it'll work or not.


Those are amazing percentages.


Yes, and unless I'm mistaken, they're purely anecdotal.  ;-)





Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-27 Thread Noah Birnel
On Wed, Jan 27, 2010 at 07:43:22AM +, Anselm R Garbe wrote:
> In my observation one should stick to one platform, which is nowadays
> Linux+common libraries (most of the time) when packaging some source
> code. In >90% of all cases it will work fine, because the other 95% of
> users use Linux as well and the  5% remainder either uses some BSD
> where the likelihood is high that it'll just compile as well and some
> <<1% users will use some exotic platform where we shouldn't bother at
> all if it'll work or not.

Those are amazing percentages.

Cheers,

Noah



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-27 Thread anonymous
On Wed, Jan 27, 2010 at 07:07:49AM +, David Tweed wrote:
> On Wed, Jan 27, 2010 at 6:25 AM, Uriel  wrote:
> > Why the fucking hell should the fucking build tool know shit about the
> > OS it is running on?!?!?!
> >
> > If you need to do OS guessing, that is a clear sign that you are doing
> > things *wrong* 99% of the time.
> 
> [In what follows by "OS" I mean kernel plus userspace libraries that
> provide a higher level interface to the hardware than runs in the
> kernel.]
> 
> It would be great if "conceptual interfaces" that are a decade or more
> old were universally standardised (so you don't have to worry about
> whether mkstemp() is provided, etc) so that a lot of the configuration
> processing could go away, and maybe that's the situation for most
> "text and filesystem applications". But there are, and will be in
> the future, new interfaces that haven't solidified into a common form
> yet, e.g., webcam access, haptic input devices, accelerometers/GPS,
> cloud computing APIs, etc., for which figuring out what is provided
> will still be necessary in meta-build/configuration systems for years
> to come for any software that will be widely distributed.

I think Uriel means that if you need mkstemp(), you should check if
mkstemp() is there (by trying to compile some code with mkstemp(), for
example), not what OS it is. Same with other features.




Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-26 Thread Anselm R Garbe
2010/1/27 David Tweed :
> On Wed, Jan 27, 2010 at 6:25 AM, Uriel  wrote:
>> Why the fucking hell should the fucking build tool know shit about the
>> OS it is running on?!?!?!
>>
>> If you need to do OS guessing, that is a clear sign that you are doing
>> things *wrong* 99% of the time.
>
> [In what follows by "OS" I mean kernel plus userspace libraries that
> provide a higher level interface to the hardware than runs in the
> kernel.]
>
> It would be great if "conceptual interfaces" that are a decade or more
> old were universally standardised (so you don't have to worry about
> whether mkstemp() is provided, etc) so that a lot of the configuration
> processing could go away, and maybe that's the situation for most
> "text and filesystem applications". But there are, and will be in
> the future, new interfaces that haven't solidified into a common form
> yet, e.g., webcam access, haptic input devices, accelerometers/GPS,
> cloud computing APIs, etc., for which figuring out what is provided
> will still be necessary in meta-build/configuration systems for years
> to come for any software that will be widely distributed.

In my observation one should stick to one platform, which is nowadays
Linux+common libraries (most of the time) when packaging some source
code. In >90% of all cases it will work fine, because the other 95% of
users use Linux as well and the  5% remainder either uses some BSD
where the likelihood is high that it'll just compile as well and some
<<1% users will use some exotic platform where we shouldn't bother at
all if it'll work or not.

And if there is a problem, some user will report it and one can think
of a change case by case. I applied this to most of the projects I'm
working on (also commercial ones) and it works fine. I don't want to
count the time I saved in not running configure ;)

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-26 Thread David Tweed
On Wed, Jan 27, 2010 at 6:25 AM, Uriel  wrote:
> Why the fucking hell should the fucking build tool know shit about the
> OS it is running on?!?!?!
>
> If you need to do OS guessing, that is a clear sign that you are doing
> things *wrong* 99% of the time.

[In what follows by "OS" I mean kernel plus userspace libraries that
provide a higher level interface to the hardware than runs in the
kernel.]

It would be great if "conceptual interfaces" that are a decade or more
old were universally standardised (so you don't have to worry about
whether mkstemp() is provided, etc) so that a lot of the configuration
processing could go away, and maybe that's the situation for most
"text and filesystem applications". But there are, and will be in
the future, new interfaces that haven't solidified into a common form
yet, e.g., webcam access, haptic input devices, accelerometers/GPS,
cloud computing APIs, etc., for which figuring out what is provided
will still be necessary in meta-build/configuration systems for years
to come for any software that will be widely distributed.

-- 
cheers, dave tweed__
computer vision researcher: david.tw...@gmail.com
"while having code so boring anyone can maintain it, use Python." --
attempted insult seen on slashdot



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-26 Thread Uriel
On Tue, Jan 26, 2010 at 8:10 AM, Daniel Bainton  wrote:
> 2010/1/25 pancake :
>> I have been using make(1) and acr(1) for most of my projects for a long while
>
> acr seems to handle OS guessing quite badly. It checks if uname is the
> GNU version and then adds -gnu to the system type if it is? What if
> the system is a uClibc-based one that uses the GNU version of uname?

Why the fucking hell should the fucking build tool know shit about the
OS it is running on?!?!?!

If you need to do OS guessing, that is a clear sign that you are doing
things *wrong* 99% of the time.

uriel

> gcc -dumpmachine would be a better way IMO (though probably not the
> best anyway, at least if the system has some other compiler than gcc..)
>
> --
> Daniel
>
>



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-26 Thread Uriel
On Mon, Jan 25, 2010 at 4:40 PM, pancake  wrote:
> PD: Is there any tutorial or good documentation about how to use mk?

http://doc.cat-v.org/plan_9/4th_edition/papers/mk
http://doc.cat-v.org/plan_9/4th_edition/papers/mkfiles

> because 'make' is nice, but it's too shell-dependent and this forces the
> execution to fork for most basic operations, slowing down the execution,
> and there are many other things that make 'make' inefficient in some
> situations.

If any of those things are a concern, you are clearly doing things
*completely wrong*.

uriel

> But I don't know if mk will be better for that.
>
> About cmake: I never liked it because it's C++ and it is not everywhere
> (you have to explicitly install it), and that's a pain in the ass for
> distributing apps. I like to depend on as few things as possible.
>
> Another build system I tried was 'waf'[3], and I got really exhausted
> of changing the rule files to match the latest version of waf (they
> changed the API many times, at least when I was using it). The good
> thing about waf is that it's python (I don't like python, but it's
> everywhere), so there's no limitation on shell commands and forks, and
> the configure/make stages are done more nicely than in make(1) or
> autotools (they only install files that differ in timestamp, for
> example), resulting in a faster compilation and installation.
>
> Another good thing about waf is that it can be distributed with the
> project, so you don't need to install waf to compile the project. It
> only depends on python, which is something you can sadly find in all
> current distributions :)
>
> [1] http://hg.youterm.com/acr
> [2] http://radare.org
>
> Armando Di Cianno wrote:
>>
>> David,
>>
>> I worked with the people at Kitware, Inc. for a while (here in
>> beautiful upstate New York), and they wrote and maintain CMake [1].  I
>> believe KDE, IIRC, has used CMake for a while now (which is at least a
>> testament to the complexity it can handle).
>>
>> IMHO, CMake does not have a great syntax, but it's easy to learn and
>> write.  Again, IMHO, orders of magnitude easier to understand than GNU
>> auto*tools -- although it is a bit pedantic (e.g. closing if branches
>> with the condition to match the opening).
>>
>> However, for all its faults, it's *really* easy to use, and the
>> for-free GUIs (ncurses or multi-platforms' GUIs), are icing on the
>> cake.  The simple ncurses GUI is nice to have when reconfiguring a
>> project -- it can really speed things up.
>>
>>
>>>
>>> stuff like "has vsnprintf?" that configure deals with.) In addition,
>>> it'd be nice to be able to have options like "debugging", "release",
>>> "gprof-compiled", etc, similar to processor specification.
>>> It would be preferable if all
>>> object files and executables could coexist (because it's a C++
>>> template heavy
>>>
>>
>> CMake can do all this for you, and it works great with C and C++
>> projects (really, that's the only reason one would use it).
>>
>> 2¢,
>> __armando
>>
>> [1] http://cmake.org/
>>
>>
>
>
>



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-26 Thread Ryan R
I switched the development process over to gentoo where I work and
it's been awesome to say the least.



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-26 Thread David Tweed
Thanks to everyone for all the help.

I'm looking more at the development process than the distribution
process, which means different issues matter most to me. The big issue
is that I've got lots of programs which can be visualised as having
"conventional" dependencies, with a twist: suppose executable "foo"
depends upon "colourSegmentation.o". If the target processor has SSE3
instructions and there's a processor-optimised segmentation.c in the
SSE3 directory, compile and link against that; if it doesn't exist,
compile and link against the version in the GENERIC_C directory. I
don't think maintaining separate makefiles that are manually kept up
to date as new processor-optimised code gets written is going to be
reliable in the longer term. So I think I'll follow the general
advice: maintain by hand a single makefile that describes the
non-processor-specific dependencies, and then try some homebrew script
to automatically infer and add the appropriate object-file paths for
each processor-capability set. (This is probably not a common problem.)
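The per-object fallback described above could be sketched as a small
script; the SSE3 and GENERIC_C directory names come from this email,
while the pick_src helper and the file layout are invented for
illustration:

```shell
#!/bin/sh
# Sketch of the fallback: prefer a processor-optimised source file if
# one exists for this capability set, else use the generic version.
set -e
cd "$(mktemp -d)"

mkdir -p SSE3 GENERIC_C
touch SSE3/segmentation.c GENERIC_C/segmentation.c GENERIC_C/io.c

# pick_src name caps_dir -> path of the source to build name.o from
pick_src() {
    if [ -f "$2/$1.c" ]; then
        printf '%s/%s.c\n' "$2" "$1"
    else
        printf 'GENERIC_C/%s.c\n' "$1"
    fi
}

pick_src segmentation SSE3   # optimised version exists, use it
pick_src io SSE3             # no SSE3/io.c, falls back to generic
```

A generator script could run this per object file and emit the result
into the processor-capability makefile.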

> I recommend mk from Plan 9; the syntax is clean and clearly defined
> (no wondering whether it is BSD make, GNU make or some archaic
> Unix make). I found that all meta build systems suck in one way or
> another -- some do a good job at first glance, like scons, but they
> all hide what they really do, and in the end it's like trying to
> understand configure scripts if something goes wrong. make or mk are
> better choices in this regard.

Yeah. I don't mind powerful languages for doing stuff "automatically",
the problem is systems that aren't designed to be easily debuggable
when they go wrong.

-- 
cheers, dave tweed__
computer vision researcher: david.tw...@gmail.com
"while having code so boring anyone can maintain it, use Python." --
attempted insult seen on slashdot



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-26 Thread pancake
I just wanted to say that a few months ago I started to write a build
system in C, aiming to fix all those issues, but it's at quite an
early stage and only works for simple projects.

I named it 'cake', as in cooking :)

You will find the source here:

 hg clone http://hg.youterm.com/cake

Instead of makefiles you have Cakefiles, which are just plain C
include files included from cake.c, which should be distributed with
every package. Each Cakefile can include other ones, and they
describe, by filling structures, the dependencies between the modules,
programs and libraries that are going to be built.


At this point you have another .h file that you can use to change the
compiler profile, which is a struct with information about the flags,
name, etc., and when you type 'make' it compiles 'cake' and runs
'cake' to get the build done (make is used just as a proxy).

cake -i is used to install the compiled results into the system.

I would really like to move this project forward, but, like many other
projects of mine, I only develop it when I have time or when I simply
need it.

So, if any of you is interested in it, feel free to send me patches or
discuss ideas about how to design/implement it.

One of the limitations I found is the lack of parallel compilation
(like make -j), which should be implemented, but having all those
structs in memory saves some time on Makefile parsing and dependency
calculation.

For complex things I would run shell scripts from cake, like in make,
but more explicitly, so you are always splitting each piece of
functionality into a separate file.
--pancake



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-26 Thread Anselm R Garbe
2010/1/26 pancake :
> Anselm R Garbe wrote:
>>
>> What about the good old way of providing one master makefile for each
>> platform instead of these scripts that are doomed to fail anyways
>> sooner or later?
>>
>>
>
> It's not only about platform. For small projects I find single
> makefiles OK, but for big ones you need to separate the
> configuration/build steps, because you need information about which
> libraries, include files, programs, etc. are on the system. In some
> situations you can just create a .mk file with such information, but
> you will probably need to export some of it into a .h file too.
>
> Sometimes the configuration step is not as simple as "run this and
> tell me if it works or not". This makes a shell script much more
> manageable than a makefile, because it's not a job for make, and
> otherwise you end up forking from make, which is inefficient and ugly
> to maintain.
>
> About having a different mk file for each platform... well, I always
> find it annoying to have to maintain N files that do the same thing.
> It's also a mess for packaging, because there's no standard for this,
> so automating the build or doing some packaging becomes a mess.
>
> You get the package and you have to spend a few seconds identifying
> which makefile to use, and then, if you get compilation errors, you
> have to guess which dependencies are missing. Then you can try to
> find the INSTALL or README files to see what's missing, or try to fix
> the program if you find it's not a dependency problem. So this makes
> the build process simpler, but more error-prone and more annoying for
> packagers and users.
>
> The only good thing about autotools is that it is the standard, and
> this has greatly simplified the steps of development, packaging,
> compilation and deployment. That is: make dist, make mrproper,
> automatic detection of file dependencies, checks for dependencies,
> etc.
>
> For suckless projects I see no logic in using such a monster, but for
> big projects it is often a must, because you end up with conditional
> dependencies, recursive checks to ensure the consistency of a
> program, etc.
>
> If you have a build farm or any massive-compilation environment, you
> expect all the packages to build and react in the same way. But this
> is not true. There are some basics of software packaging that not
> everybody understands or knows.
>
> Things like sandboxed installation (make DESTDIR=/foo), cross-path
> compilation (VPATH), optimization flag detection for the compiler,
> make dist, etc. are things that many makefile-only projects fail to
> do. I'm not trying to say that all packages must have these things,
> but they standardize the build and install process, and simplify
> development and maintenance.
>
> I wrote 'acr' because I was looking for something
> ./configure-compatible, but lightweight and simpler to maintain than
> the m4 approach. It works quite well for me, but I try to only use it
> for projects that really need to split the build into two steps. For
> building on plan9 I just distribute a separate mkfile which doesn't
> depend on the configure stage.
>
> But plan9 is IMHO a platform different enough that acr should not try
> to support it, because the makefiles would have to be completely
> different too.

Well, I've heard these reasons before and I don't buy them. There are
toolchains like the BSD ones, and they prove pretty well that the
"everything is a Makefile" approach is the most portable and
sustainable one. Running a configure script from 10 years ago will
fail immediately.

I know that your problem vector is different, but I think reinventing
square wheels like autoconf again is not helping us any further. And I
really believe that sticking to mk or make files in large projects
saves you a lot of headaches in the long term (think years ahead, like
10 years or so).

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-26 Thread pancake

Anselm R Garbe wrote:

What about the good old way of providing one master makefile for each
platform instead of these scripts that are doomed to fail anyways
sooner or later?

  

It's not only about platform. For small projects I find single
makefiles OK, but for big ones you need to separate the
configuration/build steps, because you need information about which
libraries, include files, programs, etc. are on the system. In some
situations you can just create a .mk file with such information, but
you will probably need to export some of it into a .h file too.

Sometimes the configuration step is not as simple as "run this and
tell me if it works or not". This makes a shell script much more
manageable than a makefile, because it's not a job for make, and
otherwise you end up forking from make, which is inefficient and ugly
to maintain.
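A minimal version of that split might look like the following sketch;
the feature names, paths and the faked probe are invented here, this
is not acr's actual output:

```shell
#!/bin/sh
# Sketch of a hand-written configure step: probe the system once, then
# export the results both as make variables (config.mk) and as a C
# header (config.h). A real script would compile a test program for
# the probe; here the result is simply assumed.
set -e
cd "$(mktemp -d)"

HAVE_VSNPRINTF=1   # pretend the probe succeeded

cat > config.mk <<EOF
PREFIX = /usr/local
HAVE_VSNPRINTF = $HAVE_VSNPRINTF
EOF

cat > config.h <<EOF
#define HAVE_VSNPRINTF $HAVE_VSNPRINTF
EOF

cat config.mk config.h
```

The makefiles then just `include config.mk`, and the sources just
`#include "config.h"`, so make itself never forks to probe anything.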

About having a different mk file for each platform... well, I always
find it annoying to have to maintain N files that do the same thing.
It's also a mess for packaging, because there's no standard for this,
so automating the build or doing some packaging becomes a mess.

You get the package and you have to spend a few seconds identifying
which makefile to use, and then, if you get compilation errors, you
have to guess which dependencies are missing. Then you can try to find
the INSTALL or README files to see what's missing, or try to fix the
program if you find it's not a dependency problem. So this makes the
build process simpler, but more error-prone and more annoying for
packagers and users.

The only good thing about autotools is that it is the standard, and
this has greatly simplified the steps of development, packaging,
compilation and deployment. That is: make dist, make mrproper,
automatic detection of file dependencies, checks for dependencies,
etc.

For suckless projects I see no logic in using such a monster, but for
big projects it is often a must, because you end up with conditional
dependencies, recursive checks to ensure the consistency of a program,
etc.

If you have a build farm or any massive-compilation environment, you
expect all the packages to build and react in the same way. But this
is not true. There are some basics of software packaging that not
everybody understands or knows.

Things like sandboxed installation (make DESTDIR=/foo), cross-path
compilation (VPATH), optimization flag detection for the compiler,
make dist, etc. are things that many makefile-only projects fail to
do. I'm not trying to say that all packages must have these things,
but they standardize the build and install process, and simplify
development and maintenance.

I wrote 'acr' because I was looking for something
./configure-compatible, but lightweight and simpler to maintain than
the m4 approach. It works quite well for me, but I try to only use it
for projects that really need to split the build into two steps. For
building on plan9 I just distribute a separate mkfile which doesn't
depend on the configure stage.

But plan9 is IMHO a platform different enough that acr should not try
to support it, because the makefiles would have to be completely
different too.

--pancake



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-26 Thread Anselm R Garbe
2010/1/26 pancake :
> On Jan 26, 2010, at 8:10 AM, Daniel Bainton  wrote:
>
>> 2010/1/25 pancake :
>>>
>>> I have been using make(1) and acr(1) for most of my projects for a long
>>> while
>>
>> acr seems to have the OS guessing quite bad. It checks if uname is the
>> GNU version and then adds -gnu to the system type if it is? What if
>> the system is a uClibc based one that uses the GNU version of uname?
>>
>> gcc -dumpmachine would be a better way IMO (though probably not the
>> best anyway, atleast if the system has some other compiler than gcc..)
>>
> It cannot depend on gcc. What about cross-compiling? What about non-C
> projects?
>
> That string is just a rough guide IMHO. So I simplified the algorithm
> to handle the most common situations.
>
> The code in autoconf that does this is really painful. And I don't
> really get the point of having a more accurate and complex host
> string resolution.
>
> Do you have any other proposal to enhance it? With --target, --host
> and --build you can change the default string.

What about the good old way of providing one master makefile for each
platform instead of these scripts that are doomed to fail anyways
sooner or later?

Cheers,
Anselm



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-26 Thread Daniel Bainton
2010/1/26 pancake :
>
>
> On Jan 26, 2010, at 8:10 AM, Daniel Bainton  wrote:
>
>> 2010/1/25 pancake :
>>>
>>> I have been using make(1) and acr(1) for most of my projects for a long
>>> while
>>
>> acr seems to have the OS guessing quite bad. It checks if uname is the
>> GNU version and then adds -gnu to the system type if it is? What if
>> the system is a uClibc based one that uses the GNU version of uname?
>>
>> gcc -dumpmachine would be a better way IMO (though probably not the
>> best anyway, atleast if the system has some other compiler than gcc..)
>>
> It cannot depend on gcc. What about cross-compiling? What about non-C
> projects?
>
> That string is just a rough guide IMHO. So I simplified the algorithm
> to handle the most common situations.
>
> The code in autoconf that does this is really painful. And I don't
> really get the point of having a more accurate and complex host
> string resolution.
>
> Do you have any other proposal to enhance it? With --target, --host
> and --build you can change the default string.

I can't think of a better way right now, but on stali, for example,
that will give the wrong build string.

--
Daniel



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-25 Thread pancake



On Jan 26, 2010, at 8:10 AM, Daniel Bainton  wrote:


2010/1/25 pancake :
I have been using make(1) and acr(1) for most of my projects for a  
long while


acr seems to have the OS guessing quite badly. It checks whether uname
is the GNU version and then adds -gnu to the system type if it is?
What if the system is a uClibc-based one that uses the GNU version of
uname?

gcc -dumpmachine would be a better way IMO (though probably not the
best anyway, at least if the system has some other compiler than gcc..)

It cannot depend on gcc. What about cross-compiling? What about non-C
projects?

That string is just a rough guide IMHO. So I simplified the algorithm
to handle the most common situations.

The code in autoconf that does this is really painful. And I don't
really get the point of having a more accurate and complex host string
resolution.

Do you have any other proposal to enhance it? With --target, --host
and --build you can change the default string.






Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-25 Thread Andres Perera
I'd say stay away from cmake. It's very complicated.

I'd like to try plan9 mk, but in the meantime, one more vote for good old make.

Andres



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-25 Thread Daniel Bainton
2010/1/25 pancake :
> I have been using make(1) and acr(1) for most of my projects for a long while

acr seems to have the OS guessing quite badly. It checks whether uname
is the GNU version and then adds -gnu to the system type if it is?
What if the system is a uClibc-based one that uses the GNU version of
uname?

gcc -dumpmachine would be a better way IMO (though probably not the
best anyway, at least if the system has some other compiler than gcc..)

--
Daniel



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-25 Thread pancake

anonymous wrote:

Radare INSTALL says "The WAF build system is supossed to replace the
ACR one.". This means that ACR is going to be replaced with WAF?

In the section "HOW TO COMPILE" there is "Standard way" with
configure && make && make install and "Alternative (going to be
deprecated)" based on waf. This means that WAF is going to be replaced
with ACR?


  

That's deprecated stuff. I will not change to waf, but I'm keeping
both build systems for people having problems with make (dunno who,
actually).

waf is configure+build, and I got bored of it because they changed the
API many times, and I spent more time fixing .py files than coding :P

I recommend you to check out the radare2 build system. r1 is a mess :)
but it's kinda funny trash and dirty coding ;)

--pancake



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-25 Thread anonymous
Radare INSTALL says "The WAF build system is supossed to replace the
ACR one.". This means that ACR is going to be replaced with WAF?

In the section "HOW TO COMPILE" there is "Standard way" with
configure && make && make install and "Alternative (going to be
deprecated)" based on waf. This means that WAF is going to be replaced
with ACR?




Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-25 Thread pancake
I have been using make(1) and acr(1) for most of my projects for a
long while, and I'm pretty happy with them. But I have to agree that
make lacks so many things, and it is bloated enough to think about
moving to mk(1).

Some projects like perl use perl (miniperl) to generate makefiles from
simpler template files, so the makefile maintenance is easier.

I used to write makefiles manually (not using any GNU auto-shit); I
have started to write 'amr' as a minimal automake-compatible
replacement, together with 'acr' (autoconf replacement), which is
already a usable solution.

AMR is quite broken atm and works only for simple test cases, but ACR
is probably the best alternative to autoconf. It generates a 15KB
configure script in POSIX-compatible shell script instead of the
common 300K from GNU, and yeah, it's readable. I have used ACR for
building on Solaris, *BSD, Linux and cygwin/mingw32, and I'm happy
with it.

In radare2[2] I used acr to check and configure the build system for a
prefix, check dependencies, system, bit size, plugins, compilation
options, etc., and then it generates two makefiles which import some
.mk files containing the rules for the rest of the modules.

I think that if you have a big project that you have to maintain with
makefiles, it is better to group blocks that share the same rules:
just configure those elements with a few variables used by the .mk
files, which set up one set of rules or another depending on the
module type, or just include a different .mk file. This will make your
project makefiles 3-4 lines long and much more maintainable. Check the
radare2 hg repo if you are interested in this.
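The 3-4-line module makefiles described above might look something
like this sketch (the file, variable and target names are invented
here, not radare2's actual layout):

```make
# modules/foo/Makefile -- the entire per-module makefile
NAME = foo
OBJS = foo.o parse.o
include ../../rules.mk
```

with ../../rules.mk written once, defining the shared pattern rules
and the lib$(NAME).a (or program) targets that every module builds the
same way.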

If you are looking for ACR usage examples, check any of the other
projects on hg.youterm.com, or just see the one in radare.

PD: Is there any tutorial or good documentation about how to use mk?
Because 'make' is nice, but it's too shell-dependent, and this forces
the execution to fork for most basic operations, slowing down the
execution; there are many other things that make 'make' inefficient in
some situations. But I don't know if mk will be better for that.

About cmake: I never liked it because it's C++ and it is not
everywhere (you have to explicitly install it), and that's a pain in
the ass for distributing apps. I like to depend on as few things as
possible.

Another build system I tried was 'waf'[3], and I got really exhausted
of changing the rule files to match the latest version of waf (they
changed the API many times, at least when I was using it). The good
thing about waf is that it's python (I don't like python, but it's
everywhere), so there's no limitation on shell commands and forks, and
the configure/make stages are done more nicely than in make(1) or
autotools (they only install files that differ in timestamp, for
example), resulting in a faster compilation and installation.

Another good thing about waf is that it can be distributed with the
project, so you don't need to install waf to compile the project. It
only depends on python, which is something you can sadly find in all
current distributions :)

[1] http://hg.youterm.com/acr
[2] http://radare.org

Armando Di Cianno wrote:

David,

I worked with the people at Kitware, Inc. for a while (here in
beautiful upstate New York), and they wrote and maintain CMake [1].  I
believe KDE, IIRC, has used CMake for a while now (which is at least a
testament to the complexity it can handle).

IMHO, CMake does not have a great syntax, but it's easy to learn and
write.  Again, IMHO, orders of magnitude easier to understand than GNU
auto*tools -- although it is a bit pedantic (e.g. closing if branches
with the condition to match the opening).

However, for all its faults, it's *really* easy to use, and the
for-free GUIs (ncurses or multi-platforms' GUIs), are icing on the
cake.  The simple ncurses GUI is nice to have when reconfiguring a
project -- it can really speed things up.

  

stuff like "has vsnprintf?" that configure deals with.) In addition,
it'd be nice to be able to have options like "debugging", "release",
"gprof-compiled", etc, similar to processor specification.
It would be preferable if all
object files and executables could coexist (because it's a C++
template heavy



CMake can do all this for you, and it works great with C and C++
projects (really, that's the only reason one would use it).

2¢,
__armando

[1] http://cmake.org/

  





Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-25 Thread Armando Di Cianno
David,

I worked with the people at Kitware, Inc. for a while (here in
beautiful upstate New York), and they wrote and maintain CMake [1].  I
believe KDE, IIRC, has used CMake for a while now (which is at least a
testament to the complexity it can handle).

IMHO, CMake does not have a great syntax, but it's easy to learn and
write.  Again, IMHO, orders of magnitude easier to understand than GNU
auto*tools -- although it is a bit pedantic (e.g. closing if branches
with the condition to match the opening).

However, for all its faults, it's *really* easy to use, and the
for-free GUIs (ncurses or multi-platforms' GUIs), are icing on the
cake.  The simple ncurses GUI is nice to have when reconfiguring a
project -- it can really speed things up.

> stuff like "has vsnprintf?" that configure deals with.) In addition,
> it'd be nice to be able to have options like "debugging", "release",
> "gprof-compiled", etc, similar to processor specification.
> It would be preferable if all
> object files and executables could coexist (because it's a C++
> template heavy

CMake can do all this for you, and it works great with C and C++
projects (really, that's the only reason one would use it).

2¢,
__armando

[1] http://cmake.org/



Re: [dev] [OFFTOPIC] Recommended meta-build system

2010-01-25 Thread Anselm R Garbe
Hi David,

2010/1/25 David Tweed :
> I'm wondering if anyone has had particularly good experiences with any
> meta-build system (cmake, etc) in the following circumstances:
>
> I will have a large codebase which consists of some generic files and
> some processor-specific files. (I'm not worried about OS environment
> stuff like "has vsnprintf?" that configure deals with.) In addition,
> it'd be nice to be able to have options like "debugging", "release",
> "gprof-compiled", etc, similar to processor specification. I need to
> be able to select the appropriate files for a given build and compile
> them together to form an executable. It would be preferable if all
> object files and executables could coexist (because it's a C++
> template-heavy source base, which means individual files compile
> relatively slowly, so it'd be preferable only to recompile if the
> source has actually changed) using directories or naming conventions.
>
> I've been doing some reading about things like cmake and SCons but
> most strike me as having "built-in logic for their normal way of
> doing things" and being relatively clunky if you specify something
> different. (Incidentally, when I say meta-build system I mean that I
> don't mind if it builds things directly or if it outputs makefiles
> that can be invoked.) Does anyone have any experiences of using any
> tool for this kind of purpose?
>
> (One option would be to just have a static makefile and then do some
> include-path hackery to select processor-specific directories to pick
> specific versions of files depending on options, and then rely on
> ccache to pick up the correct object file from the cache rather than
> recompiling. But that feels like a hack for avoiding a more
> expressive build system.)
>
> Many thanks for sharing any experiences,

I recommend mk from Plan 9; the syntax is clean and clearly defined
(no wondering whether it is BSD make, GNU make or some archaic
Unix make). I found that all meta build systems suck in one way or
another -- some do a good job at first glance, like scons, but they
all hide what they really do, and in the end it's like trying to
understand configure scripts if something goes wrong. make or mk are
better choices in this regard.
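For comparison, a minimal mkfile for a C program looks like the
following sketch (the file names are invented; the %/$stem meta-rule
syntax is mk's own, per the mk paper):

```mk
# mkfile -- one dialect, no wondering about make-variant extensions
CC=cc
OBJS=foo.o util.o

foo: $OBJS
	$CC -o foo $OBJS

%.o: %.c
	$CC -c $stem.c
```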

Having been involved in a lot of embedded development over the last
few years, I'd also say that one build chain I really like is what
Google did to build Android (which also inspired my stali efforts).
They got rid of configure and meta build systems nearly completely and
are using makefiles instead. Well, of course the BSDs have done this
for decades too ;)

But make-based build chains seem to be the ones with the least
headaches in my experience. It's worth the extra effort to write all
those make or mk files from scratch; in the end it'll save you a lot
of time.

Cheers,
Anselm



[dev] [OFFTOPIC] Recommended meta-build system

2010-01-24 Thread David Tweed
Hi,

I'm wondering if anyone has had particularly good experiences with any
meta-build system (cmake, etc) in the following circumstances:

I will have a large codebase which consists of some generic files and
some processor-specific files. (I'm not worried about OS environment
stuff like "has vsnprintf?" that configure deals with.) In addition,
it'd be nice to be able to have options like "debugging", "release",
"gprof-compiled", etc, similar to processor specification. I need to
be able to select the appropriate files for a given build and compile
them together to form an executable. It would be preferable if all
object files and executables could coexist (because it's a C++
template-heavy source base, which means individual files compile
relatively slowly, so it'd be preferable only to recompile if the
source has actually changed) using directories or naming conventions.

I've been doing some reading about things like cmake and SCons but
most strike me as having "built-in logic for their normal way of doing
things" and being relatively clunky if you specify something
different. (Incidentally, when I say meta-build system I mean that I
don't mind if it builds things directly or if it outputs makefiles
that can be invoked.) Does anyone have any experiences of using any
tool for this kind of purpose?

(One option would be to just have a static makefile and then do some
include-path hackery to select processor-specific directories to pick
specific versions of files depending on options, and then rely on
ccache to pick up the correct object file from the cache rather than
recompiling. But that feels like a hack for avoiding a more
expressive build system.)
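The include-path option in the last paragraph could be sketched as a
makefile fragment like this (the directory layout, ARCH variable and
build-dir naming are invented for illustration):

```make
# One static makefile; the search path is reordered per build flavour
# so arch-specific files shadow the generic ones, per-flavour object
# directories let the builds coexist, and ccache deduplicates
# recompiles of identical preprocessed input.
ARCH ?= generic
CXX   = ccache g++
CPPFLAGS = -Iproc/$(ARCH) -Iproc/generic

build-$(ARCH)/%.o: %.cpp
	$(CXX) $(CPPFLAGS) -c -o $@ $<
```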

Many thanks for sharing any experiences,

-- 
cheers, dave tweed__
computer vision researcher: david.tw...@gmail.com
"while having code so boring anyone can maintain it, use Python." --
attempted insult seen on slashdot