Re: Linux Agora D thread

2010-11-19 Thread Bruno Medeiros

On 22/10/2010 11:17, retard wrote:

Fri, 22 Oct 2010 02:42:49 -0700, Walter Bright wrote:


retard wrote:


Why I think the D platform's risk is so high is because the author
constantly refuses to give ANY estimates on feature schedules.


Would you believe them if I did?


http://en.wikipedia.org/wiki/Software_development_process

Without project management, software projects can easily be delivered
late or over budget. With large numbers of software projects not meeting
their expectations in terms of functionality, cost, or delivery schedule,
effective project management appears to be lacking.

http://en.wikipedia.org/wiki/Estimation_in_software_engineering

The ability to accurately estimate the time and/or cost taken for a
project to come in to its successful conclusion is a serious problem for
software engineers. The use of a repeatable, clearly defined and well
understood software development process has, in recent years, shown
itself to be the most effective method of gaining useful historical data
that can be used for statistical estimation. In particular, the act of
sampling more frequently, coupled with the loosening of constraints
between parts of a project, has allowed more accurate estimation and more
rapid development times.

http://en.wikipedia.org/wiki/Application_Lifecycle_Management

Proponents of application lifecycle management claim that it
* Increases productivity, as the team shares best practices for
development and deployment, and developers need focus only on current
business requirements
* Improves quality, so the final application meets the needs and
expectations of users
* Breaks boundaries through collaboration and smooth information flow
* Accelerates development through simplified integration
* Cuts maintenance time by synchronizing application and design
* Maximizes investments in skills, processes, and technologies
* Increases flexibility by reducing the time it takes to build and adapt
applications that support new business initiatives

http://en.wikipedia.org/wiki/Cowboy_coding

Lack of estimation or implementation planning may cause a project to be
delayed. Sudden deadlines or pushes to release software may encourage the
use of "quick and dirty" or "code and fix" techniques that will require
further attention later.

Cowboy coding is common at the hobbyist or student level where
developers may initially be unfamiliar with the technologies, such as the
build tools, that the project requires.

Custom software applications, even when using a proven development
cycle, can experience problems with the client concerning requirements.
Cowboy coding can accentuate this problem by not scaling the requirements
to a reasonable timeline, and may result in unused or unusable components
being created before the project is finished. Similarly, projects with
less tangible clients (often experimental projects, see independent game
development) may begin with code and never a formal analysis of the
design requirements. Lack of design analysis may lead to incorrect or
insufficient technology choices, possibly requiring the developer to port
or rewrite their software in order for the project to be completed.

Many software development models, such as Extreme Programming, use an
incremental approach which stresses functional prototypes at each phase.
Non-managed projects may have few unit tests or working iterations,
leaving an incomplete project unusable.



What's your point with all of this? That Walter should do estimates?


--
Bruno Medeiros - Software Engineer


Re: Linux Agora D thread

2010-10-25 Thread Steven Schveighoffer
On Fri, 22 Oct 2010 03:09:31 -0400, Walter Bright  
newshou...@digitalmars.com wrote:



Jonathan M Davis wrote:
In any case, that poster seems knowledgeable enough that I don't see  
much point in arguing with him. His opinion obviously differs from that  
of most of us on this list, but it's generally based quite soundly on  
facts, so only time will prove him wrong.


Sure, but it all depends on how one interprets those facts.

For example, C++ is hardly the same language it was in 1988 or so, when  
it became widely used. I don't think any pre-2000 C++ compiler would be  
remotely considered usable today. Languages that are not dead go through  
substantial revisions and upgrades. It is not a defect in D that it does  
so, too.


As anyone can see in the changelog, we've stopped adding features to D2  
and are working on toolchain issues. There's been a lot of progress  
there.


While I agree D2 will be a great platform to develop with, it's currently  
unusable for any major project IMO.


I think the #1 reason is this: yes, the D2 compiler has stopped adding
features, but the D2 standard library is made up of half-baked components
and rapidly changing ones (and it gains new instances of these monthly).


Before we can compare apples to apples, we need to stabilize phobos.  But
I don't think we should rush this; let's make phobos the best it can be
first, and then freeze it.


I'll say that I developed a medium sized project with Tango, and I think  
at this point, if I wanted to upgrade, I would have to spend significant  
time porting it to the latest version.  That was only about a year and a  
half ago.  Tango may have stabilized in recent times (haven't looked at it  
in a while), but phobos 2 is nowhere near as usable as Tango was a year  
and a half ago.  Let's focus on getting it there and stop worrying about
how some guy doesn't like D.


-Steve


Re: Linux Agora D thread

2010-10-25 Thread Walter Bright

Steven Schveighoffer wrote:
I'll say that I developed a medium sized project with Tango, and I think 
at this point, if I wanted to upgrade, I would have to spend significant 
time porting it to the latest version.  That was only about a year and a 
half ago.  Tango may have stabilized in recent times (haven't looked at 
it in a while), but phobos 2 is nowhere near as usable as Tango was a 
year and a half ago.  Let's focus on getting it there and stop worrying
about how some guy doesn't like D.


Haters are always gonna hate, no matter how good D is. But we will continue to 
hammer down all the nails that are sticking up.


Re: Linux Agora D thread

2010-10-22 Thread Jonathan M Davis
On Thursday 21 October 2010 23:18:15 Gour wrote:
 Hello!
 
 Few days ago I sent a short post to Linux Agora site explaining why
 I'm not participating any longer in 'learning Haskell' reading group
 and suggesting D as possible target of one of the future groups.
 
 However, it looks that some are not excited about it:
 
 http://www.linuxagora.com/vbforum/showpost.php?p=6313
 
 I've tried to reply as best as I know, but...
 
 He posted to his Hacker News as well:
 
 http://news.ycombinator.com/item?id=1783810
 
 If anyone has something to add, pls. feel free to do it...
 
 
 Sincerely,
 Gour

For the most part, he seems to be spot on - certainly he's more knowledgeable 
than many D detractors out there. The main thing that I'd disagree with would 
be 
D's future, since he obviously thinks that it's never going to go anywhere, but 
only time will tell on that one.  My guess is that he was an active D user who 
got sick of things not stabilizing, which is quite understandable. While things 
have improved considerably, there are quite a few bugs in the compiler, and it 
doesn't yet completely implement the D spec per TDPL. However, the situation 
has 
much improved with the language as a whole no longer having massive changes on 
a 
semi-regular basis, so there's definitely a light at the end of the tunnel.

Still, at this point, you have people comparing D to languages which are 
essentially complete with compilers lacking much in the way of serious bugs, 
and 
D just doesn't measure up to that yet, even if it has great promise, and it 
works well enough to do a lot with it. Many such people just aren't going to 
want anything to do with D until it's stable in the way that C++ or Java is 
stable - both the compiler and the standard library.

In any case, that poster seems knowledgeable enough that I don't see much point 
in arguing with him. His opinion obviously differs from that of most of us on 
this list, but it's generally based quite soundly on facts, so only time will 
prove him wrong.

- Jonathan M Davis


Re: Linux Agora D thread

2010-10-22 Thread Walter Bright

Jonathan M Davis wrote:
In any case, that poster seems knowledgeable enough that I don't see much point 
in arguing with him. His opinion obviously differs from that of most of us on 
this list, but it's generally based quite soundly on facts, so only time will 
prove him wrong.


Sure, but it all depends on how one interprets those facts.

For example, C++ is hardly the same language it was in 1988 or so, when it 
became widely used. I don't think any pre-2000 C++ compiler would be remotely 
considered usable today. Languages that are not dead go through substantial 
revisions and upgrades. It is not a defect in D that it does so, too.


As anyone can see in the changelog, we've stopped adding features to D2 and are 
working on toolchain issues. There's been a lot of progress there.


Re: Linux Agora D thread

2010-10-22 Thread Gour
On Thu, 21 Oct 2010 23:51:20 -0700
 Jonathan == Jonathan M Davis wrote:

Jonathan In any case, that poster seems knowledgeable enough that I
Jonathan don't see much point in arguing with him. His opinion
Jonathan obviously differs from that of most of us on this list, but
Jonathan it's generally based quite soundly on facts, so only time
Jonathan will prove him wrong.

Thanks.

As far as I'm concerned, I plan to tackle the 2nd chapter of TDPL. ;)


Sincerely,
Gour

-- 

Gour  | Hlapicina, Croatia  | GPG key: CDBF17CA





Re: Linux Agora D thread

2010-10-22 Thread Jonathan M Davis
On Friday 22 October 2010 00:09:31 Walter Bright wrote:
 Jonathan M Davis wrote:
  In any case, that poster seems knowledgeable enough that I don't see much
  point in arguing with him. His opinion obviously differs from that of
  most of us on this list, but it's generally based quite soundly on
  facts, so only time will prove him wrong.
 
 Sure, but it all depends on how one interprets those facts.
 
 For example, C++ is hardly the same language it was in 1988 or so, when it
 became widely used. I don't think any pre-2000 C++ compiler would be
 remotely considered usable today. Languages that are not dead go through
 substantial revisions and upgrades. It is not a defect in D that it does
 so, too.
 
 As anyone can see in the changelog, we've stopped adding features to D2 and
 are working on toolchain issues. There's been a lot of progress there.

Oh, I agree that he's wrong, and I agree that D2 is making serious progress, 
but 
he's got enough of his facts right that I don't think that he can be convinced 
by correcting what he's saying. I see value in correcting people if they 
misunderstand the situation, but trying to convince someone whose opinion 
differs 
when they have their facts more or less straight is likely to just result in 
heated arguments.

The fact that D2 is not 100% stable is, of course, not something that we want, 
but I do agree that it's completely understandable why D is the way it is at 
the 
moment and that it's not unreasonable for it to be that way. D is improving and 
it will eventually reach the same level of stability that modern C++ compilers 
enjoy. However, it's also pretty much a given that many people won't want to 
use 
D until it has a level of stability comparable with the compilers that they use 
for more mature languages. But there's nothing that we can do about that except 
continue to improve D until it reaches that point. And the more stable it
becomes, the easier it will become to get people to try it and stick with it.

- Jonathan M Davis


Re: Linux Agora D thread

2010-10-22 Thread retard
Fri, 22 Oct 2010 00:33:15 -0700, Jonathan M Davis wrote:

 On Friday 22 October 2010 00:09:31 Walter Bright wrote:
 Jonathan M Davis wrote:
  In any case, that poster seems knowledgeable enough that I don't see
  much point in arguing with him. His opinion obviously differs from
  that of most of us on this list, but it's generally based quite
  soundly on facts, so only time will prove him wrong.
 
 Sure, but it all depends on how one interprets those facts.
 
 For example, C++ is hardly the same language it was in 1988 or so, when
 it became widely used. I don't think any pre-2000 C++ compiler would be
 remotely considered usable today. Languages that are not dead go
 through substantial revisions and upgrades. It is not a defect in D
 that it does so, too.
 
 As anyone can see in the changelog, we've stopped adding features to D2
 and are working on toolchain issues. There's been a lot of progress
 there.
 
 Oh, I agree that he's wrong, and I agree that D2 is making serious
 progress, but he's got enough of his facts right that I don't think that
 he can be convinced by correcting what he's saying. I see value in
 correcting people if they misunderstand the situation, but trying to
 convince someone whose opinion differs when they have their facts more
 or less straight is likely to just result in heated arguments.
 
 The fact that D2 is not 100% stable is, of course, not something that we
 want, but I do agree that it's completely understandable why D is the
 way it is at the moment and that it's not unreasonable for it to be that
 way. D is improving and it will eventually reach the same level of
 stability that modern C++ compilers enjoy. However, it's also pretty
 much a given that many people won't want to use D until it has a level
 of stability comparable with the compilers that they use for more mature
 languages. But there's nothing that we can do about that except continue
 to improve D until it reaches that point. And the more stable it becomes,
 the easier it will become to get people to try it and stick with it.

What annoys me the most in pro D articles is that the author usually tries
to prove (in a naive way) that, despite all the deficiencies, the language
and tool chain is better even *now* than all of the competition, or that
the *potential* is so high that the only logical conclusion is to move to D
*now*. Clearly this isn't the case. These kinds of articles give people
the wrong impression. I'm just trying to bring up the *pragmatic* point
of view.

For instance, I'm starting the implementation of a 64-bit systems/
application programming project *now*. The implementation phase will last
N months (assume an optimistic waterfall process model here). How many
weeks/months must N be, at a minimum, to make D a feasible option?

A typical lead developer / project manager has to make decisions based on 
some assumptions. E.g.

Platform        Implementation  Developer   Performance  Platform
                Time            Market      Index        Risk factor
--------------------------------------------------------------------
C/x64 Linux     12 months       good        100          medium
C++/x64 Linux   10 months       ok          110          high
Java/x64 JVM    8 months        excellent   80           low
C#/Windows 64   7 months        very good   85           low
Python/Linux    4-5 months      very good   30           low
D               12+ months?     very bad    80-115 ?     very high

The metrics are imaginary. The point was to show that language goodness 
isn't a single scalar value.

The reason I think the D platform's risk is so high is that the author
constantly refuses to give ANY estimates on feature schedules. There's no
up-to-date roadmap anywhere. The bugzilla voting system doesn't work.
Lots of production-ready core functionality is missing (for example, how
long has the D2 distribution had a commercial-quality XML framework?)

For example, gcc has had 64-bit C/C++ support for quite a long time, but
it took several years to stabilize. The implementation of 64-bit X-ray
machine firmware in D cannot begin one week after 64-bit DMD is announced.


Re: Linux Agora D thread

2010-10-22 Thread Walter Bright

Jonathan M Davis wrote:
Oh, I agree that he's wrong, and I agree that D2 is making serious progress, but 
he's got enough of his facts right that I don't think that he can be convinced 
by correcting what he's saying. I see value in correcting people if they 
misunderstand the situation, but trying to convince someone whose opinion differs 
when they have their facts more or less straight is likely to just result in 
heated arguments.


The fact that D2 is not 100% stable is, of course, not something that we want, 
but I do agree that it's completely understandable why D is the way it is at the 
moment and that it's not unreasonable for it to be that way. D is improving and 
it will eventually reach the same level of stability that modern C++ compilers 
enjoy. However, it's also pretty much a given that many people won't want to use 
D until it has a level of stability comparable with the compilers that they use 
for more mature languages. But there's nothing that we can do about that except 
continue to improve D until it reachs that point. And the more stable it 
becomes, the easier it will become to get people to try it and stick with it.


I agree there are plenty of reasons to not use D, but also a lot more reasons to 
use it.


Re: Linux Agora D thread

2010-10-22 Thread Walter Bright

retard wrote:
What annoys me the most in pro D articles is the author usually tries to 
prove (in a naive way) that despite all the deficiencies the language and 
tool chain is better even *now* than all of the competition or that the 
*potential* is so high that the only logical conclusion is to move to D 
*now*. Clearly this isn't the case. These kind of articles give people 
the wrong impression. I'm just trying to bring up the *pragmatic* point 
of view.


For instance, I'm starting the implementation of a 64-bit systems/
application programming project *now*. The implementation phase will last
N months (assume optimistic waterfall process model here). How many weeks/
months must the N at least be to make D a feasible option?


The Linux 64 bit dmd is well on its way to completion. The library compiles, and 
simple programs work. I'm working my way through the test suite (which is fairly 
extensive).


A typical lead developer / project manager has to make decisions based on 
some assumptions. E.g.


Platform        Implementation  Developer   Performance  Platform
                Time            Market      Index        Risk factor
--------------------------------------------------------------------
C/x64 Linux     12 months       good        100          medium
C++/x64 Linux   10 months       ok          110          high
Java/x64 JVM    8 months        excellent   80           low
C#/Windows 64   7 months        very good   85           low
Python/Linux    4-5 months      very good   30           low
D               12+ months?     very bad    80-115 ?     very high

The metrics are imaginary. The point was to show that language goodness 
isn't a single scalar value.


Why I think the D platform's risk is so high is because the author 
constantly refuses to give ANY estimates on feature schedules.


Would you believe them if I did?


There's no 
up-to-date roadmap anywhere. The bugzilla voting system doesn't work. 
Lots of production ready core functionality is missing (for example how 
long has d2 distribution had a commercial quality xml framework?)


That depends on whether your project needs an XML framework or not. Worst
case, which is far from bad, you can always connect to any C library.
Googling "c xml library" turned up several on the front page.
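
To make the "connect to any C library" point concrete, here is a minimal
sketch (not from the original post) of calling a plain C function from D2;
strlen is just a stand-in for whatever C XML library you would actually
bind to:

    // Hedged sketch: declaring and calling a C function from D2.
    // The extern(C) prototype only has to match the C declaration;
    // the linker then resolves it against the C library as usual.
    import std.stdio;
    import std.string : toStringz;

    extern (C) size_t strlen(const(char)* s);   // from the C runtime

    void main()
    {
        auto msg = "hello from D";
        writeln("strlen reports: ", strlen(toStringz(msg)));
    }

The same pattern, an extern(C) prototype plus the library on the link
line, is how a D program would pick up an existing C XML parser.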



For example gcc has had 64-bit C/C++ support quite long. But it took 
several years to stabilize. The implementation of a 64-bit X-ray machine 
firmware in D cannot begin one week after 64-bit DMD is announced.


It's foolish to assume any compiler is reliable if you're going to be writing 
critical software. Assuming perfection in any part of such a system is a 
thorough misunderstanding of how to create reliable systems.


I believe it is also an error to require a tool be perfect before you can pick 
it up. All that is required is that its benefit/cost is higher than that of 
other tools. D has quite a few advantages that are available with it right now.


Re: Linux Agora D thread

2010-10-22 Thread retard
Fri, 22 Oct 2010 02:42:49 -0700, Walter Bright wrote:

 retard wrote:
 
 Why I think the D platform's risk is so high is because the author
 constantly refuses to give ANY estimates on feature schedules.
 
 Would you believe them if I did?

http://en.wikipedia.org/wiki/Software_development_process

Without project management, software projects can easily be delivered 
late or over budget. With large numbers of software projects not meeting 
their expectations in terms of functionality, cost, or delivery schedule, 
effective project management appears to be lacking.

http://en.wikipedia.org/wiki/Estimation_in_software_engineering

The ability to accurately estimate the time and/or cost taken for a 
project to come in to its successful conclusion is a serious problem for 
software engineers. The use of a repeatable, clearly defined and well 
understood software development process has, in recent years, shown 
itself to be the most effective method of gaining useful historical data 
that can be used for statistical estimation. In particular, the act of 
sampling more frequently, coupled with the loosening of constraints 
between parts of a project, has allowed more accurate estimation and more 
rapid development times.

http://en.wikipedia.org/wiki/Application_Lifecycle_Management

Proponents of application lifecycle management claim that it
* Increases productivity, as the team shares best practices for 
development and deployment, and developers need focus only on current 
business requirements
* Improves quality, so the final application meets the needs and 
expectations of users
* Breaks boundaries through collaboration and smooth information flow
* Accelerates development through simplified integration
* Cuts maintenance time by synchronizing application and design
* Maximizes investments in skills, processes, and technologies
* Increases flexibility by reducing the time it takes to build and adapt 
applications that support new business initiatives

http://en.wikipedia.org/wiki/Cowboy_coding

Lack of estimation or implementation planning may cause a project to be 
delayed. Sudden deadlines or pushes to release software may encourage the 
use of "quick and dirty" or "code and fix" techniques that will require
further attention later.

Cowboy coding is common at the hobbyist or student level where 
developers may initially be unfamiliar with the technologies, such as the 
build tools, that the project requires.

Custom software applications, even when using a proven development 
cycle, can experience problems with the client concerning requirements. 
Cowboy coding can accentuate this problem by not scaling the requirements 
to a reasonable timeline, and may result in unused or unusable components 
being created before the project is finished. Similarly, projects with 
less tangible clients (often experimental projects, see independent game 
development) may begin with code and never a formal analysis of the 
design requirements. Lack of design analysis may lead to incorrect or 
insufficient technology choices, possibly requiring the developer to port 
or rewrite their software in order for the project to be completed.

Many software development models, such as Extreme Programming, use an 
incremental approach which stresses functional prototypes at each phase. 
Non-managed projects may have few unit tests or working iterations, 
leaving an incomplete project unusable.

 I believe it is also an error to require a tool be perfect before you
 can pick it up. All that is required is that its benefit/cost is higher
 than that of other tools.

That's what I said.

 D has quite a few advantages that are
 available with it right now.

But it doesn't matter. Like you said, the benefit/cost matters.


Re: Linux Agora D thread

2010-10-22 Thread Fawzi Mohamed


On 22-Oct-10, at 10:56, retard wrote:


[...]
What annoys me the most in pro D articles is the author usually tries to
prove (in a naive way) that despite all the deficiencies the language and
tool chain is better even *now* than all of the competition or that the
*potential* is so high that the only logical conclusion is to move to D
*now*. Clearly this isn't the case. These kind of articles give people
the wrong impression. I'm just trying to bring up the *pragmatic* point
of view.

For instance, I'm starting the implementation of a 64-bit systems/
application programming project *now*. The implementation phase will last
N months (assume optimistic waterfall process model here). How many weeks/
months must the N at least be to make D a feasible option?


D1/tango is feasible now (using ldc)

A typical lead developer / project manager has to make decisions based on
some assumptions. E.g.

Platform        Implementation  Developer   Performance  Platform
                Time            Market      Index        Risk factor
--------------------------------------------------------------------
C/x64 Linux     12 months       good        100          medium
C++/x64 Linux   10 months       ok          110          high
Java/x64 JVM    8 months        excellent   80           low
C#/Windows 64   7 months        very good   85           low
Python/Linux    4-5 months      very good   30           low
D               12+ months?     very bad    80-115 ?     very high

The metrics are imaginary. The point was to show that language goodness
isn't a single scalar value.

Why I think the D platform's risk is so high is because the author
constantly refuses to give ANY estimates on feature schedules. There's no
up-to-date roadmap anywhere. The bugzilla voting system doesn't work.
Lots of production ready core functionality is missing (for example how
long has d2 distribution had a commercial quality xml framework?)


D1/tango also has a good xml parser


For example gcc has had 64-bit C/C++ support quite long. But it took
several years to stabilize. The implementation of a 64-bit X-ray machine
firmware in D cannot begin one week after 64-bit DMD is announced.




Re: Linux Agora D thread

2010-10-22 Thread so
He starts with "and things they made worse than C++ (such as struct/class
asymmetry, the first is stack-based, the second GCed heap)".
So having two different keywords for exactly the same purpose (one
difference, and it isn't worth mentioning) in a language is good, and the
opposite is bad? With that kind of BS start I am not sure he has anything
to say.


On Fri, 22 Oct 2010 09:18:15 +0300, Gour g...@atmarama.net wrote:


Hello!

Few days ago I sent a short post to Linux Agora site explaining why
I'm not participating any longer in 'learning Haskell' reading group
and suggesting D as possible target of one of the future groups.

However, it looks that some are not excited about it:

http://www.linuxagora.com/vbforum/showpost.php?p=6313

I've tried to reply as best as I know, but...

He posted to his Hacker News as well:

http://news.ycombinator.com/item?id=1783810

If anyone has something to add, pls. feel free to do it...


Sincerely,
Gour




--
Using Opera's revolutionary e-mail client: http://www.opera.com/mail/


Re: Linux Agora D thread

2010-10-22 Thread Paulo Pinto
.Net does the same thing with heap/value objects.

Walter Bright newshou...@digitalmars.com wrote in message 
news:i9sipj$68...@digitalmars.com...
 so wrote:
 He starts with "and things they made worse than C++ (such as struct/class
 asymmetry, the first is stack-based, the second GCed heap)".
 So having two different keywords for exactly the same purpose (one
 difference, and it isn't worth mentioning) in a language is good, and the
 opposite is bad? With that kind of BS start I am not sure he has anything
 to say.

 I think the struct/class distinction has worked out very well in D. It's 
 one of the things we nailed. Often, people with a strong C++ background 
 don't initially see it that way, it takes a bit of explaining. 
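
As a concrete illustration of the distinction being discussed, here is a
minimal sketch (the type and variable names are made up for this example):
in D2, structs have value semantics and can live on the stack, while
classes have reference semantics and are allocated on the GC heap.

    // Hedged sketch of D2's struct/class split (illustrative names only).
    import std.stdio;

    struct Point   { int x, y; }  // value type: copied on assignment
    class  Counter { int n; }     // reference type: GC-allocated

    void main()
    {
        Point a = Point(1, 2);
        Point b = a;              // full copy of the value
        b.x = 99;
        writeln(a.x);             // prints 1: 'a' is unaffected

        auto c1 = new Counter();  // object lives on the GC heap
        auto c2 = c1;             // copies only the reference
        c2.n = 99;
        writeln(c1.n);            // prints 99: both names refer to one object
    }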




Re: Linux Agora D thread

2010-10-22 Thread Gary Whatmore
retard Wrote:

 What annoys me the most in pro D articles is the author usually tries to 
 prove (in a naive way) that despite all the deficiencies the language and 
 tool chain bla blah blah

This guy has nothing better to do? Sheesh..

 For instance, I'm starting the implementation of a 64-bit systems/
 application programming project *now*. The implementation phase will last 
 N months (assume optimistic waterfall process model here). How many weeks/
 months must the N at least be to make D a feasible option?

D has everything you need, and the rest is available via C bindings. You can
start your product now. Use DMD for 32-bit code, LDC/GDC for 64-bit. Problem
solved. The N is zero. Even hello world is usually simpler in D.
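
For reference, the "hello world" being alluded to really is this short in
D2 (a standard example, nothing project-specific):

    // The canonical D2 hello world.
    import std.stdio;

    void main()
    {
        writeln("Hello, world!");
    }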

 A typical lead developer / project manager has to make decisions based on 
 some assumptions. E.g.
 
 Platform        Implementation  Developer   Performance  Platform
                 Time            Market      Index        Risk factor
 --------------------------------------------------------------------
 C/x64 Linux     12 months       good        100          medium
 C++/x64 Linux   10 months       ok          110          high
 Java/x64 JVM    8 months        excellent   80           low
 C#/Windows 64   7 months        very good   85           low
 Python/Linux    4-5 months      very good   30           low
 D               12+ months?     very bad    80-115 ?     very high

The numbers for D are

5-6 months (almost as good as Python), very good (lots of unemployed students
reading this newsgroup), 90-150 (D was #1 in the language shootout but the guy
got jealous). Risks are very low because everyone knows C, and D is almost
compatible with C if you can't handle object-oriented metaprogramming code.

 Why I think the D platform's risk is so high is because the author 
 constantly refuses to give ANY estimates on feature schedules. There's no 
 up-to-date roadmap anywhere. The bugzilla voting system doesn't work. 
 Lots of production ready core functionality is missing (for example how 
 long has d2 distribution had a commercial quality xml framework?)

64-bit DMD, world's fastest stdlib (Phobos 2), other libraries, D3, world
domination ---

 For example gcc has had 64-bit C/C++ support quite long. But it took 
 several years to stabilize. The implementation of a 64-bit X-ray machine 
 firmware in D cannot begin one week after 64-bit DMD is announced.

We don't need X-ray machines. There is a lot of work replacing all C/C++ apps 
with D code. You know, solitaire.exe, notepad.exe, things like that. Much 
better when done in D.

 - G.W.


Re: Linux Agora D thread

2010-10-22 Thread so

On Fri, 22 Oct 2010 11:56:19 +0300, retard r...@tard.com.invalid wrote:


What annoys me the most in pro D articles is the author usually tries to
prove (in a naive way) that despite all the deficiencies the language and
tool chain is better even *now* than all of the competition or that the
*potential* is so high that the only logical conclusion is to move to D
*now*. Clearly this isn't the case. These kind of articles give people
the wrong impression. I'm just trying to bring up the *pragmatic* point
of view.


Agree on this one except for one thing: you have to accept that it has a
really high potential.



For instance, I'm starting the implementation of a 64-bit systems/
application programming project *now*. The implementation phase will last
N months (assume optimistic waterfall process model here). How many weeks/
months must the N at least be to make D a feasible option?

A typical lead developer / project manager has to make decisions based on
some assumptions. E.g.

Platform        Implementation  Developer   Performance  Platform
                Time            Market      Index        Risk factor
--------------------------------------------------------------------
C/x64 Linux     12 months       good        100          medium
C++/x64 Linux   10 months       ok          110          high
Java/x64 JVM    8 months        excellent   80           low
C#/Windows 64   7 months        very good   85           low
Python/Linux    4-5 months      very good   30           low
D               12+ months?     very bad    80-115 ?     very high


You can't compare D (at least D2) to any of them; they are final.
Do people take such risks, or even question this kind of thing? I believe
not.

Most likely it goes like:
- This is our target
- This is the best/only option for this kind of thing (educated or not)

In the past, maybe companies had to do such analysis; now they have tools
that at least work.

The next language transition has to be made by programmers themselves.

--
Using Opera's revolutionary e-mail client: http://www.opera.com/mail/


Re: Linux Agora D thread

2010-10-22 Thread Iain Buclaw
== Quote from Gour (g...@atmarama.net)'s article

 Hello!
 Few days ago I sent a short post to Linux Agora site explaining why
 I'm not participating any longer in 'learning Haskell' reading group
 and suggesting D as possible target of one of the future groups.
 However, it looks that some are not excited about it:
 http://www.linuxagora.com/vbforum/showpost.php?p=6313
 I've tried to reply as best as I know, but...
 He posted to his Hacker News as well:
 http://news.ycombinator.com/item?id=1783810
 If anyone has something to add, pls. feel free to do it...
 Sincerely,
 Gour


Sheesh, people will use any excuse just to play the "GDC is outdated" card...