On Saturday, 18 July 2015 at 08:17:50 UTC, Ola Fosheim Grøstad wrote:
On Saturday, 18 July 2015 at 02:45:54 UTC, Laeeth Isharc wrote:
On Friday, 17 July 2015 at 12:06:08 UTC, Ola Fosheim Grøstad wrote:
For what domain is D the best choice?

You are switching the question without recognizing this - some kind of fallacy of composition.

There is no fallacy here.

We are each entitled to our judgement about whether D's being, in some Platonic sense, the best single option for an entire domain has much to do with whether it is, practically speaking, a sensible choice for commercially pragmatic people operating within a certain set of uses. We are likewise entitled to our views about the usefulness of thinking in terms of global optimality amidst uncertainty and local constraints. I shall not say more, as there are diminishing returns to what is generative.

You cannot compete until you have something that is the best possible starting point for a range of commercially- or community-backed frameworks.

You don't have growth until you have clearly increasing presence on Github and StackOverflow.

We are each entitled to our judgement about whether it's healthy to think in that way, and whether doing so leads to an accurate understanding of reality. The propensity to put things on GitHub, and for people to ask questions on Stack Overflow, varies with the problem domain. In web development it's common to open-source code and to ask questions on Stack Overflow. In some other sectors I am familiar with, which account for a decent share of technology spending overall, it's rather less common - and this is not a capricious cultural trait but the result of commercial people applying common sense. Liquidity tends to concentrate in certain places: for D, I wouldn't suggest anyone go to Stack Overflow for a good and quick answer to their question. It's healthy that liquidity is concentrated.

Large scale batch-processing cannot drive adoption. Specialized solutions like Chapel and C++/extensions will take the batch-throughput market.

I didn't say anything about batch processing. It's also very intriguing that you believe you know my problem domain better than I do.


The work of Austrian economists on entrepreneurship demonstrates that it simply is not possible to know which people will use a product, or how. The future is unknown, if not unimaginable.

It isn't. The historical success-stories within the domain of programming languages are pretty clear.

And there has been no change to the terrain, so the lessons of the past can be applied without careful thought to present conditions that are different? I certainly won't dream of arguing further with you.


If you presume programmer productivity is the only thing that matters and treat efficiency like a free resource, it's a dead cert that at some point efficiency will no longer be free.

More and more problems are solved by less and less efficiency. Development time, reliability, maintainability, evolution and perceived responsiveness are the most important factors.

mmm.

In the 80s lots of software was close to theoretical throughput. Today, almost no software is anywhere close, because it is waaaay too expensive in terms of developer time as code base sizes increase.

We are speaking of shifts at the margin from where we start today, and about the future, not historical trends over the past decade or two.

D has a chance to gain adoption by picking a direction
So you tend to assert.

focusing on improving the process. But it is a long road, that also requires some cleanup of language.
Sure. Hard to find anyone who wouldn't agree with the above.
