On Saturday, 18 July 2015 at 10:29:20 UTC, Laeeth Isharc wrote:
> accurate understanding of reality to do so. The propensity to
> put things on github and for people to ask questions on
> stackoverflow varies according to the problem domain.
StackOverflow has become the de facto documentation resource for
software engineers. It saves me insane amounts of time, and many
other programmers say the same thing. Google has been known to
shut down its own support forums in order to drive activity
toward StackOverflow.
You cannot gloss over the importance of this.
>> Large scale batch-processing cannot drive adoption.
>> Specialized solutions like Chapel and C++/extensions will take
>> the batch-throughput market.
>
> I didn't say anything about batch processing. It's also very
> intriguing to see you believe you know my problem domain better
> than me.
I have no interest in your problem domain, but you say that
throughput is important for you.
I see basically 4 reasons to use languages like C++/D/Rust:
1. Low level hardware/OS access
2. Throughput
3. Lowered memory usage
4. Detailed control over execution patterns.
>> In the 80s lots of software was close to theoretical
>> throughput. Today, almost no software is anywhere close,
>> because it is waaaay too expensive in terms of developer time
>> as code base sizes increase.
>
> We are speaking of shifts at the margin from where we start
> today, and about the future, not historical trends over the
> past decade or two.
The trend in consumer hardware is that GPUs share memory with the
CPU or have fast paths to it. So efficiency increasingly means
GPGPU programming: Metal, Vulkan, and the like.
The trend is that the few who can pay for throughput move
towards FPGAs and other specialized solutions where it matters.
The trend in high-performance server hardware is that CPUs have
their own local memory (NUMA).
The historical trend is that those expensive solutions become
commoditized in one way or another. Meaning, consumer hardware
adopts some features from high-end specialized hardware over
time.