Re: DIP 1028 "Make @safe the Default" is dead

2020-05-29 Thread bpr via Digitalmars-d-announce

On Friday, 29 May 2020 at 20:36:12 UTC, Walter Bright wrote:

Kenny G (famous clarinet player)


Soprano saxophone, not clarinet. They look similar, and are both 
Bb instruments (I know there are non Bb clarinets), but they 
don't sound that similar to me. Kenny G is also sometimes heard 
on other saxes, but  I've never heard him on clarinet.




Re: Beta 2.087.0

2019-06-26 Thread bpr via Digitalmars-d-announce

On Wednesday, 26 June 2019 at 06:53:03 UTC, Martin Nowak wrote:

On Sunday, 16 June 2019 at 22:47:57 UTC, Martin Nowak wrote:
Glad to announce the first beta for the 2.087.0 release, ♥ to 
the 66 contributors.


Second beta is live since yesterday.

http://dlang.org/download.html#dmd_beta 
http://dlang.org/changelog/2.087.0.html


As usual please report any bugs at
https://issues.dlang.org

-Martin


I'm curious. There was some controversy about the fix to the now 
infamous Issue 5710, and doubt about whether the PR to fix it 
would be merged. I see that the fix is now merged (great!), but 
there is also a PR (9702) to revert it, which IMO would be a shame.


Is this a feature we shouldn't get used to?


Re: B Revzin - if const expr isn't broken (was Re: My Meeting C++ Keynote video is now available)

2019-01-17 Thread bpr via Digitalmars-d-announce

On Thursday, 17 January 2019 at 01:59:29 UTC, Walter Bright wrote:

On 1/16/2019 4:19 PM, H. S. Teoh wrote:
On Wed, Jan 16, 2019 at 11:43:19PM +, John Carter via 
Digitalmars-d-announce wrote:

[...]


Yes, that's one of the outstanding qualities of D, and one 
that I was
immensely impressed with when I perused the Phobos source code 
for the

first time.
Bartosz Milewski is a C++ programmer and a Haskell fan. He once 
gave a presentation at NWCPP where he wrote a few lines of 
Haskell code. Then, he showed the same code written using C++ 
template metaprogramming.


The Haskell bits in the C++ code were highlighted in red. It 
was like a sea of grass with a shrubbery here and there. 
Interestingly, by comparing the red dots in the C++ code with 
the Haskell code, you could understand what the C++ was doing. 
Without the red highlighting, it was a hopeless wall of < > :-)


Was that a pre-C++11 version of C++, or a more modern one? It 
would be instructive to see that example with C++17 (or even 20) 
and D next to each other.


Re: Dicebot on leaving D: It is anarchy driven development in all its glory.

2018-08-25 Thread bpr via Digitalmars-d

On Saturday, 25 August 2018 at 22:55:05 UTC, RhyS wrote:

Be honest, how many people will use BetterC in production!


That's much, MUCH more likely than me ever using full D, with 
the GC, in production.


Really, if I want a language with a GC, D is not that good. Why 
wouldn't I use a JVM language (Java, Kotlin, Scala) or Go or 
something else? Notice how they all have precise GCs? Or maybe 
I'd use a functional language like OCaml or Haskell. Same deal, 
precise GC.


The truth is that D is by design NOT a replacement for C++ when a 
low level systems programming language must be used. DasBetterC 
is close to what I'd like when I have to use C++, but not yet 
ideal. I'm starting to think that Rust (or C++17 and beyond) 
will win this battle, because every other language shows up with 
stuff I don't want.





Re: Reimplementing software building blocks like malloc and free in D

2018-08-13 Thread bpr via Digitalmars-d

On Monday, 13 August 2018 at 01:49:35 UTC, Mike Franklin wrote:

On Sunday, 12 August 2018 at 06:35:17 UTC, Eugene Wissner wrote:

P.S. In the last weeks I had thoughts to split low-level stuff 
from tanya and put it into a separate library, kind of native, 
gc-free x86-64 (and maybe aarch64) Linux runtime for D. 
Probably I should go for it :)


In recent months, I've been thinking about something like that 
as well.


I think it's a very promising idea. Why not start with DasBetterC 
and build a Phobos like library from there?





Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-30 Thread bpr via Digitalmars-d

On Saturday, 28 July 2018 at 21:44:10 UTC, Abdulhaq wrote:

On Saturday, 28 July 2018 at 21:27:12 UTC, bpr wrote:


I hear you. You're looking (roughly) for a better 
Java/Go/Scala, and I'm looking for a better C/C++/Rust, at 
least for what I work on now. I don't think D can be both 
right now, and that the language which can satisfy both of us 
doesn't exist yet, though D is close.


Yes, this. In the light of D's experience, is it even possible 
to have a language that satisfies both?


I believe that the tension between low and high level features 
makes it nearly impossible, that tracing GC is one of those 
difficult problems that rules out satisfying both sets of users 
optimally, and that the best D (and C++ and Nim) can do is to be 
"mediocre to good, but not great" at both the low level (C/Rust) 
and the high level domains simultaneously. There are far fewer 
players in the low level space, which is why I see D more as a 
competitor there, and welcome DasBetterC and the noGC initiatives 
so that D can be a great low level and maybe just a good high 
level language.





Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread bpr via Digitalmars-d

On Saturday, 28 July 2018 at 20:34:37 UTC, Abdulhaq wrote:

On Saturday, 28 July 2018 at 19:55:56 UTC, bpr wrote:

On Saturday, 28 July 2018 at 15:36:43 UTC, Abdulhaq wrote:
I think that I no longer fall into the category of developer 
that D is after. D is targeting pedal-to-the-metal 
requirements, and I don't need that. TBH I think 99% of 
developers don't need it.


I'm 99% sure you just made that number up ;-)



Sure, I plucked it out of thin air. But I do think of the 
software development world as an inverted pyramid in terms of 
performance demands and headcount. At the bottom of my inverted 
pyramid I have Linux and Windows. This code needs to be as 
performant as possible and bug free as possible. C/C++/D shine 
at this stuff. However, I number those particular developers in 
the thousands.


The developers at Mozilla working on the browser internals, for 
example, are unaccounted for in your analysis. As are the 
developers where I work.


I think a great bulk of developers, though, sit at the 
application development layer. They are pumping out great 
swathes of Java etc. Users of Spring and dozens of other 
frameworks. C++ is usually the wrong choice for this type of 
work, but can be adopted in a mistaken bid for performance.


I don't know that the great bulk of developers work in Java.


And how many are churning out all that JavaScript and PHP code?

Hence I think that the number of developers who really need top 
performance is much smaller than the number who don't.


I'd be willing to accept that, but I have no idea what the actual 
numbers are.


If I had to write CFD code, and I'd love to have a crack, then 
I'd really be wanting to use D for its expressiveness and 
performance. But because of the domain that I do work in, I 
feel that I am no longer in D's target demographic.


If I had to write CFD code, and I wanted to scratch an itch to 
use a new language, I'd probably pick Julia, because that 
community is made up of scientific computing experts. D might be 
high on my list, but not likely the first choice. C++ would be 
in there too :-(.




I remember the subject of write barriers coming up in order (I 
think?) to improve the GC. Around that time Walter said he 
would not change D in any way that would reduce performance by 
even 1%.


Here we kind of agree. If D is going to support a GC, I want a 
state of the art precise GC like Go has. That may rule out some D 
features, or incur some cost that
high performance programmers don't like, or even suggest two 
kinds of pointer (a la Modula-3/Nim), which Walter also dislikes.


Hence I feel that D is ruling itself out of the application 
developer market.


At this stage in its life, I don't think D should try to be all 
things to all programmers, but rather focus on doing a few things 
way better than the competition.


That's totally cool with me, but it took me a long time to 
realise that it was the case, and that therefore it was less 
promising to me than it had seemed before.


I hear you. You're looking (roughly) for a better Java/Go/Scala, 
and I'm looking for a better C/C++/Rust, at least for what I work 
on now. I don't think D can be both right now, and that the 
language which can satisfy both of us doesn't exist yet, though D 
is close.







Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread bpr via Digitalmars-d

On Saturday, 28 July 2018 at 15:36:43 UTC, Abdulhaq wrote:
I think that I no longer fall into the category of developer 
that D is after. D is targeting pedal-to-the-metal 
requirements, and I don't need that. TBH I think 99% of 
developers don't need it.


I'm 99% sure you just made that number up ;-)

For those developers who don't need the performance usually 
achieved with C or C++, and can tolerate GC overheads, there are, 
IMO, better languages than D. I'm not saying that here to be 
inflammatory, just that I believe performance is a very big part 
of the attractiveness of D.


If you're mostly working on Android, then Kotlin seems like your 
best option for a non-Java language. It seems OK, there's a 
Kotlin native in the works, the tooling is fine, there's a REPL, 
etc. I like it better than I like Go.


We like to think we do and we love to marvel at the speed of 
improved code, but like prediction, it's overrated ;-)


For you, perhaps. I currently work mostly at a pretty low level 
and I'm pretty sure it's not just self delusion that causes us to 
use C++ at that low level. Perhaps you've noticed the rise of 
Rust lately? Are the Mozilla engineers behind it deluded in that 
they eschew GC and exceptions? I doubt it. I mostly prefer higher 
level languages with GCs, but nothing in life is free, and GC has 
significant costs.





Re: C's Biggest Mistake on Hacker News

2018-07-25 Thread bpr via Digitalmars-d

On Wednesday, 25 July 2018 at 17:23:40 UTC, Ecstatic Coder wrote:

But don't be too optimistic about BetterC...


I'm too old to get optimistic about these things. In the very 
best case, D has quite an uphill battle for market share. Any non 
mainstream language does. If I were a betting man, I'd bet on 
Rust.


Honestly, considering D leadership's current priorities, I 
don't see how it could soon become a true C++ or Go competitor, 
even with the half-baked BetterC initiative...


There are a few ways I can see, and doubtless others can see 
different ones. Here's one: use Mir and BetterC to write a 
TensorFlow competitor for use in developing and deploying ML 
models. I'm sure you can shoot holes in that idea, but you get 
the point. Try lots of things and see what works, and keep doing 
more of those things. Worked for Python.


For instance, I've suggested they consider using reference 
counting as an alternative default memory management scheme, 
and add it to the lists of scholarship and crowdsourced projects, 
and of course they have added all the other suggestions, but not 
this one. What a surprise ;)


I'm pretty sure D leadership is pursuing such things. In fact,

https://wiki.dlang.org/Vision/2018H1

rather prominently mentions it.

Despite this being probably one of the most used allocation 
management schemes in typical C++ development, as it 
drastically reduces the risks of memory leaks and dangling 
pointers...


Anyway, meanwhile D remains a fantastic strongly-typed 
scripting language for file processing and data analysis, and 
its recent adoption at Netflix has once again clearly proved 
it...


For this and similar uses, tracing GC is fine, better in fact 
than the alternatives. I'm only making noise about betterC for 
the cases where C++ dominates and tracing GC is a showstopper.


In an alternative timeline, DasBetterC would have been released 
before D with GC, and the main libraries would have been nogc, 
and maybe there'd be a split between raw pointers and traced refs 
(like Nim and Modula-3) and then maybe there'd have been no 
strong desire for Rust since D could have filled that niche.





Re: C's Biggest Mistake on Hacker News

2018-07-25 Thread bpr via Digitalmars-d

On Tuesday, 24 July 2018 at 17:24:41 UTC, Seb wrote:

On Tuesday, 24 July 2018 at 17:14:53 UTC, Chris M. wrote:

On Tuesday, 24 July 2018 at 16:15:52 UTC, bpr wrote:
On Tuesday, 24 July 2018 at 14:07:43 UTC, Ecstatic Coder 
wrote:

[...]


No. For many C++ users, tracing GC is absolutely not an 
option. And, if it were, D's GC is not a shining example of a 
good GC. It's not even precise, and I would bet that it never 
will be. If I'm able to tolerate a GC, there are languages 
with much better GCs than the D one, like Go and Java.


[...]


There was a precise GC in the works at one point, no clue what 
happened to it.


The newest PR is:

https://github.com/dlang/druntime/pull/1977

Though there's already a bit of precise scanning on Windows, 
e.g. https://github.com/dlang/druntime/pull/1798 and IIRC 
Visual D uses a precise GC too.


Well, this is a big problem with D IMO. There are a lot of 
unfinished, half baked features which linger in development for 
years. How long for precise GC now, over 5 years? I don't think D 
was really designed to be friendly to GC, and it just isn't 
realistic to expect that there will *ever* be a production 
quality precise GC for all of D. Maybe giving up on some things 
and finishing/fixing others would be a better strategy? I think 
so, which is why I think DasBetterC is the most appealing thing 
I've seen in D lately.




Re: C's Biggest Mistake on Hacker News

2018-07-24 Thread bpr via Digitalmars-d

On Tuesday, 24 July 2018 at 14:07:43 UTC, Ecstatic Coder wrote:

On Tuesday, 24 July 2018 at 13:23:32 UTC, 12345swordy wrote:

On Tuesday, 24 July 2018 at 09:54:37 UTC, Ecstatic Coder wrote:
So, at the moment, I don't see how you can EASILY convince 
people to use BetterC for C/C++ use cases, like programming 
games, microcontrollers, etc.


* Extremely powerful metaprogramming that blows C++ 
metaprogramming out of the water
* Clean, readable syntax
* No header file nonsense
* Standard keyword for ASM if you really need the performance 
boost
* Compiler-enforced memory safety

-Alex


I know.

And D's builtin strings/arrays/slices/maps/etc and automatic 
memory deallocation are part of what makes D a better 
alternative to C++ too.


No. For many C++ users, tracing GC is absolutely not an option. 
And, if it were, D's GC is not a shining example of a good GC. 
It's not even precise, and I would bet that it never will be. If 
I'm able to tolerate a GC, there are languages with much better 
GCs than the D one, like Go and Java.


I work in a mostly C++ shop where exceptions are intolerable in 
C++ code, and in many places we use CRTP to eliminate dispatch 
overhead. DasBetterC would be usable here but it's too late given 
the existing investment in C++. Obviously there's no CRTP in 
DasBetterC without struct inheritance, but there are other 
designs to address this issue.
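
One such design, as a minimal sketch of mine rather than anything 
from the original thread: static dispatch through a mixin template 
instead of inheritance, which works for structs under -betterC. 
The names (Shape, Square, area) are hypothetical.

// CRTP-style static dispatch with structs; no classes, no vtable.
import core.stdc.stdio : printf;

mixin template Shape(Derived)
{
    // Lookup goes through the "derived" struct at compile time.
    double doubledArea()
    {
        auto self = cast(Derived*) &this;
        return 2.0 * self.area();
    }
}

struct Square
{
    double side;
    mixin Shape!Square;
    double area() { return side * side; }
}

extern(C) int main()
{
    auto s = Square(3.0);
    printf("%f\n", s.doubledArea()); // 18.000000
    return 0;
}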


Besides having more betterC libraries, I'd like to see some kind 
of restricted approach to exception handling, like the ones being 
investigated in 
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p0709r1.pdf.
If you want a better C++, look at what people who have to use C++ 
use it for, and where the pain points are.







Re: gRPC, Better C and D

2018-07-02 Thread bpr via Digitalmars-d

On Monday, 2 July 2018 at 10:25:07 UTC, Andre Pany wrote:
Small update. With the help of evilrat I was able to translate 
the C headers to D.

https://github.com/andre2007/grpc/tree/feature/d/src/d/source/grpc/c

While the source code compiles without errors, everything is 
completely untested.

As next step I will try to rewrite the python grpc code with D:
https://github.com/andre2007/grpc/tree/feature/d/src/python/grpcio/grpc/_cython/_cygrpc


That's really interesting! Are you thinking about translating 
some part of gRPC to DasBetterC?





Re: Remember the Vasa! by Bjarne Stroustrup

2018-05-29 Thread bpr via Digitalmars-d

On Tuesday, 29 May 2018 at 11:31:53 UTC, Guillaume Piolat wrote:

On Tuesday, 29 May 2018 at 05:11:27 UTC, Dmitry Olshansky wrote:
D is probably at the edge of what I can tolerate 
complexity-wise. And we’ll get to simplify a few things soon I 
believe.


What are the things that you think will be simplified? I thought 
that D had some of the same issues about breaking backward 
compatibility that C++ had.


Within D, there is a bit smaller and cleaner language 
struggling to get out!


Ha, one of my favorite Stroustrup quotes about C++!

One of the reasons I like the betterC switch is that it does 
simplify the language, perhaps too much, but preserves some of 
the best parts of D, like metaprogramming and modules.




Re: My choice to pick Go over D ( and Rust ), mostly non-technical

2018-02-05 Thread bpr via Digitalmars-d

On Monday, 5 February 2018 at 01:38:13 UTC, psychoticRabbit wrote:

On Sunday, 4 February 2018 at 20:15:47 UTC, bpr wrote:


Which benefits of C are lost?



The ability to program on 16-bit platforms (yeah.. they still 
exist ;-)


Thanks, that's a good answer!

I did put a bit of effort in trying out betterC (I don't write C 
these days and it was fun) but I admit I don't program in the 16 
bit realm. I'm more interested in HPC than tiny embedded systems, 
so my concerns are probably different.


Re: My choice to pick Go over D ( and Rust ), mostly non-technical

2018-02-04 Thread bpr via Digitalmars-d

On Sunday, 4 February 2018 at 11:14:43 UTC, JN wrote:

On Friday, 2 February 2018 at 15:06:35 UTC, Benny wrote:
You want to produce PDFs? fpdf 2015-Apr-06, a very limited PDF 
generation tool last updated 3 years ago.




While not as trivial as just using a dub package, D's easy 
interop with C means you can use C libraries for PDF, like 
libharu or w/e.



* Are you targeting C developers?

Sure BetterC is a way towards that but again, what do you 
offer more than Rust?


Overloading, templates, compile time features are arguably "more 
than Rust".



I see C developers going more for Rust than D on this point.


Maybe, Rust is a decent language, and it appears to be getting 
better faster than D is. I recall the announcement of an 
experimental precise GC for D in 2013 or so, and Andrei at the 
time made it clear that a precise GC would be worth it even at 
some cost in performance. I don't think D will ever get a precise 
GC. Maybe the Rust and "Modern C++" guys are right and it's not 
worth it in a systems programming language?


Personally I agree that BetterC isn't a good alternative for C 
programmers. Sure, you get some benefits of D, but you will 
lose many benefits of C


Which benefits of C are lost?

and you'll have to constantly fight "wait, can I use this in 
BetterC or not" kind of thing.


Fair point, but that's a quality of implementation thing. I can 
imagine that in 6 months betterC is better supported on all 
platforms, and better documented.




core.sys.posix.setjmp unavailable for OS X?

2018-01-15 Thread bpr via Digitalmars-d-learn
Is there a reason that it's unavailable on OS X when it works 
fine on Linux? The functions exist on OS X, and it's easy enough 
to compile C programs using setjmp there; but not D programs. I 
don't think I'm getting a betterC experience on the Mac.


I'd also ask why there are no D docs for core.sys.posix, but I 
read the responses to the last time the question was asked and 
now I'm D-pressed. :-(


Re: D as a betterC a game changer ?

2017-12-24 Thread bpr via Digitalmars-d

On Sunday, 24 December 2017 at 10:57:28 UTC, Mike Parker wrote:
The motivation behind the -betterC flag is *not* to write new 
programs in the general case.


Whether you agree or not, that is *exactly* how some programmers 
would like to use it. I think that's a good thing, and I hope 
lots of other people do too, and start writing lots of betterC 
code. Let a thousand flowers bloom.


From that perspective, comparing "D as a better C" to C++ is 
not appropriate. Comparing regular D (without -betterC) to C++ 
makes more sense.


If you think that GCs suck, or at least, that the D GC sucks, 
then D doesn't fare well in the comparison with C++. Rust might, 
by that criteria.


I'd have preferred not to have worded it that way, since I don't 
think GCs suck in general, but it seems there's a big failure to 
communicate between different camps of programmers, which affects 
D more than other languages since it is intended to be a very 
general purpose language, much like C++. GC is not an unqualified 
positive.


That said, some things that currently are not supported in 
-betterC mode may be in the future. For example, the upcoming 
2.078.0 release will add RAII support (including 
scope(exit))[1] to -betterC mode. Future versions might support 
exception handling of some kind.


That's great!
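
For what it's worth, a minimal sketch (mine, not from the 
announcement) of what that enables under -betterC, assuming a 
2.078+ compiler as quoted above:

// C-style resources get deterministic cleanup without druntime.
import core.stdc.stdio : FILE, fopen, fclose, fgetc, EOF;

extern(C) int main()
{
    FILE* f = fopen("data.txt", "r");
    if (f is null) return 1;
    scope(exit) fclose(f);      // runs on every path out of main

    int c, count = 0;
    while ((c = fgetc(f)) != EOF)
        ++count;
    return count == 0;          // nonzero exit if the file was empty
}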

But for new code, D is already a better C without the -betterC 
flag.


In general, GCs and runtimes don't compose well, so if you write 
a D library that you want to be called from a language with its 
own GC and runtime, that's yet another hoop to jump through. Go 
might be a better C for some people, but extending Python with Go 
libraries hasn't caught on. New D libraries written as betterC 
handle this case.
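
To make that last point concrete, a hypothetical sketch (the names 
and build flags are my assumptions, not from the post): a -betterC 
function with a C ABI has no druntime or GC to initialize, so the 
host language's own runtime is undisturbed.

// dot.d -- e.g. built with: dmd -betterC -shared -fPIC dot.d
extern(C) double dot(const(double)* a, const(double)* b, size_t n)
{
    double s = 0.0;
    foreach (i; 0 .. n)
        s += a[i] * b[i];
    return s;
}

From Python, the resulting shared library can then be loaded with 
ctypes and called like any plain C function.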




Re: Maybe D is right about GC after all !

2017-12-23 Thread bpr via Digitalmars-d
On Saturday, 23 December 2017 at 09:10:25 UTC, Walter Bright 
wrote:

On 12/22/2017 7:23 AM, Russel Winder wrote:
I think we are now in a world where Rust is the zero cost 
abstraction
language to replace C and C++, except for those who are 
determined to

stay with C++ and evolve it.


Maybe it is. But that is not because D isn't up to the task. 
I've converted a large program from C to D (Digital Mars C++'s 
front end) with -betterC and it really is a zero cost 
abstraction. The memory safety benefits are there (DIP 1000), 
RAII is there, nested functions, array bounds checking, 
template metaprogramming, CTFE, etc.


Is it planned to add more (I'm thinking of exceptions, which I 
guess means classes too) of full D into betterC? As I wrote 
earlier, it should be possible to achieve some rough kind of 
feature parity with modern C++.


I really do like Rust; I think it's a brilliant language. There 
are domains that require zero cost abstractions that are 
currently NOT covered well by Rust though; D's Mir library is 
pushing into one of those.


D as betterC really is a game changer, for anyone who cares to 
give it a try.


Yes, it really is.

I really wish that D had a performant fully precise GC, but I'm 
beginning to think that is unlikely to ever happen. Maybe being a 
betterModernC++ in that regard would be good enough?





Re: Maybe D is right about GC after all !

2017-12-22 Thread bpr via Digitalmars-d

On Friday, 22 December 2017 at 13:38:25 UTC, Dan Partelly wrote:
It works as a "betterC" it seems, but you lose a lot of 
functionality which should be in a "better C", and again, a lot 
from the standard libraries is lost. Template C++ 2017 works 
well for a better C as well, and I retain 0 cost abstraction, 
decent metaprogramming (yet inferior to D's), closures,

exceptions, scopes...


It seems that there's an effort from the top to bring more higher 
level features into -betterC. I agree with you that more should 
be there, that it should really be betterC++ and strive for 
feature parity with modern C++.




Re: GSoC 2018 - Your project ideas

2017-12-13 Thread bpr via Digitalmars-d-announce

On Tuesday, 5 December 2017 at 18:20:40 UTC, Seb wrote:
I am looking forward to hearing (1) what you think can be done 
in three months by a student and (2) will have a huge impact on 
the D ecosystem.


Of the projects in [2], I like the general purpose betterC 
libraries most, and I think it's something where students could 
make a real impact in that time period.



[1] https://developers.google.com/open-source/gsoc/timeline
[2] https://wiki.dlang.org/GSOC_2018_Ideas





Re: Thoughts about D

2017-11-29 Thread bpr via Digitalmars-d

On Wednesday, 29 November 2017 at 16:57:36 UTC, H. S. Teoh wrote:
On Tue, Nov 28, 2017 at 06:18:20PM -0800, Walter Bright via 
Digitalmars-d wrote: [...]
BetterC is a door-opener for an awful lot of areas D has been 
excluded from, and requiring druntime is a barrier for that.


Doesn't this mean that we should rather focus our efforts on 
improving druntime instead of throwing out the baby with the 
bathwater with BetterC?


Isn't it possible to do both? For example, make D's GC a precise 
one (thus improving the runtime) and making the experience of 
using D sans GC and runtime a simple one?


In answer to your question, if D is excluded from a lot of areas 
on account of requiring druntime, then it may be that no version 
of what you expect from druntime (I'll use GC as an obvious 
example) will remove that barrier.




Re: My two cents

2017-10-23 Thread bpr via Digitalmars-d

On Monday, 23 October 2017 at 11:21:13 UTC, Martin Nowak wrote:

On Saturday, 21 October 2017 at 18:52:15 UTC, bitwise wrote:

On Wednesday, 18 October 2017 at 08:56:21 UTC, Satoshi wrote:

async/await (vibe.d is nice but useless in comparison to C# 
or js async/await idiom)




Reference counting when we cannot use GC...



If I understand correctly, both of these depend on 
implementation of 'scope' which is being worked on right now.


Scope is about preventing pointer escaping, ref-counting also 
needs to make use-after-free safe which is currently in the 
early spec phase.


FYI, in the cousin language Nim (an imperative, GC'ed systems 
PL), there's currently some similar discussion around how to 
make seq and string GC-free:


https://nim-lang.org/araq/destructors.html

so these efforts in D are timely.



Re: [OT] LLVM 5.0 released - LDC mentioned in release notes

2017-09-07 Thread bpr via Digitalmars-d-announce

On Thursday, 7 September 2017 at 20:55:22 UTC, Nordlöw wrote:
Are there any new code-generation features in LLVM 5.0 that LDC 
will make use of?


Given that LLVM has direct support for coroutines since 4.0 
(https://llvm.org/docs/Coroutines.html) I've wondered if D (even 
just LDC D for starters) could use that to implement async/await 
or a similar feature.





Re: D easily overlooked?

2017-07-14 Thread bpr via Digitalmars-d

On Friday, 14 July 2017 at 09:02:58 UTC, Stefan Koch wrote:

The beauty of D lies in its holistic approach.

The one unique feature to point out would be CTFE, which is not 
to be found in other compiled languages.


CTFE is found in Nim, as well as inline assembler. Relatively 
easy to use AST macros are also found in Nim.


I don't know another language with full D-like scope guards.

I agree that it's the combination of features that make D 
appealing. I generally think of D as "C++ done righter" and don't 
think about specific unique features.
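
For readers who haven't run into the two features named above, a 
minimal sketch of each (mine, not from the thread):

import core.stdc.stdio : printf;

// CTFE: an ordinary function, evaluated by the compiler whenever
// a compile-time constant is required.
int fact(int n) { return n <= 1 ? 1 : n * fact(n - 1); }
static assert(fact(10) == 3_628_800);    // checked at compile time

// Scope guards: cleanup is written next to the acquisition it
// pairs with, and runs when the scope is left.
void demo()
{
    printf("acquire\n");
    scope(exit) printf("release\n");      // runs however demo() exits
    scope(failure) printf("rollback\n");  // only if an exception escapes
    printf("work\n");
}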





Re: Types: The Next Generation (Was: Why is phobos so wack?)

2017-07-10 Thread bpr via Digitalmars-d

On Monday, 10 July 2017 at 16:16:40 UTC, bpr wrote:

On Monday, 10 July 2017 at 01:21:08 UTC, Nick Sabalausky wrote:
Ah, I guess it is very similar after all, except it'd be based 
on top of and coexist with all of D's design by introspection 
stuff (rather than exist without it as with C++), thus 
avoiding a lot of the downsides and getting best of both 
worlds.



You've seen this, right?

https://wiki.dlang.org/User:9rnsr/DIP:_Template_Parameter_Constraint

A small step in one such direction, influenced by C++ concepts. 
That proto-DIP also raises a question I always had about why D 
doesn't allow chained template instantiation, but that's 
another DIP for another time.


Sorry about the repeat posting; I could see the forum software 
hiccuping in my browser...


Re: Types: The Next Generation (Was: Why is phobos so wack?)

2017-07-10 Thread bpr via Digitalmars-d

On Monday, 10 July 2017 at 01:21:08 UTC, Nick Sabalausky wrote:
Ah, I guess it is very similar after all, except it'd be based 
on top of and coexist with all of D's design by introspection 
stuff (rather than exist without it as with C++), thus avoiding 
a lot of the downsides and getting best of both worlds.



You've seen this, right?

https://wiki.dlang.org/User:9rnsr/DIP:_Template_Parameter_Constraint

A small step in one such direction, influenced by C++ concepts. 
That proto-DIP also raises a question I always had about why D 
doesn't allow chained template instantiation, but that's another 
DIP for another time.




Re: Types: The Next Generation (Was: Why is phobos so wack?)

2017-07-10 Thread bpr via Digitalmars-d
On Sunday, 9 July 2017 at 20:22:16 UTC, Nick Sabalausky 
(Abscissa) wrote:
Obviously this is all very incomplete, but it's an idea I think 
is rather interesting.


You've seen this, right?

https://wiki.dlang.org/User:9rnsr/DIP:_Template_Parameter_Constraint

A small step in one such direction, influenced by C++ concepts. 
That proto-DIP also raises a question I always had about why D 
doesn't allow chained template instantiation, but that's another 
DIP for another time.




Re: Go 1.9

2017-06-19 Thread bpr via Digitalmars-d

On Monday, 19 June 2017 at 13:24:00 UTC, Russel Winder wrote:
Go gets parallel compilation, at last, and better garbage 
collection. The former is not a problem for D, but the latter…





It should also be noted that, even though it's still a research 
project, Scala Native just recently upgraded its Boehm GC to an 
Immix-based one. Scala Native would be yet another language 
competing with D, and might compete in even more domains than Go 
would.


Re: Isn't it about time for D3?

2017-06-10 Thread bpr via Digitalmars-d

On Saturday, 10 June 2017 at 23:30:18 UTC, Liam McGillivray wrote:

I'd be fascinated by a revised D-like language, say D3 or 
whatever.



Here are some ways that D3 can be an improvement of D2:
-Final by default


Wow, after all that, this is it? I think final by default would 
be an improvement, and I wish it had gone through, but it's not a 
big enough deal to make a new language.


-A standard library that is effective with or without garbage 
collection


That's being worked on with D right now, isn't it?

If you're not going to be very bold, what's the point of a D3? 
Let's really change stuff!



Structs, enums, and pattern matching, like Rust and ML
Type follows name like Ada, Scala, Rust, ...
Macros like Nim
Parallel features from Chapel


I think it's a huge uphill battle for a new language these days, 
and that there's more to be gained from fixing the current D and 
others, but I encourage you to design the next D.





Re: Trip notes from Israel

2017-05-23 Thread bpr via Digitalmars-d-announce

On Monday, 22 May 2017 at 15:05:24 UTC, Andrei Alexandrescu wrote:

http://dlang.org/blog/2017/05/22/introspection-introspection-everywhere/ -- 
Andrei


That was a great read, thanks!

At the end, you mention a successful serial entrepreneur who 
counsels pursuing the great rather than the good ideas being 
advanced in the D community. Did he happen to mention which ideas 
were great, and which just good?


Re: Interpolated strings

2017-04-18 Thread bpr via Digitalmars-d

On Wednesday, 19 April 2017 at 00:30:31 UTC, Walter Bright wrote:
I'm not saying you cannot do cool and useful things with AST 
macros. My position is it encourages absolutely awful code as 
(usually inexperienced) programmers compete to show how clever 
their macros are.


I'd think that that's a problem with community coding standards.

The language gets balkanized into a collection of dialects that 
are unrecognizable across user groups.


I'm pretty sure that hasn't happened with every language that 
supports macros. Even in the case of Scheme, I don't think it's 
the macros that are responsible for all of the dialects. It's the 
fact that the core language never includes enough (no records, 
exceptions, modules, ...) so every group adds their own versions 
of these features. Maybe if macros didn't make that easier then 
Schemers would have added those things to the core, but that's a 
counterfactual that I don't find convincing.


As a compiler dev who gets stuck figuring out users' bug 
reports, dealing with templates is bad enough (the first thing 
I do with a bug report is redo it to remove all the templates). 
I do not want to deal with some custom syntax. If I may pull 
the "I'm older" card, programmers will find as they gain 
experience that the AST macros are just not worth it.


Some programmers will not find that. Others will find that other 
features you value are just not worth it. There are absolutely no 
categorical statements. :-)


This disastrous state of affairs has occurred with every 
language that supports macros.


I don't think I've ever heard from Common Lisp, Scheme or Clojure 
programmers that they'd like to remove macros from their 
respective languages for the reasons you mention. I don't see the 
disasters there. The Julia folks looked at the Lisp experience 
and decided to include macros.


Both Rust and Nim support macros. Scala too. Not long enough for 
the disaster yet?


It's certainly not all roses, and writing and debugging macros 
can be a PITA. I try to avoid them, and consider them a tool of 
last resort. But they're very powerful, and sometimes I'm not 
smart enough to figure out how to do what I want cleanly with 
less powerful features.


If you want a nauseous example, check out the Boost C 
preprocessor metaprogramming library. Or C++ expression 
templates - so cool, and yet so utterly wretched.


Have you checked out examples of macros that are not so 
nauseating? I find Nim decent, and of course the Lisps have a 
long history of macrology. I think you're drawing a view of 
macros from cpp, and MASM, and such, and not so much from the 
Lisp family, or Nim. cpp macrology is very different!


D is interesting to me mostly because of its powerful templates 
and CTFE. It seems a shame (to me, obviously) that such a 
powerful static metaprogramming feature as macros will not be a 
part of D, but it's your language!





Re: Interpolated strings

2017-04-18 Thread bpr via Digitalmars-d

On Tuesday, 18 April 2017 at 08:01:14 UTC, Jacob Carlborg wrote:

On 2017-04-18 08:59, Stefan Koch wrote:


The corresponding ast-macros would be extremely complex


No, it's not that complex.


Here's how it's done in Nim, a statically typed language similar 
to D, but with Python syntax, and macros. It takes some knowledge 
to understand, sure, and macros are not a beginner tool, but I 
wouldn't say this is extremely complex. I bet a D-with-macros 
would have a solution of similar complexity.


-- string_interpolation.nim --


import macros, parseutils, sequtils

macro exho(text: string{lit}): untyped =
  var nodes: seq[NimNode] = @[]
  # Parse string literal into "stuff".
  for k, v in text.strVal.interpolatedFragments:
if k == ikStr or k == ikDollar:
  nodes.add(newLit(v))
else:
  nodes.add(parseExpr("$(" & v & ")"))
  # Fold individual nodes into a statement list.
  result = newNimNode(nnkStmtList).add(
foldr(nodes, a.infix("&", b)))

const
  multiplier = 3.14
  message = exho"$multiplier times 2.5 is ${multiplier * 2.5}"
  foo = "foo"
  message2 = exho"$foo 3 times is ${foo & foo & foo}"

echo message
echo message2


Running gives

3.14 times 2.5 is 7.851
foo 3 times is foofoofoo





Re: [Tidbit] making your D code more modular & unittestable

2017-03-10 Thread bpr via Digitalmars-d

On Friday, 10 March 2017 at 14:58:09 UTC, Nick Treleaven wrote:

On Thursday, 9 March 2017 at 20:54:23 UTC, Nick Sabalausky
Wishlist for D3: Some brilliant form of sugar for declaring a 
function that takes a range.


auto parseFile()(auto input if isRandomAccessRangeOf!ubyte && 
hasSlicing) {


My spin on an inline parameter constraint idea by Kenji (his 
doesn't use auto and also has more concept-like sugar):


https://wiki.dlang.org/User:9rnsr/DIP:_Template_Parameter_Constraint

As mentioned in the link, inline constraints can help make more 
specific error messages when constraints fail.


That looks like a useful DIP. What has to happen to move it to 
the DIP repository 
https://github.com/dlang/DIPs/blob/master/GUIDELINES.md ?
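
For comparison, roughly the same intent spelled with today's 
template constraints (a sketch of mine; isRandomAccessRangeOf in 
the quoted wishlist is hypothetical, while the std.range traits 
below do exist):

import std.range : isRandomAccessRange, hasSlicing, ElementType;

auto parseFile(R)(R input)
    if (isRandomAccessRange!R && hasSlicing!R && is(ElementType!R : ubyte))
{
    // Real parsing would go here; the constraint is the point.
}

// e.g. with a byte buffer: parseFile(cast(ubyte[]) buffer);

As the link notes, the DIP's inline form would mainly improve the 
error messages and keep the constraint next to the parameter it 
describes.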







Re: Interesting paper on composing type definitions

2017-03-06 Thread bpr via Digitalmars-d

On Monday, 6 March 2017 at 21:22:46 UTC, Timon Gehr wrote:

On 06.03.2017 21:49, Enamex wrote:
On Monday, 6 March 2017 at 01:37:18 UTC, Andrei Alexandrescu 
wrote:

On 3/4/17 10:36 PM, Andrei Alexandrescu wrote:

https://pdfs.semanticscholar.org/5de7/591a853ec947f8de7dc70df0b2ecc38b8774.pdf
I haven't read the paper yet but doesn't that sound exactly 
opposite to

what 'sum types' is usually used to mean?

The value of the variable has to be either A or B. If it 
stores the

status of both then it's basically a struct, right?

Probably I'm misunderstanding your point on composition and 
'joint API'.


I don't think you are. The paper is using non-standard 
terminology.


You're right. 'Sum' in that paper joins the interfaces, so it's 
really 'product' in the fairly standard type theory terminology. 
I wish they hadn't done that, it makes communication harder than 
it has to be.


Here's a recent post from our cousins in the Rust belt

http://manishearth.github.io/blog/2017/03/04/what-are-sum-product-and-pi-types/

which may make things clearer to people unfamiliar with that 
terminology.





Re: [OT] Re: Why don't you advertise more your language on Quora etc ?

2017-03-06 Thread bpr via Digitalmars-d

On Monday, 6 March 2017 at 16:42:50 UTC, bachmeier wrote:
Writing up a detailed example with code showing how to avoid 
the GC in the most common situations, posting it on Reddit, and 
then making it easy to find on dlang.org would be a good start. 
Given the importance of these issues, it should be one of the 
first things you see on the homepage.


That's a great idea. In fact, I'd like to see multiple examples, 
with many different approaches to manual memory management, going 
over the common problems (e.g. "How do I do a writefln in a @nogc 
block?") and how to solve them in idiomatic D. Something like 
Rust's guide to unsafe programming. This set of examples could be 
extended as the upcoming DIPs dealing with resource management 
make it into D compilers.
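
To pick on the writefln example above, one common answer (a sketch 
of one possible approach, not the only one) is to fall back to C 
stdio, which is @nogc:

import core.stdc.stdio : printf;

void report(double x) @nogc nothrow
{
    // std.stdio.writefln may allocate; C stdio never touches the GC.
    printf("x = %g\n", x);
}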





Re: Independent Study at my university using D

2017-03-06 Thread bpr via Digitalmars-d-announce

On Monday, 6 March 2017 at 02:25:41 UTC, Jeremy DeHaan wrote:
Me working on it has effectively stalled because school takes 
up much of my time and I'm still pretty lacking in experience 
with garbage collection. That's pretty much why I'm doing the 
study.


Best of luck to you with the study then! You've chosen an 
interesting and difficult topic.






Re: Independent Study at my university using D

2017-03-04 Thread bpr via Digitalmars-d-announce

On Friday, 3 March 2017 at 19:00:00 UTC, Jeremy DeHaan wrote:
This is exciting for me because I really enjoyed the work I did 
during the last GSoC, so I'm hoping to learn more about garbage 
collection and contribute to D's garbage collector more in the 
future.


What's the status of that work with respect to the D main line? 
Last I checked there's this 
https://github.com/dlang/druntime/pull/1603 which is just hanging.


It would be great if D finally had a precise GC, though from 
https://wiki.dlang.org/Vision/2017H1 it would seem that @nogc has 
higher priority.




Re: Updates to the tsv-utils toolkit

2017-02-22 Thread bpr via Digitalmars-d-announce
On Wednesday, 22 February 2017 at 18:12:50 UTC, Jon Degenhardt 
wrote:

...snip...

Repository: https://github.com/eBay/tsv-utils-dlang
Performance benchmarks: 
https://github.com/eBay/tsv-utils-dlang/blob/master/docs/Performance.md


--Jon


This is very nice code, and a good result for D. I'll study this 
carefully. So much of data analysis is reading/transforming 
files...


I wish you hadn't anonymized the specialty toolkits. I think I 
understand why you chose to do so, but it makes the comparison 
less valuable. Still, great work! Looking forward to a blogpost.




Re: D future ...

2017-02-15 Thread bpr via Digitalmars-d
On Wednesday, 15 February 2017 at 17:53:43 UTC, Ola Fosheim 
Grøstad wrote:
Typo: I meant that one cannot assume that Apple hardware has 
more than 2 cores (so one has to write applications that 
perform well with only 2 cores).


You're missing what I consider to be 'the Big Picture', namely 
that Swift will become popular on non-Apple platforms, and it 
needs to be fairly capable to compete with Go, Java, and C++, and 
others. IBM is already backing server side Swift to some degree.




Re: D future ...

2017-02-15 Thread bpr via Digitalmars-d
On Wednesday, 15 February 2017 at 14:44:55 UTC, Ola Fosheim 
Grøstad wrote:
Another example is Swift. Swift managed to take over 
Objective-C rather quickly IMO, but Swift has also absorbed the 
non-C semantics of Objective-C, thus it did not require 
changing existing practice significantly.


Swift took over quickly because Apple has mandated it. While I'm 
happy about that, there's no denying that Swift wouldn't be where 
it is without the weight of Apple behind it. I'd go as far as to 
say that Swift's success is assured (unless Apple drops it, which 
looks unlikely) and that because Swift has money behind it, more 
money will follow, and so will a thriving ecosystem, on and off 
OS X.


As a PL, Swift looks nice, but they'll have to come up with a 
more complete story around concurrency.






Re: Questionnaire

2017-02-08 Thread bpr via Digitalmars-d-announce

On Wednesday, 8 February 2017 at 21:41:24 UTC, Mike wrote:
Suggesting D would be an exercise in futility, unless I can 
create a notable project in D in my spare time that 
demonstrates its advantages and appeal to the masses.  I tried 
to do this 2 years ago, but D failed me, primarily due to 
https://issues.dlang.org/show_bug.cgi?id=14758


I read this comment from you on another thread too, and (caveat: 
I'm not working in such resource constrained domains as you are) 
it seems sensible. It seems like it may be a good GSOC project to 
modify dmd as you suggest elsewhere. Have you considered trying 
to find someone to do that?


I believe D has the potential to bury all other emerging 
languages out there, but only if it drops its historical 
baggage.
 At the moment, I'm of the opinion that D will remain an 
obscure language until someone forks D and takes it in a 
different direction (unlikely), or the D Foundation decides to 
"reboot" and start working on D3 with a new, updated 
perspective (more unlikely).


I'd love to see a D3, but that seems unlikely, and more unlikely 
if D2 languishes. It seems though that your issues are with the 
implementation, not the language itself, so if you got your 
wishes below



Instead I suggest following through on things like
https://issues.dlang.org/show_bug.cgi?id=12270 and considering 
this proposal

(http://forum.dlang.org/post/psssnzurlzeqeneag...@forum.dlang.org) instead.


wouldn't you be mostly satisfied with D2?



Re: [Semi-OT] I don't want to leave this language!

2016-12-06 Thread bpr via Digitalmars-d-learn
On Tuesday, 6 December 2016 at 22:47:34 UTC, Jonathan M Davis 
wrote:
On Tuesday, December 06, 2016 22:13:54 bpr via 
Digitalmars-d-learn wrote:

On Tuesday, 6 December 2016 at 17:00:35 UTC, Jonathan M Davis

wrote:
Sure, there are folks who would prefer not to have to deal with 
the GC but throw out the runtime and std lib? You lose out on 
too much for it to be at all worth it for many folks. At that 
point, C++11/14/17 looks far more appealing, especially as it 
continues to improve.


It's a counterfactual at this point, but I would guess that if D 
had left out the GC in 2010 when D2 came out it would have been 
ahead of C++ in many ways and perhaps would have been able to 
peel off more C++ programmers and achieve the momentum that Rust 
appears to have now. Yes, it would be missing some features on 
account of omitting the GC, but D2-minus-GC in 2010 would still 
have been much better than C++ in 2011. As C++ absorbs D 
features, the case for D seems weaker.


We get plenty of folks who aren't big C/C++ programmers who are 
interested in D. Yes, the majority seem to have a C++ 
background, but we also get folks from C#, python, ruby, etc.


It would be nice to see a breakdown. From where I sit, it appears 
that most of the interest in D is from C++ users, and it doesn't 
appear that D popularity is rising so much. Any data that belies 
that sad assessment?





Re: [Semi-OT] I don't want to leave this language!

2016-12-06 Thread bpr via Digitalmars-d-learn

On Tuesday, 6 December 2016 at 22:23:25 UTC, bachmeier wrote:

On Tuesday, 6 December 2016 at 22:13:54 UTC, bpr wrote:
Those programmers who are comfortable working in a GC-ed 
language will likely eschew D because D's GC is really not 
that great.


So someone working with Ruby is not going to want to work with 
D because of GC performance?


Ruby programmers are probably not concerned with performance at 
all ever. It's a slow interpreted language with a GIL. But if 
you're on a Rails project, that's what you'll use.


If I really *want* to use a GC, say I'm writing a server and I 
believe that a well tuned GC will allow my server to stay alive 
much longer with less fragmentation, I'll probably skip D and 
pick Go or maybe (hmmm...) even Java because their GCs have had a 
lot of engineering effort.


I wonder what percentage of Ruby programmers have thought about 
garbage collection ever.


Why would a Ruby or Python programmer unconcerned with 
performance want to switch to D? I'm sure there are some who 
would, but I'd imagine they're rare.





Re: [Semi-OT] I don't want to leave this language!

2016-12-06 Thread bpr via Digitalmars-d-learn
On Tuesday, 6 December 2016 at 17:00:35 UTC, Jonathan M Davis 
wrote:
So, while there are certainly folks who would prefer using D as 
a better C without druntime or Phobos, I think that you're 
seriously overestimating how many folks would be interested in 
that. Certainly, all of the C++ programmers that I've worked 
with professionally would have _zero_ interest in D as a better 
C.


I would guess that the vast majority of interest shown in Rust is 
from people who essentially want a better C or C++, with no 
runtime/GC. So, I think Ilya's point is very plausible. D with no 
GC, but with modules, templates, overloading, CTFE, and some 
other features might have been more tempting to the no-GC crowd, 
which includes many hardcore C++ programmers.


Those programmers who are comfortable working in a GC-ed language 
will likely eschew D because D's GC is really not that great.





Re: consequences of removing semicolons in D like in Python

2016-09-19 Thread bpr via Digitalmars-d
On Saturday, 17 September 2016 at 16:43:13 UTC, Nick Sabalausky 
wrote:
If semicolons are such a terrible drain, there's always JS and 
Python.


For someone who likes D, Nim (http://www.nim-lang.org) would be 
a better choice for a language at the same level, with 
Python-like syntax. Changing D's surface syntax is an obvious 
non-starter.




Re: Avoid GC with closures

2016-05-26 Thread bpr via Digitalmars-d

On Thursday, 26 May 2016 at 18:53:35 UTC, Iakh wrote:
Functions with lambdas cannot be @nogc as long as they allocate 
closures.


Counterexample:

//  Note that this is NOT a good way to do numerical quadrature!

double integrate(scope double delegate(double x) @nogc f,
                 double lo, double hi, size_t n) @nogc {
  // Midpoint rule: sample each subinterval at its centre.
  double result = 0.0;
  double dx = (hi - lo) / n;
  for (size_t i = 1; i <= n; i++) {
    result += f(lo + (i - 0.5) * dx) * dx;
  }
  return result;
}

double integrate(scope double delegate(double, double) @nogc f,
                 double x0, double x1,
                 double y0, double y1,
                 size_t nX, size_t nY) @nogc {
  // Iterated 1D integration; the inner lambdas are scope delegates.
  return integrate((y) => integrate((x) => f(x, y), x0, x1, nX),
                   y0, y1, nY);
}

Functions with @nogc downward funarg lambdas (delegates) can be 
@nogc.
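
And a hypothetical @nogc caller, to make the point concrete (the 
capture-free lambda needs no closure allocation at all):

double volumeUnderXY() @nogc
{
    // Integrates f(x, y) = x * y over [0,1] x [0,2] (exact value: 1.0).
    return integrate((x, y) => x * y, 0.0, 1.0, 0.0, 2.0, 100, 100);
}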


I can't parse the rest of your post, maybe I misunderstand you.