Re: Andrei's list of barriers to D adoption

2016-06-14 Thread Peter Lewis via Digitalmars-d
As someone learning D, I thought I would give my insight into how I 
came to D.


My biggest reason for choosing D is the GC. I have come from Java 
and don't quite believe that I'm ready to manage my own memory 
throughout an entire program, but the ability to disconnect from 
the GC is a great way to start. I'm not saying that D should be a 
stopgap language, it has far too much potential for that.


I think that D definitely has many positives but that there is 
still work that needs to go into it. But all languages need work, 
no language is perfect.


I don't have much insight into how the long term development and 
goals have gone, but I see that D is moving in a good direction 
and hope it will be around for many years to come. I also wish 
that D would have a wider adoption.


As for tools: I agree work needs to be done on them, but it is 
not as important as a well-done, competent compiler set and great 
documentation. D has great docs, and a quite competent compiler 
group.


Re: Andrei's list of barriers to D adoption

2016-06-13 Thread Bruno Medeiros via Digitalmars-d

On 06/06/2016 09:15, Russel Winder via Digitalmars-d wrote:

> * Tooling is immature and of poorer quality compared to the
> competition.

This is true.

We have too many half-finished attempts at things, basically because
everything is volunteer activity, not directly associated with paid work.
Nothing wrong with this per se, but it is an obvious explanation for why it
is so. Unless an organization or seven put some work-oriented effort into
the tooling, nothing will change.

I would suggest three ways forward:

1. Get effort on the IntelliJ IDEA and CLion plugin. Kingsley has made
a start. I suggest using his work as a basis and doing a new version
written in Kotlin instead of Java. Kotlin will be easier than Java for
D people to work with and easy for Java people to work with.

2. Get effort on the DDT Eclipse plugin. Bruno has declared it
finished, which is fine, but I would say it should not be treated that
way.

3. Have one lightweight D-realized cross-platform IDE. Qt is probably
the best widget set to use for this. My model here is LiteIDE, which is
a Qt-based Go IDE realized in C++. It should of course be realized in
Go, but there are no Qt bindings for Go, only QML ones.



If anything is to be done about improving the IDE tooling, it should be 
work on a tool like D Completion Daemon (DCD) - that is, an IDE-agnostic 
tool for code completion and other language analysis functionality 
(find-references, refactor, etc.)


The IDE space is just too fragmented - there are even more popular 
IDEs now than, say, 5 years ago - like VS Code, Atom, etc. Even 
SublimeText is a relatively recent player.  As such, it's not feasible to 
focus work on just a few IDEs. You have to make the core of IDE 
functionality available in an IDE-agnostic tool.


VSCode for example has even defined a language-agnostic protocol for 
such language servers: 
https://github.com/Microsoft/vscode-languageserver-protocol/ , and there 
is work in a few other IDEs to adopt that protocol as well, and write 
their own IDE client implementation (Eclipse for example, but it's all 
very early stages).
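For a concrete sense of what such a language server speaks, here is a sketch of a completion request in that protocol. It is a JSON-RPC message; the method and field names follow the published protocol, while the file path and positions are made up for illustration:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "textDocument/completion",
  "params": {
    "textDocument": { "uri": "file:///home/user/app.d" },
    "position": { "line": 41, "character": 12 }
  }
}
```

The server answers with a list of completion items, so any editor that can speak JSON-RPC gets completion without IDE-specific glue.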



In any case, this is all of secondary importance, IMO. The GC issue is 
much more critical. If people think D has a worthwhile future for 
building apps in real world scenarios, then the tooling will get there 
eventually, it will catch up. But if people think other languages will 
work much better for their needs (like Rust or others), no amount of 
exceptional tooling will make a difference.



BTW, "finished" is not the right word to describe the state of DDT; if 
anything, it's now in maintenance mode (actually, not that different from 
what it has been for the last year or so).


--
Bruno Medeiros
https://twitter.com/brunodomedeiros


Re: Andrei's list of barriers to D adoption

2016-06-13 Thread Olivier via Digitalmars-d

On Tuesday, 7 June 2016 at 08:09:49 UTC, Ethan Watson wrote:

On Tuesday, 7 June 2016 at 07:57:09 UTC, Walter Bright wrote:

C++ still suffers from:

http://www.digitalmars.com/articles/b44.html

and probably always will.


template< size_t size > void function( char (&a)[ size ] );


It's horrible syntax (no surprise), and being a templated 
function means it's recompiled N times... but there's at least 
something for it now.


There are array_view and string_view, which have been proposed (and 
I even think accepted) for C++17.


But if you want a fat pointer, you can just write one yourself:

template< typename T >
struct FatPtr
{
   template< std::size_t N >
   FatPtr( T (&a)[N] )
   : p (a)
   , n (N)
   { }

   T *p;
   std::size_t n;
};

void function( FatPtr<const char> a ) { }

int main( )
{
   char a[128];
   function(a);

   function("foo");
}
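For comparison, a minimal sketch of the D equivalent: D's built-in slices are exactly such a fat pointer (pointer plus length), so no template machinery is needed. The function name here is made up:

```d
import std.stdio;

// A slice carries both pointer and length, for any array size:
void takeSlice(const(char)[] a)
{
    writeln(a.length);
}

void main()
{
    char[128] a;
    takeSlice(a[]);    // static array sliced to (ptr, 128)
    takeSlice("foo");  // a string literal is already a slice
}
```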


Re: Andrei's list of barriers to D adoption

2016-06-11 Thread Nick Sabalausky via Digitalmars-d

On 06/06/2016 04:15 AM, Russel Winder via Digitalmars-d wrote:


3. Have one lightweight D-realized cross-platform IDE. Qt is probably
the best widget set to use for this. My model here is LiteIDE, which is
a Qt-based Go IDE realized in C++. It should of course be realized in
Go, but there are no Qt bindings for Go, only QML ones.



One thing I've been really wanting to do for a while (and even more so 
after switching my main desktop to Linux) is take Programmer's Notepad 2 
(a Windows program, but very lightweight and very nice) and try porting 
it to D+Qt (or maybe libui if it gets a Qt backend). Although I don't 
know how realistic Qt on D is right now, and I haven't been able to 
justify the personal time & energy investment, even as much as I'd like 
to :( Just can't find a Linux editor I like as much as PN2 :(




Re: Andrei's list of barriers to D adoption

2016-06-11 Thread Observer via Digitalmars-d

On Tuesday, 7 June 2016 at 23:05:49 UTC, Walter Bright wrote:

On 6/7/2016 2:28 PM, Steven Schveighoffer wrote:
I can attest that figuring out why something isn't inferred 
@safe isn't always easy, and the "slap a @safe: tag at the top" 
isn't always going to help.


Having a -safe compiler switch to make @safe the default won't 
improve that in the slightest.


I think it's useful here to compare one aspect of Perl's approach to
security, its "taint" mode.  It tags insecure data to make sure it
does not affect the security of the application, and blocks actions
where insecure operations would otherwise occur.  The Perl invocation
accepts a couple of flags to control how taint mode works:

  -t  Like -T, but taint checks will issue warnings rather than fatal
      errors.  These warnings can now be controlled normally with
      "no warnings qw(taint)".

      Note: This is not a substitute for "-T"! This is meant to be used
      only as a temporary development aid while securing legacy code:
      for real production code and for new secure code written from
      scratch, always use the real -T.

  -T  turns on "taint" so you can test them.  Ordinarily these checks
      are done only when running setuid or setgid.  It's a good idea to
      turn them on explicitly for programs that run on behalf of someone
      else whom you might not necessarily trust, such as CGI programs or
      any internet servers you might write in Perl.  See perlsec for
      details.  For security reasons, this option must be seen by Perl
      quite early; usually this means it must appear early on the
      command line or in the "#!" line for systems which support that
      construct.

The point being, such flags provide a very simple means for the user
to check the execution of their code, without being terribly intrusive.
They can be a great convenience as a stepping stone to discovering where
problems exist and addressing them.


Re: Andrei's list of barriers to D adoption

2016-06-11 Thread Observer via Digitalmars-d

On Tuesday, 7 June 2016 at 20:41:21 UTC, Jonathan M Davis wrote:
In principle, I think that you're very right that @safe needs 
to be implemented as a whitelist. Security in general does not 
work as a blacklist, and I think that @safe has the same 
problem. The problem is code breakage. Even assuming that the 
change in implementation were straightforward (and I have no 
idea whether it is or not), it would be pretty much guaranteed 
that we would break a lot of code marked @safe if we were to 
switch to a whitelist. Some of that code is not truly @safe and 
really should be fixed, but just throwing the switch like that 
is too sudden. We'd probably be forced to have both a whitelist 
and a blacklist and treat the whitelist results as warnings 
temporarily before switching fully to the whitelist 
implementation. And that's likely feasible, but it seems like 
it would be a bit of a mess. So, I don't know if we reasonably 
can switch to a whitelist or not. But I think that it's clear 
that ideally we would.


I think you meant "treat the non-whitelist results as warnings".

Seems to me the proper answer is simple.  Stuff on the whitelist
should pass without comment.  Stuff on neither the whitelist nor
the blacklist should generate warnings.  Stuff on the blacklist
should generate errors.  A compiler flag similar to gcc's -Werror
that turns all warnings into errors would allow the end-user to
select whether or not to worry, during a phase of transition.

This way, those warnings could be pushed back upstream to the
compiler maintainers as "hey, your whitelist/blacklist division
omits certain real-world cases".  And gradually, the graylist
would be narrowed over successive compiler releases.


Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-11 Thread Ola Fosheim Grøstad via Digitalmars-d

On Saturday, 11 June 2016 at 12:19:52 UTC, Jonathan M Davis wrote:
LOL. 10x that would be cheap in the US, and I don't think that 
your average school will let folks sit in on courses (though 
some will). For your average college in the US, I would only 
expect anyone to take classes if they're actually working 
towards a degree, though I'm sure that there are exceptions in 
some places.


I remember that we sometimes had older programmers taking some 
courses, maybe to get a degree? But not often. The $100/semester 
fee isn't for tuition, though; it is for student 
activities/facilities, paper copies, etc. Tuition is free.


It works better when the school itself is really hard to get 
into. For instance, my dad went to MIT, and according to him, 
you usually don't have much of a need for weeder courses there, 
because it was already hard enough to get into the school that 
folks who can't hack it won't even be there - and it's an 
engineering school, so you're typically going to get very 
smart, well-prepared students who want to do engineering.


It sorts itself out at higher levels, although I once had a 
project group at the master level that came to the hallway 
outside my office to get help, and it eventually dawned on me 
that none of the three boys knew that they should end sentences 
with ";"... I couldn't help laughing... and I kinda felt bad 
about it, but they laughed too, so I guess it was ok. I was so 
totally unprepared for that kind of problem at a master level 
course. I assume they came from some other field, as it was a 
web-oriented course.


These things with uneven knowledge levels are more of a problem 
in "hip" project-oriented courses, not so much in the courses 
that are proper compsci and are based on individual final exams. 
It kinda works out ok as long as students of the same level end 
up in the same group, but it creates a lot of friction if you get 
a mixed group where the better students feel the other ones are 
freeloaders.


You frequently either end up with the school trying to weed out 
a lot of folks up front by having very hard beginning courses 
or making their beginning classes easy in an attempt to make it 
so that everyone has a chance, though I think that tends to 
just delay the inevitable for many students.


Yep, exactly, but the problem was that the introduction course in 
programming was required by other fields such as getting a master 
in bio-chemistry or so. That didn't go very well when the 
lecturer once came up with a "clever exam" where you got stuck if 
you didn't master the first task. So 40% failed on their final 
exam, 200 angry students? That would've made me feel bad. After 
that they softened the tasks a bit... making failure harder.


In the math department they had one more narrow calculus course 
for those who wanted to specialise in math and a broader more 
pragmatic calculus course for those who were more to use math as 
an applied tool in other fields. Probably a better solution.


to be able to program. So, I agree that it would be nice if 
there were some sort of aptitude test up front that at least 
indicated whether you were likely to have a very hard time with 
programming. But I don't think that I've ever heard of any 
schools doing anything like that (though obviously, some could 
be, and I just haven't heard of it). And I don't know how you 
would even go about making such a test, though I expect that 
there are computer science professors out there who would.


Well, I don't know.  I guess having average or above in math 
would work out. Not that you have to know math, but general 
problem solving. I noticed that people from other fields who 
were working on their master's picked up programming faster, 
perhaps because they had acquired skills in structuring and 
problem solving.


Then again, pure theoretical topics kill motivation for me. Like, 
I could never find any interest in solving tricky integrals 
analytically as it seemed like a pointless exercise. And to be 
honest, I've never found the need to do it. But as you said, some 
topics become more meaningful later in life and I'd probably put 
more energy into topics like program verification and 
combinatorics today than I did in my youth.




Re: Andrei's list of barriers to D adoption

2016-06-11 Thread Chris via Digitalmars-d

On Saturday, 11 June 2016 at 12:44:54 UTC, ketmar wrote:

On Friday, 10 June 2016 at 15:29:01 UTC, Chris wrote:
DScript. Your scripting language already fulfills things that 
were on my wishlist (easy D interop).


hey, both GML and DACS have some of that too! ;-)

VM["print"] = (string s) { writeln(s); };
VM["add"] = (int a, int b) => a+b;

wow, now we can print things from script, and (for some unknown 
reason) use function to add two numbers. with DACS you still 
have to declare function prototypes, but with GML it will "just 
work" (including conversion from internal nan-boxed doubles to 
strings and ints, and back).


GML is somewhat limited, but can be extended, and it is almost as 
fast as Lua. DACS, with its JIT, is sometimes even comparable 
to gcc -O2 (but only sometimes, lol; and LuaJIT still makes it 
look like a snail).


Cool. Maybe we should continue this here

http://forum.dlang.org/thread/njfdch$2627$1...@digitalmars.com


Re: OT – the Javaverse [was Andrei's list of barriers to D adoption]

2016-06-11 Thread Chris via Digitalmars-d

On Friday, 10 June 2016 at 17:09:18 UTC, Russel Winder wrote:

Whatever you read, the writer didn't really know what they were 
talking about. At least not in general, and if they were 
talking of the Javaverse as a whole. Java 8 features such as 
lambda expressions, Streams, method references, etc. are no 
longer even controversial. There is world-wide activity in 
transforming Java 6 and Java 7 code to Java 8. Yes, some of this 
is pull rather than push, and I am sure there are islands of 
intransigence (*). However the bulk of Java programmers will 
eventually get and use the features.


Of course many people have stopped using Java and use Kotlin, 
Ceylon, or Scala (**). The crucial point here is that the 
Javaverse is much, much more than just the Java language.


This only proves my point. This happens in languages that are 
"feature resistant". For years you have to write awkward code[1], 
and once a feature gets accepted you have to revise your old code 
and jazz it up. And then of course you get conservative 
programmers who loathe changes; they are a direct consequence of 
feature resistance. The more progressive ones turn to other 
languages like Clojure and Kotlin.


All this proves that being feature resistant is not healthy for a 
language.


[1] E.g. Java event listeners, Runnable etc.

(*) Usually people who think Java 5 was a bad move and stick 
with Java

1.4.2.

(**) There are others but these are the main players.





Re: Andrei's list of barriers to D adoption

2016-06-11 Thread ketmar via Digitalmars-d

On Friday, 10 June 2016 at 15:29:01 UTC, Chris wrote:
DScript. Your scripting language already fulfills things that 
were on my wishlist (easy D interop).


hey, both GML and DACS have some of that too! ;-)

VM["print"] = (string s) { writeln(s); };
VM["add"] = (int a, int b) => a+b;

wow, now we can print things from script, and (for some unknown 
reason) use function to add two numbers. with DACS you still have 
to declare function prototypes, but with GML it will "just work" 
(including conversion from internal nan-boxed doubles to strings 
and ints, and back).


GML is somewhat limited, but can be extended, and it is almost as 
fast as Lua. DACS, with its JIT, is sometimes even comparable to 
gcc -O2 (but only sometimes, lol; and LuaJIT still makes it look 
like a snail).


Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-11 Thread ketmar via Digitalmars-d

On Friday, 10 June 2016 at 15:35:32 UTC, jmh530 wrote:

On Friday, 10 June 2016 at 15:14:02 UTC, ketmar wrote:


1. this is heavily OT. ;-)


I didn't forget to mark it! :-)

2. you may take a look at my gml engine. it has clearly 
separated language parser and AST builder (gaem.parser), and 
AST->VM compiler (gaem.runner/compiler.d).


I couldn't for the life of me find a link to this.


sorry. Wyatt kindly fixed that for me. ;-)

also, you can replace code generation in the compiler with direct 
execution, and you will get an AST-based interpreter. just create a 
new AA with local variables on NodeFCall (this will serve as a 
"stack frame"), and make `compileExpr` return a value instead of a 
stack slot index. then it is as easy as:


  (NodeBinarySub n) => compileExpr(n.el)-compileExpr(n.er);

and so on. also, fix `compileVarAccess` and `compileVarStore` to 
use your "stack frame" AA.


this whole business is not hard at all. i'd say it is easier than 
many other programming tasks.
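A minimal sketch of the AST-walking approach described above. The node names are modeled loosely on the ones mentioned (NodeBinarySub, el/er); everything else, including using an AA as the "stack frame" and a single `eval` in place of `compileExpr`, is a hypothetical simplification:

```d
import std.stdio;

// the "stack frame": an AA mapping variable names to values
alias Frame = double[string];

abstract class Node { abstract double eval(Frame frame); }

class NodeLit : Node {
    double v;
    this(double v) { this.v = v; }
    override double eval(Frame frame) { return v; }
}

class NodeVar : Node {
    string name;
    this(string name) { this.name = name; }
    override double eval(Frame frame) { return frame[name]; }
}

class NodeBinarySub : Node {
    Node el, er;  // left/right operands, as in the snippet above
    this(Node el, Node er) { this.el = el; this.er = er; }
    override double eval(Frame frame) {
        // direct execution instead of emitting VM code:
        return el.eval(frame) - er.eval(frame);
    }
}

void main() {
    Frame frame = ["a": 5.0];
    auto expr = new NodeBinarySub(new NodeVar("a"), new NodeLit(2));
    writeln(expr.eval(frame)); // 3
}
```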


Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-11 Thread Jonathan M Davis via Digitalmars-d
On Saturday, June 11, 2016 08:06:21 Ola Fosheim Grøstad via Digitalmars-d 
wrote:
> On Friday, 10 June 2016 at 18:59:02 UTC, Jonathan M Davis wrote:
> > then as it is later. In some ways, it would actually be very
> > beneficial to actually go back to school to study that stuff
> > after having programmed professionally for a while, but that's
> > a pain to pull off time-wise, and the classes aren't really
> > designed with that in mind anyway.
>
> I am definitely considering it, maybe on some topics that I've
> read on my own, to fill in the missing bits. Or topics that have
> had some advances since the 90s. It wouldn't be too much of a
> pain as I could get there in 15 minutes on a bike, so it would
> just be exercise. I believe lectures at the University of Oslo
> are open to the public if there is enough space, and the fee at
> the University of Oslo is at $100/semester so the threshold for
> signing up is low. And I don't even have to do the exam, which
> probably makes it more enjoyable.

LOL. 10x that would be cheap in the US, and I don't think that your average
school will let folks sit in on courses (though some will). For your average
college in the US, I would only expect anyone to take classes if they're
actually working towards a degree, though I'm sure that there are exceptions
in some places.

> > world.  I don't envy teachers having to figure out how to teach
> > basic programming concepts.
>
> Yes,  some people are simply never going to be able to do
> programming well... I'm talking having trouble reading code with
> basic input - output loops (not even writing it) after having it
> carefully explained to them many times. With some students you
> know they will never be able to pass the exam after the first few
> sessions. But you cannot tell them to quit... so you have to
> encourage them, basically encouraging them to strive towards a
> certain failure. That's frustrating.
>
> Educational institutions should probably have an aptitude test.
> At the entry level courses maybe 30% are never going to be able
> to become even mediocre programmers.

It works better when the school itself is really hard to get into. For
instance, my dad went to MIT, and according to him, you usually don't have
much of a need for weeder courses there, because it was already hard enough
to get into the school that folks who can't hack it won't even be there -
and it's an engineering school, so you're typically going to get very smart,
well-prepared students who want to do engineering.

Contrast that with schools where almost anyone can get in, and there are
always problems with folks entering engineering programs where they can't
hack it - especially computer science, since it doesn't obviously involve
the really hard math and science that would scare many of those folks away.
You frequently either end up with the school trying to weed out a lot of
folks up front by having very hard beginning courses or making their
beginning classes easy in an attempt to make it so that everyone has a
chance, though I think that tends to just delay the inevitable for many
students.

I can appreciate wanting to give everyone a chance, and I'm sure that there
are folks who have a hard time at first who get it later, but many folks
just don't seem to think the right way to be able to program. So, I agree
that it would be nice if there were some sort of aptitude test up front that
at least indicated whether you were likely to have a very hard time with
programming. But I don't think that I've ever heard of any schools doing
anything like that (though obviously, some could be, and I just haven't
heard of it). And I don't know how you would even go about making such a
test, though I expect that there are computer science professors out there
who would.

- Jonathan M Davis




Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-11 Thread Ola Fosheim Grøstad via Digitalmars-d

On Friday, 10 June 2016 at 18:59:02 UTC, Jonathan M Davis wrote:
then as it is later. In some ways, it would actually be very 
beneficial to actually go back to school to study that stuff 
after having programmed professionally for a while, but that's 
a pain to pull off time-wise, and the classes aren't really 
designed with that in mind anyway.


I am definitely considering it, maybe on some topics that I've 
read on my own, to fill in the missing bits. Or topics that have 
had some advances since the 90s. It wouldn't be too much of a 
pain as I could get there in 15 minutes on a bike, so it would 
just be exercise. I believe lectures at the University of Oslo 
are open to the public if there is enough space, and the fee at 
the University of Oslo is at $100/semester so the threshold for 
signing up is low. And I don't even have to do the exam, which 
probably makes it more enjoyable.


When I started out in school, C++ was the main language, but it 
quickly changed to Java, which removes all kinds of certain 
problems, but it still has a lot of extra cruft (like forcing 
everything to be in a class and a ton of attributes forced to 
be on main), and it doesn't at all prepare students to properly 
deal with pointers and memory. So, students starting out with 
Java have some fun problems when they then have to deal with C 
or C++. Alternatively, there are folks in favor of starting 
with functional languages, which has certain advantages, but 
it's so different from how folks would program normally that 
I'm not sure that it's ultimately a good idea.


I went to a high school that had programming/digital circuits on 
the curriculum. In the programming we started with Logo, which 
actually is kind of neat, as you are working with very concrete 
intuitive geometric problems. Then we went on to Turbo Pascal. It 
wasn't challenging enough, so the better students went with 
digital circuits and machine language for the last year. At the 
uni the entry level courses used Simula, but later courses used 
C, Beta, Scheme, etc, based on the topic. In several courses I 
could use whatever language I wanted for projects as long as the 
assistant teacher could read it. Made sense since the grades 
usually were based on a final exam only.


world.  I don't envy teachers having to figure out how to teach 
basic programming concepts.


Yes,  some people are simply never going to be able to do 
programming well... I'm talking having trouble reading code with 
basic input - output loops (not even writing it) after having it 
carefully explained to them many times. With some students you 
know they will never be able to pass the exam after the first few 
sessions. But you cannot tell them to quit... so you have to 
encourage them, basically encouraging them to strive towards a 
certain failure. That's frustrating.


Educational institutions should probably have an aptitude test. 
At the entry level courses maybe 30% are never going to be able 
to become even mediocre programmers.




Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Walter Bright via Digitalmars-d

On 6/10/2016 3:55 AM, Chris wrote:

Cool. I'd love to see `DScript` one day - and replace JS once and for all ...
well ... just day dreaming ...


Started a new thread for that.


[OT] Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Jonathan M Davis via Digitalmars-d
On Friday, June 10, 2016 17:20:29 Ola Fosheim Grøstad via Digitalmars-d wrote:
> On Friday, 10 June 2016 at 15:27:03 UTC, Jonathan M Davis wrote:
> > Most developers have titles like "Software Engineer" or "Senior
> > Software Engineer." They're frequently called programmers and/or
> > software developers when not talking about titles.
>
> Neither academia nor businesses use Computer Scientist as a job
> title... though?

In academia, you'd be a professor of Computer Science or a professor in the
Computer Science department. You wouldn't normally be called a Computer
Scientist - certainly not as a job title.  And in businesses, the only
companies that even _might_ have Computer Scientist as a title would be
where it was a very research-heavy job, which would not be at all normal.
Research-heavy jobs like that do exist in some large companies, but in the
vast majority of cases, programmers are hired as Software Engineers to write
code for actual products.

> > Yeah. Most universities in the US have a Computer Science
> > degree, but some have Software Engineering as a separate
> > degree. My college had Computer Science, Software Engineering, and
> > Computer Engineering, which is atypical. All of them took
> > practical courses, but the SE guys didn't have to take some of
> > the more theoretical stuff and instead took additional classes
> > focused on working on projects in teams and whatnot.
>
> Sounds like a good setup. At my uni we could pick freely what
> courses we wanted each semester, but needed a certain combination
> of fields and topics to get a specific degree. Like for entering
> computer science you would need the most feared topic, Program
> Verification taught by Ole-Johan Dahl (co-creator of Simula) who
> was very formal on the blackboard... I felt it was useless at the
> time, but there are some insights you have to be force-fed...
> only to be appreciated later in life. It is useless, but still
> insightful.
>
> Not sure if those more narrow programs are doing their students a
> favour, as often times the hardest part is getting a good
> intuition for the basics of a topic, while getting the "expert"
> knowledge for a specific task is comparatively easier. Especially
> now we have the web. So, being "forced" to learning the basics of
> a wider field is useful.

I tend to be of the opinion that the best college program has all of the
more theoretical stuff, because it provides a solid base for real life
programming, but project-based, real world stuff is also very important to
help prepare students for actual jobs. Too many college programs do very
little with helping prepare students for actual programming jobs, but at the
same time, I think that skipping a lot of the theoretical stuff will harm
students in the long run. But striking a good balance isn't exactly easy,
and it's definitely the case that a lot of the more theoretical stuff isn't
as obviously useful then as it is later. In some ways, it would actually be
very beneficial to actually go back to school to study that stuff after
having programmed professionally for a while, but that's a pain to pull off
time-wise, and the classes aren't really designed with that in mind anyway.

> I'm rather sceptical of choosing C++ as a language for instance.
> Seems like you would end up wasting a lot of time on trivia and
> end up with students hating programming...

Choosing the right language for teaching is an endless debate with all kinds
of pros and cons. Part of the problem is that good languages for
professional work tend to be complicated with advantages aimed at getting
work done rather than teaching, which causes problems for teaching, but
picking a language that skips a lot of the complications means that
students aren't necessarily well-prepared to deal with the more complicated
aspects of a language.

When I started out in school, C++ was the main language, but it quickly
changed to Java, which removes all kinds of certain problems, but it still
has a lot of extra cruft (like forcing everything to be in a class and a ton
of attributes forced to be on main), and it doesn't at all prepare students
to properly deal with pointers and memory. So, students starting out with
Java have some fun problems when they then have to deal with C or C++.
Alternatively, there are folks in favor of starting with functional
languages, which has certain advantages, but it's so different from how
folks would program normally that I'm not sure that it's ultimately a good
idea. All around, it's a difficult problem, and I don't know what the right
choice is. In general, there are serious problems with teaching with real
world languages, and teaching with a language that was designed for teaching
doesn't necessarily prepare students for the real world.  I don't envy
teachers having to figure out how to teach basic programming concepts.

Regardless, I think that students should be at least exposed to both the
imperative/OO languages and the functional languages 

Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-10 Thread jmh530 via Digitalmars-d

On Friday, 10 June 2016 at 17:59:15 UTC, Adam D. Ruppe wrote:

What's the PrototypeObject sc I see everywhere doing?


sc is short for "scope" - it refers to the chain of local 
variables. So consider the following:

[snip]


Cool. Thanks.



Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Adam D. Ruppe via Digitalmars-d

On Friday, 10 June 2016 at 17:36:02 UTC, jmh530 wrote:

Ah, it produces mixin("1+2") and evaluates that.


Sort of; 1 and 2 are both runtime variables there, so it really 
produces mixin("a+b") after setting a = 1 and b = 2 above.


But yeah, that's the idea - it just hoists that mixin to runtime 
for scripting.



What's the PrototypeObject sc I see everywhere doing?


sc is short for "scope" - it refers to the chain of local 
variables. So consider the following:


var a = 1;
function foo() {
  var b = 4;
  var c = a + b;
}

foo();


So as this is interpreted by my thing, it is like it runs the 
following D code:


// this happens as part of the interpreter initialization
auto globalScope = new 
PrototypeObject(globals_the_d_programmer_passed);


// now it starts running
auto currentScope = globalScope;

// var a = 1;
currentScope["a"] = 1; // it holds the local variables!


call_function("foo", []); // script foo();

// when we enter the new scope inside the function, it
// creates a new object, based on the old one
currentScope = new PrototypeObject(currentScope);

// var b = 4;
currentScope["b"] = 4; // remember the scope changed above, so 
this is local to the function now


// var c = a + b;
currentScope["c"] = currentScope["a"] + currentScope["b"];

/*
  OK, so at this point, we get two variables: a and b. That's
  what the sc object in the script.d source represents - what
  I called currentScope here.

  The opIndex does two things: check the current scope for the
  name. If it is there, return that value. If not, go up to
  the parent scope and look there. Continue until you find it,
  or if it isn't there, throw a "no such variable" exception.

  It'd find b in the current scope and return the
  function-local variable, and it'd find a in the parent scope.
*/

// and now that the function is over, we pop off the local
// variables from the function by setting the current back
// to the old parent
currentScope = currentScope.parent;



So yeah, the sc in the interpreter is just the currentScope from 
the pseudocode, a chain of AAs holding the local variables.
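
That chain lookup is easy to sketch. The following toy PrototypeObject (named after the post; the real script.d implementation differs) shows opIndex walking up the parent chain:

```d
import std.variant : Variant;

// Minimal scope object: a string-keyed variable table plus a parent link.
class PrototypeObject {
    Variant[string] vars;
    PrototypeObject parent;

    this(PrototypeObject parent = null) { this.parent = parent; }

    // Look in the current scope first, then walk up the chain.
    Variant opIndex(string name) {
        if (auto p = name in vars)
            return *p;
        if (parent !is null)
            return parent[name];
        throw new Exception("no such variable: " ~ name);
    }

    void opIndexAssign(Variant value, string name) {
        vars[name] = value;
    }
}

void main() {
    auto globalScope = new PrototypeObject;
    globalScope["a"] = Variant(1);

    // Entering a function creates a child scope.
    auto localScope = new PrototypeObject(globalScope);
    localScope["b"] = Variant(4);

    // b is found locally; a is found in the parent scope.
    assert(localScope["b"] == Variant(4));
    assert(localScope["a"] == Variant(1));
}
```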


Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-10 Thread jmh530 via Digitalmars-d

On Friday, 10 June 2016 at 17:02:06 UTC, Adam D. Ruppe wrote:


https://github.com/adamdruppe/arsd/blob/master/script.d#L879

The function is pretty simple: interpret the left hand side 
(here it is 1, so it yields int(1)), interpret the right hand 
side (yields int(2)), combine them with the operator ("+") and 
return the result.


Notice that interpreting the left hand side is a recursive call 
to the interpret function - it can be arbitrarily complex, and 
the recursion will go all the way down, then all the way back 
up to get the value.


Ah, it produces mixin("1+2") and evaluates that.

What's the PrototypeObject sc I see everywhere doing?


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Wyatt via Digitalmars-d

On Friday, 10 June 2016 at 17:10:39 UTC, Adam D. Ruppe wrote:

On Friday, 10 June 2016 at 15:30:19 UTC, Wyatt wrote:
I use it in my toml parser and it's very pleasant.  I figured 
it probably isn't very fast, but it works and that's important.


kewl! Did you use the script component for interpreting or just 
the jsvar part for the data?


Just the jsvar; I've got a Pegged grammar mixin doing most of 
the heavy lifting.  IIRC, you actually wrote it around the time I 
was fighting a losing battle with nested Variant arrays and it 
saved me a lot of headache.


-Wyatt


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Ola Fosheim Grøstad via Digitalmars-d

On Friday, 10 June 2016 at 15:27:03 UTC, Jonathan M Davis wrote:
Most developers have titles like "Software Engineer" or "Senior 
Software Engineer." They're frequently called programmers and/or 
software developers when not talking about titles.


Neither academia nor businesses use Computer Scientist as a job 
title... though?


Yeah. Most universities in the US have a Computer Science 
degree, but some have Software Engineering as a separate 
degree. My college had Computer Science, Software Engineering, and 
Computer Engineering, which is atypical. All of them took 
practical courses, but the SE guys didn't have to take some of 
the more theoretical stuff and instead took additional classes 
focused on working on projects in teams and whatnot.


Sounds like a good setup. At my uni we could pick freely what 
courses we wanted each semester, but needed a certain combination 
of fields and topics to get a specific degree. Like for entering 
computer science you would need the most feared topic, Program 
Verification taught by Ole-Johan Dahl (co-creator of Simula) who 
was very formal on the blackboard... I felt it was useless at the 
time, but there are some insights you have to be force-fed... 
only to be appreciated later in life. It is useless, but still 
insightful.


Not sure if those narrower programs are doing their students a 
favour, as oftentimes the hardest part is getting a good 
intuition for the basics of a topic, while getting the "expert" 
knowledge for a specific task is comparatively easier, especially 
now that we have the web. So, being "forced" to learn the basics 
of a wider field is useful.


I'm rather sceptical of choosing C++ as a language for instance. 
Seems like you would end up wasting a lot of time on trivia and 
end up with students hating programming...




Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Adam D. Ruppe via Digitalmars-d

On Friday, 10 June 2016 at 15:30:19 UTC, Wyatt wrote:

globals.write = &(writeln!string);


Woah, I never thought of using it like that!


Yeah, since writeln is a template, you need to instantiate it 
with some arguments. This isn't the ideal way to do it in the 
script btw, it'd be like:


globals.write = (var this_, var[] args) { writeln(args); };

or something like that - this signature gives a variadic function 
to the scripting language, whereas writeln!string just has a 
single argument.


But, of course, the script language cannot instantiate D 
templates itself, so you gotta do that before assigning it to the 
runtime var. But from there, the jsvar.d reflection code will 
handle the rest of var<->string conversions.


I use it in my toml parser and it's very pleasant.  I figured 
it probably isn't very fast, but it works and that's important.


kewl! Did you use the script component for interpreting or just 
the jsvar part for the data?


Scripting in D (was Andrei's list of barriers to D adoption)

2016-06-10 Thread Adam D. Ruppe via Digitalmars-d

On Friday, 10 June 2016 at 15:29:01 UTC, Chris wrote:
But seriously, would you like to work on something like 
DScript. Your scripting language already fulfills things that 
were on my wishlist (easy D interop).


I'm best when working on something that I'm actively using, since 
then I find the bugs myself and have some personal thing to gain 
(a lot of times, I can take time out of the day job to do it 
then, since it contributes directly back to it)...


and alas, right now, I'm not actively using it. I do have some 
plans for it, but no set schedule.


That said though, it is already fairly useful... if you guys use 
it and report bugs/feature requests, I can probably respond to 
that.


Re: OT – the Javaverse [was Andrei's list of barriers to D adoption]

2016-06-10 Thread Russel Winder via Digitalmars-d
On Tue, 2016-06-07 at 15:15 +, Chris via Digitalmars-d wrote:
> 
[…]
> Java has lambdas now (since version 8, I think) and I read 
> somewhere that it's not certain that Java programmers will adopt 
> (i.e. use) them at all. D has the advantage that its users are 
> […]

Whatever you read, the writer didn't really know what they were talking
about. At least not in general, and if they were talking of the
Javaverse as a whole. Java 8 features such as lambda expressions,
Streams, method references, etc. are no longer even controversial.
There is a world-wide activity in transforming Java 6 and Java 7 code
to Java 8. Yes some of this is pull rather than push, and I am sure
there are islands of intransigence (*). However the bulk of Java
programmers will eventually get and use the features.

Of course many people have stopped using Java and use Kotlin, Ceylon,
or Scala (**). The crucial point here is that the Javaverse is much,
much more than just the Java language.


(*) Usually people who think Java 5 was a bad move and stick with Java
1.4.2. 

(**) There are others but these are the main players.

-- 

Russel.
=
Dr Russel Winder  t: +44 20 7585 2200   voip: sip:russel.win...@ekiga.net
41 Buckmaster Roadm: +44 7770 465 077   xmpp: rus...@winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder

signature.asc
Description: This is a digitally signed message part


Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Adam D. Ruppe via Digitalmars-d

On Friday, 10 June 2016 at 15:03:30 UTC, jmh530 wrote:
Let's say you have something simple like 1+2, you would build 
an AST that looks something like

   +
  / \
 1   2
What would be the next step?


https://github.com/adamdruppe/arsd/blob/master/script.d#L879

The function is pretty simple: interpret the left hand side (here 
it is 1, so it yields int(1)), interpret the right hand side 
(yields int(2)), combine them with the operator ("+") and return 
the result.


Notice that interpreting the left hand side is a recursive call 
to the interpret function - it can be arbitrarily complex, and 
the recursion will go all the way down, then all the way back up 
to get the value.
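
A toy version of that recursive interpret method, assuming an int-only expression tree (the real script.d uses its var type and many more node kinds):

```d
// Toy AST with a recursive interpret method, ints only for brevity.
abstract class Node {
    abstract int interpret();
}

class IntLiteral : Node {
    int value;
    this(int value) { this.value = value; }
    override int interpret() { return value; }
}

class BinaryOp : Node {
    string op;
    Node lhs, rhs;
    this(string op, Node lhs, Node rhs) {
        this.op = op; this.lhs = lhs; this.rhs = rhs;
    }
    override int interpret() {
        // Recurse into both sides, then combine with the operator.
        switch (op) {
            case "+": return lhs.interpret() + rhs.interpret();
            case "-": return lhs.interpret() - rhs.interpret();
            case "*": return lhs.interpret() * rhs.interpret();
            default: throw new Exception("unknown operator: " ~ op);
        }
    }
}

void main() {
    // The tree for 1 + 2:
    Node tree = new BinaryOp("+", new IntLiteral(1), new IntLiteral(2));
    assert(tree.interpret() == 3);
}
```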


Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Adam D. Ruppe via Digitalmars-d

On Friday, 10 June 2016 at 15:35:32 UTC, jmh530 wrote:

On Friday, 10 June 2016 at 15:14:02 UTC, ketmar wrote:

1. this is heavily OT. ;-)


I didn't forget to mark it! :-)


Well, yeah, we should start a new thread, but compiler 
programming isn't really off topic at all on a forum where we 
talk about programming a compiler! Knowing the idea helps reading 
dmd source too.


Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-10 Thread jmh530 via Digitalmars-d

On Friday, 10 June 2016 at 15:40:45 UTC, Wyatt wrote:


He linked it earlier:
http://repo.or.cz/gaemu.git/tree/HEAD:/gaem/parser

-Wyatt


Cheers.


Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Wyatt via Digitalmars-d

On Friday, 10 June 2016 at 15:35:32 UTC, jmh530 wrote:

On Friday, 10 June 2016 at 15:14:02 UTC, ketmar wrote:


2. you may take a look at my gml engine. it has clearly 
separated language parser and AST builder (gaem.parser), and 
AST->VM compiler (gaem.runner/compiler.d).


I couldn't for the life of me find a link to this.


He linked it earlier:
http://repo.or.cz/gaemu.git/tree/HEAD:/gaem/parser

-Wyatt


Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-10 Thread jmh530 via Digitalmars-d

On Friday, 10 June 2016 at 15:14:02 UTC, ketmar wrote:


1. this is heavily OT. ;-)


I didn't forget to mark it! :-)

2. you may take a look at my gml engine. it has clearly 
separated language parser and AST builder (gaem.parser), and 
AST->VM compiler (gaem.runner/compiler.d).


I couldn't for the life of me find a link to this.


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Wyatt via Digitalmars-d

On Friday, 10 June 2016 at 14:34:53 UTC, Adam D. Ruppe wrote:

var globals = var.emptyObject;
globals.write = &(writeln!string);


Woah, I never thought of using it like that!

The downside though is that it is something I basically slapped 
together in a weekend to support var.eval on a lark... it has a 
few weird bugs


And yet it somehow seems to _work_ better than std.variant. :/

tho idk if I'd recommend it for serious work. Just use D for 
that!


I use it in my toml parser and it's very pleasant.  I figured it 
probably isn't very fast, but it works and that's important.


-Wyatt


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Chris via Digitalmars-d

On Friday, 10 June 2016 at 14:25:37 UTC, Adam D. Ruppe wrote:

On Friday, 10 June 2016 at 13:55:28 UTC, Chris wrote:
I have neither time nor the required expertise to write a 
scripting language from scratch ;) You on the other hand ... 
:-)


Oh, it isn't that hard, at least to do a quick basic thing. You 
might want to start with the various math parsers. A postfix 
one is relatively easy:


2 3 +

break it up into tokens, read them in, build a syntax tree 
(well, for the postfix thing, it is probably a stack!).


That approach will even work for a Lisp-like language!

Then try an infix one. You'd use the same tokenizer, but the 
parser is different... and this kind of parser gets you started 
for a typical script language.


2 + 3

The way this works is you read the token, peek ahead, create an 
object and build a tree. You'd use different functions for 
different contexts. So it might start with readExpression, which 
calls readFactor. Then readFactor might call readAddend...


If you look at the D grammar: 
http://dlang.org/spec/grammar.html you'll find the various 
terms are defined as WhateverExpressions and often 
recursively...


you can write the parser to follow that basically the same way! 
You end up with one of these: 
https://en.wikipedia.org/wiki/Recursive_descent_parser


Once you get addition and multiplication working with correct 
order of operations, you just kinda start adding stuff! Make a 
function call and an if/loop statement and boom, you have a 
simple programming language.


After that, it is basically just adding more token recognition 
and AST classes.




To make an interpreter, you can just add a method to the AST 
objects that interprets and gives a result, and boom, it works! 
Compiling is basically the same idea, just spitting out 
something other than the result of the expression - spitting 
out code that gives you the result. That gets harder to get 
into all the fancy techniques, but it builds on the same 
foundation.




It is a good thing to know how to do, at least the basic parts!


I agree. It's good to know how to do it. But don't get me 
started, else I'll have a new obsession ... ;)


But seriously, would you like to work on something like DScript. 
Your scripting language already fulfills things that were on my 
wishlist (easy D interop).


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Jonathan M Davis via Digitalmars-d
On Friday, June 10, 2016 07:45:03 Ola Fosheim Grøstad via Digitalmars-d wrote:
> On Friday, 10 June 2016 at 05:37:37 UTC, Jonathan M Davis wrote:
> > I assume that you're not from the US?
>
> Right, I am in Oslo (Norway).
>
> > In the US at least, professional programmers are almost always
> > referred to officially as software engineers (though they use
> > the term programmers informally all the time), whereas the
> > terms computer science and computer scientist are generally
> > reserved for academics
>
> Well, I don't know what is "official". Some norwegian companies
> seem to use convoluted "international business" terminology for
> everything, which is just weird and "snobbish". I think "system
> developer" ("systemutvikler") is the broad general term here.

Well, I meant official as in what someone's job title would be. Most
developers have titles like "Software Engineer" or "Senior Software
Engineer." They're frequently called programmers and/or software developers
when not talking about titles.

> So you can be an "informatiker" (broad term for your education)
> with an education in the fields of "computer science" and
> "software engineering", and work in the role of a "system
> developer".
>
> If you have a bachelor that fulfills the requirements for
> starting on a comp.sci. master then you are a computer scientist,
> but if you have a bachelor that doesn't and focus more on
> practical computing then you are a software engineer?
>
> You can have an education that is all about discrete math and
> still be a computer scientist. You couldn't then say you have a
> bachelor in software engineering, as it would be wrong. Likewise,
> you can have a bachelor in software engineering and barely know
> anything about complexity theory.

Yeah. Most universities in the US have a Computer Science degree, but some
have Software Engineering as a separate degree. My college had Computer
Science, Software Engineering, and Computer Engineering, which is atypical. All
of them took practical courses, but the SE guys didn't have to take some of
the more theoretical stuff and instead took additional classes focused on
working on projects in teams and whatnot. And CPE was basically a hybrid
between Computer Science and Electrical Engineering with an aim towards
embedded systems. But all of them had more of a practical focus than is the
case at many schools, because the school's motto is "learn by doing," and
they focus a lot on the practical side of things, whereas many Computer
Science programs suffer from not enough practical skills being taught. The
college in the city where I lived for my last job is notoriously bad at
teaching their Computer Science students much in the way of practical
skills.

I think that it's by far the most typical though that someone gets a degree
in Computer Science (with varying degrees of practical skills involved) and
then takes a job as a Software Engineer. And if you got a good focus on
practical skills in school in addition to the theory, then you went to a good
school, whereas some schools do a very poor job on the practical side of
things.

> > And while the term informatics (or very similar terms) are used
> > in several other languages/countries, I've never heard the term
> > used in the US except to mention that some other
> > languages/countries use the term informatics for computer
> > science, and I'm willing to bet that relatively few programmers
> > in the US have ever even heard the term informatics.
>
> Yes, but it makes sense to distinguish between "computer science"
> (the timeless math and concepts behind computing) and "software
> engineering" (contemporary development methodology and practice).
> Although I think an education should cover both. "Informatics"
> just covers it all (as an educational field).

Agreed. A good education covers both the theoretical stuff and the practical
stuff, and some schools do distinguish based on what the focus of their
program is, but in the US at least, it's very common to have a Computer
Science program and less common to have a Software Engineering program
(though I think that Software Engineering degrees are becoming more common).

- Jonathan M Davis




Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Ola Fosheim Grøstad via Digitalmars-d

On Friday, 10 June 2016 at 15:03:30 UTC, jmh530 wrote:

On Friday, 10 June 2016 at 14:25:37 UTC, Adam D. Ruppe wrote:


To make an interpreter, you can just add a method to the AST 
objects that interprets and gives a result, and boom, it works!


Given my limited knowledge of compilers/interpreters, this part 
kind of seems like magic.


Let's say you have something simple like 1+2, you would build 
an AST that looks something like

   +
  / \
 1   2
What would be the next step?


https://en.wikipedia.org/wiki/Tree_traversal#Post-order


Re: [OT] Re: Andrei's list of barriers to D adoption

2016-06-10 Thread ketmar via Digitalmars-d

On Friday, 10 June 2016 at 15:03:30 UTC, jmh530 wrote:

On Friday, 10 June 2016 at 14:25:37 UTC, Adam D. Ruppe wrote:


To make an interpreter, you can just add a method to the AST 
objects that interprets and gives a result, and boom, it works!


Given my limited knowledge of compilers/interpreters, this part 
kind of seems like magic.


Let's say you have something simple like 1+2, you would build 
an AST that looks something like

   +
  / \
 1   2
What would be the next step?


1. this is heavily OT. ;-)
2. you may take a look at my gml engine. it has clearly separated 
language parser and AST builder (gaem.parser), and AST->VM 
compiler (gaem.runner/compiler.d).


[OT] Re: Andrei's list of barriers to D adoption

2016-06-10 Thread jmh530 via Digitalmars-d

On Friday, 10 June 2016 at 14:25:37 UTC, Adam D. Ruppe wrote:


To make an interpreter, you can just add a method to the AST 
objects that interprets and gives a result, and boom, it works!


Given my limited knowledge of compilers/interpreters, this part 
kind of seems like magic.


Let's say you have something simple like 1+2, you would build an 
AST that looks something like

   +
  / \
 1   2
What would be the next step?


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Adam D. Ruppe via Digitalmars-d

On Friday, 10 June 2016 at 11:11:49 UTC, Chris wrote:
Nice. Anyone interested in turning this into "DScript"? Having 
a scripting language powered by D would also boost D's 
prestige. And it would be easy to write modules in pure D.


So I use my toy thing from time to time and it is pretty cool. My 
favorite part (and the reason I made it) is the easy interop with 
D itself: you basically just assign your D functions and values 
to a global object and get them out via the same var type - in D!


var globals = var.emptyObject;
globals.write = &(writeln!string);
var result = interpret(your_script_string, globals);
writeln(result);


where the script string looks like:

write("Hi!");
10 + 3 * 4;


and it will work:

$ dmd test.d arsd/script.d arsd/jsvar.d
$ ./test
Hi!
22



So really easy to use in all three ways: D interop is easy, the 
script lang itself is easy, and compiling it is easy, it is just 
the two modules.



I've even done a bit of GUI and DOM wrapping with it and my 
simpledisplay.d and dom.d in toys... a surprisingly big chunk of 
things just work.




The downside though is that it is something I basically slapped 
together in a weekend to support var.eval on a lark... it has a 
few weird bugs and the code is no longer beautiful as it has 
grown organically, and it isn't very fast, it is a simple AST 
interpreter that makes liberal use of new objects in D (even like 
a null object is allocated on the D side), but it is REALLY easy 
to use and coupled with native D functions for real work, it 
might just be interesting enough to play with.


tho idk if I'd recommend it for serious work. Just use D for that!


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Adam D. Ruppe via Digitalmars-d

On Friday, 10 June 2016 at 13:55:28 UTC, Chris wrote:
I have neither time nor the required expertise to write a 
scripting language from scratch ;) You on the other hand ... :-)


Oh, it isn't that hard, at least to do a quick basic thing. You 
might want to start with the various math parsers. A postfix one 
is relatively easy:


2 3 +

break it up into tokens, read them in, build a syntax tree (well, 
for the postfix thing, it is probably a stack!).


That approach will even work for a Lisp-like language!

Then try an infix one. You'd use the same tokenizer, but the 
parser is different... and this kind of parser gets you started 
for a typical script language.


2 + 3

The way this works is you read the token, peek ahead, create an 
object and build a tree. You'd use different functions for 
different contexts. So it might start with readExpression, which 
calls readFactor. Then readFactor might call readAddend...


If you look at the D grammar: http://dlang.org/spec/grammar.html 
you'll find the various terms are defined as WhateverExpressions 
and often recursively...


you can write the parser to follow that basically the same way! 
You end up with one of these: 
https://en.wikipedia.org/wiki/Recursive_descent_parser


Once you get addition and multiplication working with correct 
order of operations, you just kinda start adding stuff! Make a 
function call and an if/loop statement and boom, you have a 
simple programming language.


After that, it is basically just adding more token recognition 
and AST classes.




To make an interpreter, you can just add a method to the AST 
objects that interprets and gives a result, and boom, it works! 
Compiling is basically the same idea, just spitting out something 
other than the result of the expression - spitting out code that 
gives you the result. That gets harder to get into all the fancy 
techniques, but it builds on the same foundation.




It is a good thing to know how to do, at least the basic parts!
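
The postfix step above really is just a stack. A sketch in D (ints and the + and * operators only, splitting tokens on whitespace):

```d
import std.array : split;
import std.conv : to;

// Evaluate a postfix expression like "2 3 +": numbers are pushed,
// operators pop two operands and push the result.
int evalPostfix(string src) {
    int[] stack;
    foreach (tok; src.split) {
        switch (tok) {
            case "+":
                stack[$ - 2] = stack[$ - 2] + stack[$ - 1];
                stack.length -= 1;
                break;
            case "*":
                stack[$ - 2] = stack[$ - 2] * stack[$ - 1];
                stack.length -= 1;
                break;
            default:
                stack ~= tok.to!int; // a number token
        }
    }
    return stack[0];
}

void main() {
    assert(evalPostfix("2 3 +") == 5);
    assert(evalPostfix("2 3 + 4 *") == 20);
}
```

From there, an infix recursive descent parser reuses the same tokenizer but builds a tree instead of evaluating on the spot.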


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Kagamin via Digitalmars-d

On Friday, 10 June 2016 at 08:29:50 UTC, Yura wrote:
Another thing where I make almost all my mistakes is array 
bounds errors / accessing memory which was already freed => the 
result is undefined behavior. If I remember correctly, D is 
better in that respect?


I think slices and automatic bounds checking are the most important 
improvements of D over C. An important concern in simulations 
(mentioned by one using D in bioinformatics) is correctness: if 
you have a bug, the program is not guaranteed to crash, it can 
just give an incorrect result.
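
The difference is easy to demonstrate: out-of-bounds indexing on a D slice throws a RangeError (in non-release builds) instead of silently reading garbage as in C:

```d
import core.exception : RangeError;
import std.exception : assertThrown;

void main() {
    int[] a = [1, 2, 3];

    // Slices carry their length, so this is caught at runtime
    // rather than being undefined behavior.
    assertThrown!RangeError(a[5]);

    // Slicing operations are bounds-checked too.
    auto b = a[1 .. 3];
    assert(b == [2, 3]);
}
```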


Anyway, super easy way to use all C libraries + super active 
support would clearly help to increase D popularity/adoption. 
All other points I raised are perhaps not that important.


I'm not as optimistic about binding C libraries as others :) I 
think it requires skills.


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread ketmar via Digitalmars-d

On Friday, 10 June 2016 at 13:55:28 UTC, Chris wrote:
I have neither time nor the required expertise to write a 
scripting language from scratch ;) You on the other hand ... :-)


so just use Adam's code as the starting point then! ;-)


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Chris via Digitalmars-d

On Friday, 10 June 2016 at 11:20:35 UTC, ketmar wrote:

On Friday, 10 June 2016 at 11:11:49 UTC, Chris wrote:
Nice. Anyone interested in turning this into "DScript"? Having 
a scripting language powered by D would also boost D's 
prestige. And it would be easy to write modules in pure D.


DScript could then be used by scientists, game developers etc. 
à la Python and if speed is crucial, just write a module in 
pure D.


it looks like you just described a project you can start 
yourself! ;-)


I have neither time nor the required expertise to write a 
scripting language from scratch ;) You on the other hand ... :-)


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Russel Winder via Digitalmars-d
On Tue, 2016-06-07 at 11:21 +, ketmar via Digitalmars-d wrote:
> On Tuesday, 7 June 2016 at 11:11:31 UTC, Russel Winder wrote:
> > On Tue, 2016-06-07 at 09:55 +, ketmar via Digitalmars-d 
> > wrote:
> > > 
> > […]
> > > considering that the whole package, including dlangUI, is one 
> > > man work... give it a chance! ;-)
> > 
> > A project starting as a one person thing is quite natural, a 
> > project aiming to gain traction remaining a one person project 
> > is a dead end siding.
> 
> not everybody is good at promoting their work. yes, this skill is 
> required to make your project wide-known (and then wide-used), 
> but... this is where other people can step in to help. i'm sux in 
> promoting things too, so i'm doing as much as i can: mentioning 
> the project occasionally here and there.

My point was not so much a direct marketing one more an indirect one:
if a project is claiming to be a production thing usable by all and
sundry but is a one-person project, then it isn't actually production
ready. It may actually be production ready, but it will not be
perceived as that: person under a bus scenario and all that.

-- 

Russel.
=
Dr Russel Winder  t: +44 20 7585 2200   voip: sip:russel.win...@ekiga.net
41 Buckmaster Roadm: +44 7770 465 077   xmpp: rus...@winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder

signature.asc
Description: This is a digitally signed message part


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread ketmar via Digitalmars-d

On Friday, 10 June 2016 at 11:11:49 UTC, Chris wrote:
Nice. Anyone interested in turning this into "DScript"? Having 
a scripting language powered by D would also boost D's 
prestige. And it would be easy to write modules in pure D.


DScript could then be used by scientists, game developers etc. 
à la Python and if speed is crucial, just write a module in 
pure D.


it looks like you just described a project you can start 
yourself! ;-)


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Chris via Digitalmars-d

On Friday, 10 June 2016 at 11:05:27 UTC, ketmar wrote:
Adam has scripting engine in his arsd repo[1]. it's not a speed 
demon, but it is much more like JS, it even has exceptions, and 
it is very easy to integrate it with D code. you may take a 
look at it too. afair, you only need jsvar.d and script.d 
modules to use it.


[1] https://github.com/adamdruppe/arsd


Nice. Anyone interested in turning this into "DScript"? Having a 
scripting language powered by D would also boost D's prestige. 
And it would be easy to write modules in pure D.


DScript could then be used by scientists, game developers etc. à 
la Python and if speed is crucial, just write a module in pure D.


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread ketmar via Digitalmars-d

On Friday, 10 June 2016 at 10:55:42 UTC, Chris wrote:
Cool. I'd love to see `DScript` one day - and replace JS once 
and for all ... well ... just day dreaming ...


Adam has scripting engine in his arsd repo[1]. it's not a speed 
demon, but it is much more like JS, it even has exceptions, and 
it is very easy to integrate it with D code. you may take a look 
at it too. afair, you only need jsvar.d and script.d modules to 
use it.


[1] https://github.com/adamdruppe/arsd


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Chris via Digitalmars-d

On Friday, 10 June 2016 at 10:21:07 UTC, ketmar wrote:


i have several of them, actually. yet they are very specialized 
— i.e. designed to support my game engines, not to be 
"wide-area scripting languages".


publicly accessible are:

DACS[1] — statically typed, with modules and UFCS, and JIT 
compiler built with LibJIT[2], optionally supports internal 
stack-based VM.


GML[3] — part of my WIP Game Maker 8 emulator, register-based 
3-op VM, no JIT.



[1] http://repo.or.cz/dacs.git
[2] https://www.gnu.org/software/libjit/
[3] http://repo.or.cz/gaemu.git


Cool. I'd love to see `DScript` one day - and replace JS once and 
for all ... well ... just day dreaming ...


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread ketmar via Digitalmars-d

On Friday, 10 June 2016 at 10:03:29 UTC, Chris wrote:
A scripting language based on D? Is it open source? I've always 
dreamt of something like that.


i have several of them, actually. yet they are very specialized — 
i.e. designed to support my game engines, not to be "wide-area 
scripting languages".


publicly accessible are:

DACS[1] — statically typed, with modules and UFCS, and JIT 
compiler built with LibJIT[2], optionally supports internal 
stack-based VM.


GML[3] — part of my WIP Game Maker 8 emulator, register-based 
3-op VM, no JIT.



[1] http://repo.or.cz/dacs.git
[2] https://www.gnu.org/software/libjit/
[3] http://repo.or.cz/gaemu.git


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Chris via Digitalmars-d

On Friday, 10 June 2016 at 09:46:11 UTC, ketmar wrote:

On Friday, 10 June 2016 at 09:35:32 UTC, Chris wrote:
And also, always use ldc or gdc, once your project is ready to 
go. dmd generated code is slow as it is only a reference 
compiler.


but not *dog* *slow*. ;-) if one doesn't really need to squeeze 
every possible cycle out of CPU, DMD-generated code is more 
than acceptable. i, for example, managed to make my scripting 
language almost as efficient with DMD -O as Lua with gcc -O3. 
;-)


No not slow slow. Even in debugging mode it produces acceptable 
results for testing.


A scripting language based on D? Is it open source? I've always 
dreamt of something like that.


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread ketmar via Digitalmars-d

On Friday, 10 June 2016 at 09:35:32 UTC, Chris wrote:
And also, always use ldc or gdc, once your project is ready to 
go. dmd generated code is slow as it is only a reference 
compiler.


but not *dog* *slow*. ;-) if one doesn't really need to squeeze 
every possible cycle out of CPU, DMD-generated code is more than 
acceptable. i, for example, managed to make my scripting language 
almost as efficient with DMD -O as Lua with gcc -O3. ;-)


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Chris via Digitalmars-d
Also, always use ldc or gdc once your project is ready to 
go. dmd-generated code is slower, as dmd is only the reference compiler.


http://dlang.org/download.html


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Chris via Digitalmars-d

On Thursday, 9 June 2016 at 16:44:23 UTC, Yura wrote:

Hello,

I have to stress I am a beginner in programming, mainly 
interested in number crunching in academia (at least so far). I 
started to write a small project in D, but had to switch to C 
for a few reasons:


1) Importance for my CV. I know Python; if I also add C, it 
sounds good and could be useful, since the C language is, apart 
from the other reasons, popular and could help me with a future 
job, both in academia and industry, since there are many C/C++ 
projects.


I wouldn't worry too much about the CV. Maybe in a year or two 
companies will demand you know Ruby or Javascript :) Once you 
know how to program, it's not so hard to pick up other languages. 
The basic concepts of handling / mapping data are always the same 
(hash tables, arrays ...)


2) The libraries - in the scientific world you can find 
practically everything already coded in C => many C libraries. 
Linking them for use within D code requires some work/effort, 
and since I am not that confident in my IT skills, I decided 
that C code calling C libraries is much safer.


It's a piece of cake most of the time; it's really easy.[1] When 
I first tried it, I couldn't believe that it was _that_ simple. I 
use some C code too in one of my projects, and it's easy to either 
call individual C functions or, if need be, turn a C 
header file into a D interface file with only a few changes (they 
will look almost identical).


There is absolutely no reason not to use D because of existing C 
libraries. The seamless interfacing to C is one of D's major 
advantages. In this way you can write elegant D code and still 
take advantage of the plethora of C libraries that are available.
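For what it's worth, here is a minimal sketch of what "seamless" means in practice (strlen is the real libc function; the rest is just illustration): you declare the C signature under extern(C) and call it directly, with no binding generator involved.

```d
// Declare the C function's signature; the linker resolves it from libc.
extern(C) size_t strlen(const(char)* s);

void main()
{
    import std.stdio : writeln;
    // D string literals are zero-terminated, so a literal can be
    // passed directly to a C function expecting const char*.
    writeln(strlen("hello from D")); // prints 12
}
```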


Here are examples of C libraries that have D interfaces:

https://github.com/D-Programming-Deimos?page=1

If you need help, you can always ask on the forum. Nobody will 
bite you :-)


There are even D frameworks that enable you to interact with 
Python [2] and Lua. I'd say give it a shot.


[1] http://dlang.org/spec/interfaceToC.html
[2] https://github.com/ariovistus/pyd

Other links:

http://dlang.org/ctod.html

http://dlang.org/articles.html


3) For C - a lot of tutorials, everything has been explained on 
Stack Overflow many times, a huge community of people. E.g. you 
want to use OpenMP or Open MPI - everything is there, explained 
many times, etc.


4) The C language is well tested and rock solid stable. 
However, if you encounter a potential bug in D, I am not sure 
how long it would take to fix.


5) Garbage collector - it will slow my number crunching down.


You should test it first; gut feeling is not proof. If it 
really does slow down your code, write GC-free code, as ketmar 
suggested. You can always ask on the .learn forum.
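In that spirit, a rough sketch of how one might measure the GC's impact on a hot loop (the loop and the iteration count are made up for illustration; only the GC.disable mechanics come from druntime):

```d
import core.memory : GC;
import core.time : MonoTime;
import std.stdio : writeln;

long churn(size_t n)
{
    long sum = 0;
    foreach (i; 0 .. n)
    {
        auto a = new int[](16);   // GC allocation inside the hot loop
        a[0] = cast(int) i;
        sum += a[0];
    }
    return sum;
}

void main()
{
    auto t0 = MonoTime.currTime;
    churn(100_000);
    writeln("GC on:  ", MonoTime.currTime - t0);

    GC.disable();                 // allocation still works; collections don't run
    scope (exit) GC.enable();
    t0 = MonoTime.currTime;
    churn(100_000);
    writeln("GC off: ", MonoTime.currTime - t0);
}
```

The absolute numbers are machine-dependent; the point is only that the comparison takes a few lines to set up.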


Please, do not take this as criticism. I like the D language; 
I tried it before C and I find it much, much easier and more 
user friendly. I feel it is more similar to Python. On the other 
hand, C++ is too complex for me, and D would be the perfect 
option for the scientific community if the above points were 
fixed somehow..


Best of luck with your work!





Re: Andrei's list of barriers to D adoption

2016-06-10 Thread ketmar via Digitalmars-d

On Friday, 10 June 2016 at 08:29:50 UTC, Yura wrote:

something tells me that GC would slow you down
because in this field people are fighting for every
percent of the performance (many simulations are
running for weeks).


yes, GC will have an effect on such things. but then, people 
fighting for performance will resort to "dirty tricks" in any 
language, i believe, and in D it is not that hard to avoid the GC 
(i'm doing something like that in my videogame and sound 
engines). but the great thing — at least for me — is that you can 
easily prototype your solution with the GC, debug it, and then 
gradually replace data structures with ones that do manual 
memory management. this way you can debug your data 
structures and algorithms independently.


of course, it is possible in C and C++ too, but i found that with 
D it is somewhat easier.
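A tiny sketch of that "gradually replace" workflow (all names here are invented, not from any real engine): the same container API backed first by the GC, then by malloc/realloc/free, so calling code stays unchanged while the internals migrate.

```d
import core.stdc.stdlib : free, realloc;

struct GcBuf          // prototype: let the GC own the memory
{
    int[] data;
    void push(int v) { data ~= v; }
    int opIndex(size_t i) { return data[i]; }
}

struct ManualBuf      // later: drop-in @nogc replacement
{
    int* ptr;
    size_t len, cap;
    @nogc nothrow:
    void push(int v)
    {
        if (len == cap)
        {
            cap = cap ? cap * 2 : 8;
            ptr = cast(int*) realloc(ptr, cap * int.sizeof);
        }
        ptr[len++] = v;
    }
    int opIndex(size_t i) { return ptr[i]; }
    ~this() { free(ptr); }
}

void main()
{
    GcBuf g;
    ManualBuf m;
    foreach (i; 0 .. 100) { g.push(i); m.push(i); }
    assert(g[99] == 99 && m[99] == 99);  // same API, different ownership
}
```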



Another point is linking the libraries: with my poor
background in IT, even linking a C library to
C code is a challenge, and something tells me that
linking it to D would be even more challenging


i found that D is better here too, it just requires some... 
discipline. thanks to UFCS, one can write D code that *looks* 
like OOP, but actually only uses structs and free functions. this 
way you can use `extern(C)` on your public API, and still use 
`myvar.myfunc()` syntax in D, but have C-ready `myfunc()` 
syntax to export. also, with some CTFE magic one can even 
generate such wrappers at compile time.
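Roughly what that looks like (the identifiers are invented for illustration): the public API is plain extern(C) free functions, yet D callers still get method-call syntax through UFCS.

```d
extern(C) struct Engine { int volume; }

// C-ready flat API: a C header would declare these same symbols.
extern(C) void engine_set_volume(Engine* e, int v) { e.volume = v; }
extern(C) int  engine_get_volume(const(Engine)* e) { return e.volume; }

void main()
{
    Engine e;
    auto p = &e;
    // UFCS rewrites p.engine_set_volume(7) into engine_set_volume(p, 7),
    // so the D side reads like OOP while the ABI stays plain C.
    p.engine_set_volume(7);
    assert(p.engine_get_volume() == 7);
}
```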


Another thing where I make almost all my mistakes: array 
bounds violations / accessing memory which was already freed => 
the result is undefined behavior. If I remember correctly, D is 
better in that respect?


yes. with the GC, you won't hit "use after free" errors. and D 
does bounds checking on array access (this can be turned off once 
you have debugged your code), so you will get a stack trace on 
RangeError.
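A small sketch of the difference (catching Error is normally discouraged; it is done here only to demonstrate the check):

```d
import core.exception : RangeError;

void main()
{
    auto a = new int[](3);
    bool caught = false;
    try
        a[5] = 1;              // silent memory corruption in C; checked in D
    catch (RangeError e)
        caught = true;
    assert(caught);
    // For release builds the checks can be dropped with -boundscheck=off,
    // or kept only in @safe code with -boundscheck=safeonly.
}
```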


Anyway, a super easy way to use all C libraries + super active 
support would clearly help to increase D popularity/adoption.


and as for C libraries... i'm actively using a lot of C libs in my 
D projects, and most of the time i do wrappers for them with sed. 
;-) i.e. i'm just taking a C header file, running some regular 
expression replaces on it, and then doing some manual cleanup. that 
is, even without specialized tools one is able to produce a 
wrapper with a very small time and effort investment.


tbh, i even translated some C libraries to D mostly with sed. for 
example, enet and libotr.


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Yura via Digitalmars-d

On Friday, 10 June 2016 at 06:37:08 UTC, ketmar wrote:

On Friday, 10 June 2016 at 06:25:55 UTC, ketmar wrote:

On Thursday, 9 June 2016 at 16:44:23 UTC, Yura wrote:

4) The C language is well tested and rock solid stable.


loool.


ah, sorry, let me explain myself. i hit a LOT of gcc bugs in my 
life, and i never fixed any of them myself, 'cause gcc is very 
huge and i don't feel it's worth it. even with bugs that 
blocked my work i used workarounds and hand-written asm.


i hit some bugs in D too. curiously, the numbers are comparable 
with gcc (maybe this tells us something, and maybe it is just a 
coincidence). some of them i was able not only to report, but to 
fix. usually the official fix comes later and is better than my 
hacky patch, but hey... the DMD compiler is less complex than gcc, 
*a lot* less complex.


now, why i loled: i thought about what you wrote, and found 
that gcc bugs block my work/pet projects more often than dmd 
bugs. it may sound strange, but a dmd bug is usually either 
fixed fast enough (and building a new dmd+druntime+phobos from 
sources takes less than two minutes on my PC), or i know a 
workaround.


contrary to that, even if a gcc bug is fixed fast (and usually 
they aren't), rebuilding gcc takes 20‒30 minutes. and most of 
the time i can't even understand what the fix does, due to the 
huge and unknown codebase.


so no, the C language is not "rock solid stable". it just has a 
lot fewer features, and if you use the same feature set in DMD, 
you will hardly hit a bug either.


Thanks all of you for the constructive discussion. I am a chemist 
studying programming by myself, since I need it to explore 
chemistry at the molecular level and to check my chemical ideas. 
A professional software engineer (SE) / computer scientist (CS) 
would do the job much faster, and the resulting code would look 
much better - but to do that you need to explain all the 
chemistry behind it to the SE/CS, which would be tricky, and the 
most important (drastic) approximations come exactly from chemistry - 
not from the particular language. I hope you excuse me for the 
long introduction. In my area there are 3 languages dominating: 
Python, Fortran and C/C++. The first is easy (many libraries are 
available), but can be very slow. Fortran is used by the old 
professors and has tons of libraries, but is not used outside of 
academia - and this stops young people from studying it, because 
everyone may quit academia at some point. C and C++ perhaps 
dominate the field, but especially the latter is very tough 
for people coming from a non-IT background.


I believe D has very high chances to be competitive in this 
field. Regarding the GC, I will try to check it when I have some 
time, but since most of the codes are written in non-GC languages 
(https://en.wikipedia.org/wiki/List_of_quantum_chemistry_and_solid-state_physics_software), 
something tells me that the GC would slow you down, because in 
this field people are fighting for every percent of performance 
(many simulations run for weeks).

Another point is linking the libraries: with my poor background 
in IT, even linking a C library to C code is a challenge, and 
something tells me that linking it to D would be even more 
challenging => making the linking of C libraries as easy as 
possible (Fortran or C++ libraries are not as important) and 
having an active support forum where you can ask for help in 
linking libraries to your D code would be helpful. As people get 
this support, they become confident enough to start their new 
projects in D. Then, at conferences / in scientific papers, 
people can advertise and promote the language, which is more user 
friendly than C, Fortran and C++, and is modern enough. However, 
perhaps enthusiasm alone is not sufficient for all this; you need 
sponsors... I agree the C subset for sure guarantees (almost) the 
absence of bugs.

Another thing where I make almost all my mistakes: array bounds 
violations / accessing memory which was already freed => the 
result is undefined behavior. If I remember correctly, D is 
better in that respect?

Anyway, a super easy way to use all C libraries + super active 
support would clearly help to increase D popularity/adoption. All 
the other points I raised are perhaps not that important.


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread Ola Fosheim Grøstad via Digitalmars-d

On Friday, 10 June 2016 at 05:37:37 UTC, Jonathan M Davis wrote:

I assume that you're not from the US?


Right, I am in Oslo (Norway).

In the US at least, professional programmers are almost always 
referred to officially as software engineers (though they use 
the term programmers informally all the time), whereas the 
terms computer science and computer scientist are generally 
reserved for academics


Well, I don't know what is "official". Some Norwegian companies 
seem to use convoluted "international business" terminology for 
everything, which is just weird and "snobbish". I think "system 
developer" ("systemutvikler") is the broad general term here.


So you can be an "informatiker" (broad term for your education) 
with an education in the fields of "computer science" and 
"software engineering", and work in the role of a "system 
developer".


If you have a bachelor's that fulfills the requirements for 
starting on a comp.sci. master's, then you are a computer scientist; 
but if you have a bachelor's that doesn't, and focuses more on 
practical computing, then you are a software engineer?


You can have an education that is all about discrete math and 
still be a computer scientist. You couldn't then say you have a 
bachelor's in software engineering, as it would be wrong. Likewise, 
you can have a bachelor's in software engineering and barely know 
anything about complexity theory.



And while the term informatics (or very similar terms) are used 
in several other languages/countries, I've never heard the term 
used in the US except to mention that some other 
languages/countries use the term informatics for computer 
science, and I'm willing to bet that relatively few programmers 
in the US have ever even heard the term informatics.


Yes, but it makes sense to distinguish between "computer science" 
(the timeless math and concepts behind computing) and "software 
engineering" (contemporary development methodology and practice). 
Although I think an education should cover both. "Informatics" 
just covers it all (as an educational field).




Re: Andrei's list of barriers to D adoption

2016-06-10 Thread ketmar via Digitalmars-d

On Friday, 10 June 2016 at 06:25:55 UTC, ketmar wrote:

On Thursday, 9 June 2016 at 16:44:23 UTC, Yura wrote:

4) The C language is well tested and rock solid stable.


loool.


ah, sorry, let me explain myself. i hit ALOT of gcc bugs in my 
life. and i never fixed any of them myself, 'cause gcc is very 
huge, and i don't feel that it worth it. even with bugs that 
blocked my work i used workarounds and hand-written asm.


i hit some bugs in D too. curiously, it comparable with gcc in 
numbers (maybe this tells us something, and maybe it is just a 
coincidence). some of them i was able not only report, but fix. 
usually, official fix comes later, and was better than mine hacky 
patch, but hey... DMD compiler is less complex than gcc, *alot* 
less complex.


now, why i loled: i thinked about what you wrote, and found that 
gcc bugs blocks my work/pet projects more often than dmd bugs. it 
may sounds strange, but dmd bug is usually either fixed fast 
enough (and building new dmd+druntime+phobos from sources takes 
less than two minutes on my PC), or i know a workaround.


contrary to that, even if gcc bug was fixed fast (and usually 
they don't), rebuilding gcc takes 20‒30 minutes. and most of the 
time i can't even understand what fix does, due to huge and 
unknown codebase.


so no, C languange is not "rock solid stable". it just has alot 
less features, and if you will use the same feature set in DMD, 
you will hardly hit a bug too.


Re: Andrei's list of barriers to D adoption

2016-06-10 Thread ketmar via Digitalmars-d

On Thursday, 9 June 2016 at 16:44:23 UTC, Yura wrote:

4) The C language is well tested and rock solid stable.


loool.


Re: Andrei's list of barriers to D adoption

2016-06-09 Thread Jonathan M Davis via Digitalmars-d
On Friday, June 10, 2016 02:38:28 Ola Fosheim Grøstad via Digitalmars-d wrote:
> On Thursday, 9 June 2016 at 21:54:05 UTC, Jack Stouffer wrote:
> > On Thursday, 9 June 2016 at 21:46:28 UTC, Walter Bright wrote:
> >> Programming is a mix of engineering and craft. There are
> >> people who do research into programming theory, and those are
> >> computer scientists. I'm not one of them. Andrei is.
> >
> > Unfortunately, the term "software engineer" is a LOT less
> > popular than "computer scientist".
>
> How so? I only hear people use the term "programmer" or
> "informatics".

I assume that you're not from the US?

In the US at least, professional programmers are almost always referred to
officially as software engineers (though they use the term programmers
informally all the time), whereas the terms computer science and computer
scientist are generally reserved for academics. And while the term
informatics (or very similar terms) are used in several other
languages/countries, I've never heard the term used in the US except to
mention that some other languages/countries use the term informatics for
computer science, and I'm willing to bet that relatively few programmers in
the US have ever even heard the term informatics.

- Jonathan M Davis




Re: Andrei's list of barriers to D adoption

2016-06-09 Thread Ola Fosheim Grøstad via Digitalmars-d

On Thursday, 9 June 2016 at 21:54:05 UTC, Jack Stouffer wrote:

On Thursday, 9 June 2016 at 21:46:28 UTC, Walter Bright wrote:
Programming is a mix of engineering and craft. There are 
people who do research into programming theory, and those are 
computer scientists. I'm not one of them. Andrei is.


Unfortunately, the term "software engineer" is a LOT less 
popular than "computer scientist".


How so? I only hear people use the term "programmer" or 
"informatics".


Computer Science -> pure math / classification / concepts.

Software Engineering -> the process of developing software.

At my uni we had the term "informatics", which covers 
comp.sci., software engineering, requirements analysis, human 
factors, etc. But it IS possible to be a computer scientist and 
only know math and no actual programming. Not common, but 
possible.


But yes, sometimes people who have not studied compsci, but have 
only read stuff on Wikipedia, engage in debates as if they knew the 
topic, and then the distinction matters. There are things you 
never have to explain to a person who knows compsci, but you 
almost always have trouble explaining to people who don't know it 
(but think they do, because they are programmers and have seen 
big-O notation in documentation).


It is like a car engineer listening to a driver claiming that 
you should pour oil on your brakes if they make noise. Or a 
mathematician having to explain what infinity entails. At some 
point it is easier to just make a distinction between those who 
know the fundamentals of how brakes are actually constructed, 
and those who know how to drive a car.


The core difference, as far as debates go, is that compsci is 
mostly objective (proofs) and software engineering is highly 
subjective (contextual practice).


So, if the topic is compsci, then you can usually prove that the 
other person is wrong in a step-by-step, irrefutable fashion. 
Which makes a big difference, actually. People who know compsci 
usually think that is OK, because they like to improve their 
knowledge and are used to getting things wrong (that's how you 
learn). People who don't know compsci usually don't like it, 
because they are being told that they don't know something they 
like to think they know (but actually don't and probably never 
will).


That's just the truth... ;-)



Re: Andrei's list of barriers to D adoption

2016-06-09 Thread Jack Stouffer via Digitalmars-d

On Thursday, 9 June 2016 at 21:46:28 UTC, Walter Bright wrote:
Programming is a mix of engineering and craft. There are people 
who do research into programming theory, and those are computer 
scientists. I'm not one of them. Andrei is.


Unfortunately, the term "software engineer" is a LOT less popular 
than "computer scientist".


Re: Andrei's list of barriers to D adoption

2016-06-09 Thread Walter Bright via Digitalmars-d

On 6/9/2016 1:38 PM, Jack Stouffer wrote:

On Thursday, 9 June 2016 at 18:02:05 UTC, deadalnix wrote:

You are a scientist, so try to measure. GC generally improves throughput at
the cost of latency.


As a side note, I always found it funny that programmers call themselves
"computer scientists" while many write a lot of their programs without tests.


A scientist is someone who does research to make discoveries, while an engineer 
puts scientific discoveries to work.


Programming is a mix of engineering and craft. There are people who do research 
into programming theory, and those are computer scientists. I'm not one of them. 
Andrei is.


Re: Andrei's list of barriers to D adoption

2016-06-09 Thread Walter Bright via Digitalmars-d

On 6/9/2016 9:44 AM, Yura wrote:

4) The C language is well tested and rock solid stable. However, if you
encounter a potential bug in D, I am not sure how long it would take to fix.


Thanks for taking the time to post here.

Yes, there are bugs in D. Having dealt with buggy compilers from every vendor 
for decades, I can speak from experience that almost every bug has workarounds 
that will keep the project moving.


Also, bugs in D tend to be with the advanced features. But there's a C-ish 
subset that's nearly a 1:1 correspondence to C, and if you are content with C 
style it'll serve you very well.


Re: Andrei's list of barriers to D adoption

2016-06-09 Thread deadalnix via Digitalmars-d

On Thursday, 9 June 2016 at 20:38:30 UTC, Jack Stouffer wrote:

On Thursday, 9 June 2016 at 18:02:05 UTC, deadalnix wrote:
You are a scientist, so try to measure. GC generally improves 
throughput at the cost of latency.


As a side note, I always found it funny that programmers call 
themselves "computer scientists" while many write a lot of 
their programs without tests.


A ton of computer science, even work that is peer reviewed, does 
not publish code. It's garbage...


And then you look at https://twitter.com/RealPeerReview and 
conclude it maybe isn't that bad.




Re: Andrei's list of barriers to D adoption

2016-06-09 Thread Jack Stouffer via Digitalmars-d

On Thursday, 9 June 2016 at 18:02:05 UTC, deadalnix wrote:
You are a scientist, so try to measure. GC generally improves 
throughput at the cost of latency.


As a side note, I always found it funny that programmers call 
themselves "computer scientists" while many write a lot of their 
programs without tests.


Re: Andrei's list of barriers to D adoption

2016-06-09 Thread deadalnix via Digitalmars-d

On Thursday, 9 June 2016 at 16:44:23 UTC, Yura wrote:

5) Garbage collector - it will slow my number crunching down.



You are a scientist, so try to measure. GC generally improves 
throughput at the cost of latency.




Re: Andrei's list of barriers to D adoption

2016-06-09 Thread Yura via Digitalmars-d

On Monday, 6 June 2016 at 02:20:52 UTC, Walter Bright wrote:
Andrei posted this on another thread. I felt it deserved its 
own thread. It's very important.

-
I go to conferences. Train and consult at large companies. 
Dozens every year, cumulatively thousands of people. I talk 
about D and ask people what it would take for them to use the 
language. Invariably I hear a surprisingly small number of 
reasons:


* The garbage collector eliminates probably 60% of potential 
users right off.


* Tooling is immature and of poorer quality compared to the 
competition.


* Safety has holes and bugs.

* Hiring people who know D is a problem.

* Documentation and tutorials are weak.

* There's no web services framework (by this time many folks 
know of D, but of those a shockingly small fraction has even 
heard of vibe.d). I have strongly argued with Sönke to bundle 
vibe.d with dmd over one year ago, and also in this forum. 
There wasn't enough interest.


* (On Windows) if it doesn't have a compelling Visual Studio 
plugin, it doesn't exist.


* Let's wait for the "herd effect" (corporate support) to start.

* Not enough advantages over the competition to make up for the 
weaknesses above.


Hello,

I have to stress I am a beginner in programming, mainly interested 
in number crunching in academia (at least so far). I started to 
write a small project in D, but had to switch to C for a few 
reasons:


1) Importance for my CV. I know Python; if I also add C, it 
sounds good and could be useful, since the C language is, apart 
from the other reasons, popular and could help me with a future 
job, both in academia and industry, since there are many C/C++ 
projects.


2) The libraries - in the scientific world you can find 
practically everything already coded in C => many C libraries. 
Linking them for use within D code requires some work/effort, 
and since I am not that confident in my IT skills, I decided 
that C code calling C libraries is much safer.


3) For C - a lot of tutorials, everything has been explained on 
Stack Overflow many times, a huge community of people. E.g. you 
want to use OpenMP or Open MPI - everything is there, explained 
many times, etc.


4) The C language is well tested and rock solid stable. However, 
if you encounter a potential bug in D, I am not sure how long 
it would take to fix.


5) Garbage collector - it will slow my number crunching down.

Please, do not take this as criticism. I like the D language; I 
tried it before C and I find it much, much easier and more user 
friendly. I feel it is more similar to Python. On the other hand, 
C++ is too complex for me, and D would be the perfect option for 
the scientific community if the above points were fixed somehow..


Best of luck with your work!


Re: Andrei's list of barriers to D adoption

2016-06-08 Thread Andrei Alexandrescu via Digitalmars-d

On 6/8/16 3:43 PM, Timon Gehr wrote:

On 08.06.2016 01:59, Walter Bright wrote:

...

I suspect D has long since passed the point where it is too complicated for
the rather limited ability of mathematicians to prove things about it.


The main reason why it is currently impractical to prove things about D
is that D is not really a mathematical object. I.e. there is no precise
spec.


Walter and I have spoken about the matter and reached the conclusion 
that work on a formal spec (be it in legalese, typing trees, small step 
semantics etc) on a reduced form of D would be very beneficial.


We are very much supportive of such work.


Andrei



Re: Andrei's list of barriers to D adoption

2016-06-08 Thread Ola Fosheim Grøstad via Digitalmars-d

On Wednesday, 8 June 2016 at 13:43:27 UTC, Timon Gehr wrote:

On 08.06.2016 01:59, Walter Bright wrote:

...

I suspect D has long since passed the point where it is too 
complicated for the rather limited ability of mathematicians to 
prove things about it.


The main reason why it is currently impractical to prove things 
about D is that D is not really a mathematical object. I.e. 
there is no precise spec.


Besides that, even if a @safe checker is slightly flawed, it only 
has to be vetted better than the backend, which most likely is 
unverified anyway.


This is different from some of the static analysis done on C, 
which converts the LLVM bitcode or even x86 assembly into a format 
that can be queried using a solver. That way the proof holds even 
in the case where the backend is buggy.




Re: Andrei's list of barriers to D adoption

2016-06-08 Thread Timon Gehr via Digitalmars-d

On 08.06.2016 01:59, Walter Bright wrote:

...

I suspect D has long since passed the point where it is too complicated for
the rather limited ability of mathematicians to prove things about it.


The main reason why it is currently impractical to prove things about D 
is that D is not really a mathematical object. I.e. there is no precise 
spec.


Re: Andrei's list of barriers to D adoption

2016-06-08 Thread Timon Gehr via Digitalmars-d

On 08.06.2016 02:39, Walter Bright wrote:

On 6/7/2016 4:07 PM, Andrei Alexandrescu wrote:

It is my opinion that writing off formal proofs of safety is a
mistake. Clearly
we don't have the capability on the core team to work on such.
However, I am
very interested if you'd want to lead such an effort.


On the contrary, I think a formal proof would be very valuable. I am
just skeptical of the notion that a proof is automatically correct. I've
read about mistakes being found in many published mathematical proofs. I
read somewhere that Hilbert made many mistakes in his proofs, even
though the end results turned out correct.




Mathematicians use a semi-formal style of reasoning in publications. 
Most mistakes are minor and most mathematicians don't use tools (such as 
https://coq.inria.fr/) to verify their proofs like computer scientists 
often do when proving properties of formal systems.


The focus of Mathematics isn't necessarily on verification, it is 
usually on aesthetics, understanding, communication etc. Current tools 
are not particularly strong in such areas and it is often more tedious 
to get the proof through than it should be. And certainly Hilbert didn't 
have access to anything like them.


Re: Andrei's list of barriers to D adoption

2016-06-08 Thread Andrei Alexandrescu via Digitalmars-d

On 6/8/16 1:50 PM, Timon Gehr wrote:

I'll probably do it at some point. (However, first I need to figure out
what the formal language specification should actually be, this is one
reason why I'm implementing a D compiler.)


That's very very promising. Looking forward to anything in that area! -- 
Andrei


Re: Andrei's list of barriers to D adoption

2016-06-08 Thread Timon Gehr via Digitalmars-d

On 08.06.2016 01:07, Andrei Alexandrescu wrote:

On 6/8/16 12:53 AM, Timon Gehr wrote:

On 08.06.2016 00:47, Walter Bright wrote:

On 6/7/2016 3:23 PM, Timon Gehr wrote:

Obviously they proved the virtual machine itself memory safe,


As I recall, the proof was broken, not the implementation.


Which time?


That is an old result that has essentially expired and should not be
generalized. See
http://www.seas.upenn.edu/~sweirich/types/archive/1999-2003/msg00849.html.


I think this can't be what Walter is referring to: "the type inference 
system for generic method calls was not subjected to formal proof. In 
fact, it is unsound,"


I.e. no proof, unsound.


I assume the matter has been long fixed by now, do you happen to know?
...


I don't know.

BTW, Java's type system is unsound [1].

class Unsound {
static class Bound<A, B extends A> {}
static class Bind<A> {
 <B extends A> A bad(Bound<A,B> bound, B b) {return b;}
}
public static <T,U> U coerce(T t) {
Bound<U,? super T> bound = null;
Bind<U> bind = new Bind<U>();
return bind.bad(bound, t);
}
public static void main(String[] args){
String s=coerce(0);
}
}



People do
make mistakes and overlook cases with proofs. There's nothing magical
about them.



Obviously, but there are reliable systems that check proofs
automatically.


It is my opinion that writing off formal proofs of safety is a mistake.
Clearly we don't have the capability on the core team to work on such.
However, I am very interested if you'd want to lead such an effort.


Andrei



I'll probably do it at some point. (However, first I need to figure out 
what the formal language specification should actually be, this is one 
reason why I'm implementing a D compiler.)



[1] https://www.facebook.com/ross.tate/posts/10102249566392775.


Re: Andrei's list of barriers to D adoption

2016-06-08 Thread Ola Fosheim Grøstad via Digitalmars-d

On Wednesday, 8 June 2016 at 00:39:54 UTC, Walter Bright wrote:
On the contrary, I think a formal proof would be very valuable. 
I am just skeptical of the notion that a proof is automatically 
correct. I've read about mistakes being found in many published 
mathematical proofs. I read somewhere that Hilbert made many 
mistakes in his proofs, even though the end results turned out 
correct.


Well, you cannot prevent errors in the requirements, but you can 
eliminate errors in the proof, so if the requirements are too 
complex you have a bad deal. The theorem prover is separate from 
the proof verifier. It works like this:


1. Human specifies the requirements (e.g. assert(...) in D)

2. The theorem prover takes the program + requirements + strategies 
(prodding the prover along the right track) and emits a long 
formal proof in a standard format.


3. The proof is handed to N independently implemented verifiers 
that checks the proof.


But that is impractical for a typical user-created program. You 
only want to do that once: for your backend, your type system, 
etc.


--

What you can do is, as you've stated before, transform your 
source code into a simpler form and verify that it only can lead 
to situations that are provably safe.


The advantage of this is that you also can prove that specific 
cases of pointer arithmetics are provably safe (say, fixed size 
array on the stack) thus reducing the need for @trusted.


The disadvantage is that it will slow down the compiler and make 
it more complicated, so why have it in the compiler and not as a 
separate program?


Make it a separate program so it works on uninstantiated code and 
prove libraries to be correctly marked @safe before they are 
uploaded to repositories etc.


If @safe does not affect code gen, why have it in the compiler?



Re: Andrei's list of barriers to D adoption

2016-06-08 Thread Adam Wilson via Digitalmars-d

Jack Stouffer wrote:

On Tuesday, 7 June 2016 at 13:39:19 UTC, Steven Schveighoffer wrote:

I just read elsewhere that a GSoC student is working to achieve the
goal of making the GC swappable and adding Reiner's precise scanning
GC. I consider this to be essential work, I hope we can get this
rolling soon!


https://github.com/dlang/druntime/pull/1581



Feedback is greatly appreciated! If you have an opinion on how to 
implement user-selectable Garbage Collection algorithms, please chime 
in. That said, we may not include your feedback, being a GSoC project 
means that we will need to fast-track merges so as not to block the 
student. We'll listen, but we might push that really cool idea of yours 
off until after GSoC.


--
// Adam Wilson
// import quiet.dlang.dev;


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Jonathan M Davis via Digitalmars-d
On Tuesday, June 07, 2016 21:00:06 H. S. Teoh via Digitalmars-d wrote:
> Actually, I'm not sure how much of Phobos actually depends on the GC.
> Most of the stuff I use frequently are from std.range and std.algorithm,
> and we've pretty much gotten rid of GC-dependence from most of the stuff
> there.  Phobos modules that are GC-heavy ought to be avoided in
> high-performance code anyway; the only problematic case I can think of
> being std.string which greedily allocates. But we've been making
> progress on that over the past year or so by turning many of the
> functions into range-based algorithms rather than string-specific.

As I understand it, the big problems relate to lambdas and closures and the
like. As it stands, it's way too easy to end up allocating when using stuff
like std.algorithm even though it doesn't obviously allocate. And on some
level at least, I think that that's more an issue of language improvements
than library improvements. But no, explicit use of the GC in Phobos is not
particularly heavy. Array-related stuff often allocates, and the few places
in Phobos that use classes allocate, but in general, Phobos really doesn't
do much in the way of explicit allocations. It's the implicit ones that are the
killer.
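A minimal sketch of such an implicit allocation (hypothetical function; 
the exact behavior depends on the compiler release):

```d
import std.algorithm.iteration : filter;
import std.array : array;

int[] above(int[] data, int threshold)
{
    // The lambda captures the local `threshold`, so it needs a context.
    // Because the range escapes via the return value, that context is a
    // GC-allocated closure -- an allocation nothing in the source
    // visibly requests (`.array` then allocates the result on top).
    return data.filter!(x => x > threshold).array;
}
```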

- Jonathan M Davis



Re: Andrei's list of barriers to D adoption

2016-06-07 Thread H. S. Teoh via Digitalmars-d
On Tue, Jun 07, 2016 at 07:24:55PM -0700, Jonathan M Davis via Digitalmars-d 
wrote:
> On Tuesday, June 07, 2016 19:04:06 H. S. Teoh via Digitalmars-d wrote:
> > I think far too much energy has been spent arguing for a GC-less
> > language than actually writing the code that would fix its
> > associated performance issues, and my suspicion is that this is
> > mostly caused by your typical C/C++ programmer mindset (of which I
> > used to be a part) that's always obsessed about memory management,
> > rather than any factual basis.
> 
> One of the other big issues is that many of the companies which use D
> and have been active in the community have been doing _very_ high
> performance stuff where they can't afford the GC to kick in even
> occasionally for less than a hundred milliseconds (e.g. IIRC, Manu
> considering 10ms to be too long for what they were doing at Remedy).

I'm pretty sure there are applications for which GC is verboten, such as
high-performance game engines and what-not (but even for them, it's just
the core code that needs to avoid GC; peripheral things like in-game
scripting may actually be using a forced-GC scripting language -- it's
just that you want to control certain core operations to be maximally
performant).  But these are the exceptions rather than the norm.


[...]
> We need to support GC-less code, and we need to avoid using the GC in
> Phobos stuff where it's not necessary, since it will impede the high
> performance folks otherwise.  And doing some of the stuff like turning
> off the GC in specific code like you discussed will take care of many
> of the folks that would be affected by GC issues. But for your average
> D program, I really think that it's a non-issue, and as the GC
> situation improves, it will be even less of an issue.

Actually, I'm not sure how much of Phobos actually depends on the GC.
Most of the stuff I use frequently are from std.range and std.algorithm,
and we've pretty much gotten rid of GC-dependence from most of the stuff
there.  Phobos modules that are GC-heavy ought to be avoided in
high-performance code anyway; the only problematic case I can think of
being std.string which greedily allocates. But we've been making
progress on that over the past year or so by turning many of the
functions into range-based algorithms rather than string-specific.

Lately I've been considering in my own code that a lot of things
actually don't *need* the GC. Even things like string transformations
generally don't need to allocate if they are structured to be a
range-based pipeline, and the consumer (sink) processes the data as it
becomes available instead of storing everything in intermediate buffers.
Even if you do need to allocate some intermediate buffers these are
generally well-scoped, and can be taken care of with malloc/free and
scope(exit).  The only time you really need the GC is when passing
around large recursive data structures with long lifetimes, like trees
and graphs.
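The malloc/free-with-scope(exit) pattern described above might look like 
this (a sketch with made-up names, not code from Phobos):

```d
import core.stdc.stdlib : free, malloc;

// Upper-case ASCII input into a caller-supplied sink. The intermediate
// buffer is well-scoped, so malloc/free plus scope(exit) fully manage it
// and the GC is never involved.
void upperInto(scope const(char)[] input,
               scope void delegate(const(char)[]) sink)
{
    if (input.length == 0)
        return sink(input);

    auto p = cast(char*) malloc(input.length);
    if (p is null) assert(0, "allocation failed");
    scope(exit) free(p);   // freed even if the sink throws

    auto buf = p[0 .. input.length];
    foreach (i, c; input)
        buf[i] = (c >= 'a' && c <= 'z') ? cast(char)(c - 32) : c;
    sink(buf);
}
```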


> So, to some extent, I agree that there's too much of an issue made over
> the GC - especially by folks who aren't even using the language and
> pretty much just react negatively to the idea of the GC.

As Walter has said before: people who aren't already using the language
are probably only using GC as an excuse to not use the language. They
won't necessarily start using the language after we've bent over
backwards to get rid of the GC. We should be paying attention to current
users (and yes I know Manu has been clamoring for @nogc and he's a
current user).


> But we do still need to do a better job of not requiring the GC when
> it's not actually needed as well as better supporting some of the
> GC-less paradigms like ref-counting.
[...]

IMO a lot of Phobos modules actually could become much higher-quality if
rewritten to be GC-independent.  Some of the less-frequently used
modules are GC-heavy not out of necessity but because of sloppy code
quality, or because they were written prior to major D2 breakthroughs
like ranges and templates / CT introspection.


T

-- 
The early bird gets the worm. Moral: ewww...


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Jonathan M Davis via Digitalmars-d
On Tuesday, June 07, 2016 19:04:06 H. S. Teoh via Digitalmars-d wrote:
> I think far too much energy has been spent arguing for a GC-less
> language than actually writing the code that would fix its associated
> performance issues, and my suspicion is that this is mostly caused by
> your typical C/C++ programmer mindset (of which I used to be a part)
> that's always obsessed about memory management, rather than any factual
> basis.

One of the other big issues is that many of the companies which use D and
have been active in the community have been doing _very_ high performance
stuff where they can't afford the GC to kick in even occasionally for less
than a hundred milliseconds (e.g. IIRC, Manu considering 10ms to be too long
for what they were doing at Remedy). When you start getting requirements
that are that stringent, you start needing to not use the GC. And when some
of the more visible users of D have requirements like that and raise issues
about how the GC gets in their way, then it becomes a big deal even if it's
not a problem for your average D user.

We need to support GC-less code, and we need to avoid using the GC in Phobos
stuff where it's not necessary, since it will impede the high performance
folks otherwise.  And doing some of the stuff like turning off the GC in
specific code like you discussed will take care of many of the folks that
would be affected by GC issues. But for your average D program, I really
think that it's a non-issue, and as the GC situation improves, it will be
even less of an issue.

So, to some extent, I agree that there's too much of an issue made over the GC
- especially by folks who aren't even using the language and pretty much
just react negatively to the idea of the GC. But we do still need to do a
better job of not requiring the GC when it's not actually needed as well as
better supporting some of the GC-less paradigms like ref-counting.

- Jonathan M Davis



Re: Andrei's list of barriers to D adoption

2016-06-07 Thread H. S. Teoh via Digitalmars-d
On Tue, Jun 07, 2016 at 07:00:13PM -0700, Charles Hixson via Digitalmars-d 
wrote:
> On 06/05/2016 09:17 PM, Adam D. Ruppe via Digitalmars-d wrote:
> > On Monday, 6 June 2016 at 02:30:55 UTC, Pie? wrote:
> > > Duh! The claim is made that D can work without the GC... but
> > > that's a red herring... If you take away the GC what do you have?
> > 
> > Like 90% of the language, still generally nicer than most of the
> > competition.
> > 
> > Though, I wish D would just own its decision instead of bowing to
> > Reddit pressure. GC is a proven success in the real world with a
> > long and impressive track record. Yes, there are times when you need
> > to optimize your code, but even then you aren't really worse off
> > with it than without it.
> > 
> Usually correct, but there are times when you want to suspend the
> garbage collection.  The problem is this should always be a scoped
> decision, because it's easy to accidentally leave it turned off, and
> then it's MUCH worse than not having it.

import core.memory : GC;

auto myFunc(Args...)(Args args) {
    GC.disable();
    scope(exit) GC.enable();   // re-enabled even if doStuff() throws

    doStuff();
}

On another note, I have found that strategic disabling of the GC and/or
manually triggering GC.collect() at the right times, can give your
programs a big boost in performance, typically around 20% to 50%
depending on the specifics of your memory usage patterns. Arguably this
should be automatic once D's GC is replaced with something better than
the current implementation, but the point is that performance concerns
over the GC aren't insurmountable, and the fix is often not even that
complicated.
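One concrete shape of that strategy (a sketch; note that GC.disable only 
suppresses automatic collections, so an explicit GC.collect still runs):

```d
import core.memory : GC;

void mainLoop(int frames)
{
    GC.disable();             // no collection pauses mid-frame
    scope(exit) GC.enable();

    foreach (i; 0 .. frames)
    {
        // ... per-frame work that may allocate ...

        // Collect at a point of our choosing (between frames) instead
        // of whenever an allocation happens to trigger the collector:
        if (i % 100 == 99)
            GC.collect();
    }
}
```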

I think far too much energy has been spent arguing for a GC-less
language than actually writing the code that would fix its associated
performance issues, and my suspicion is that this is mostly caused by
your typical C/C++ programmer mindset (of which I used to be a part)
that's always obsessed about memory management, rather than any factual
basis.


T

-- 
It said to install Windows 2000 or better, so I installed Linux instead.


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Charles Hixson via Digitalmars-d



On 06/05/2016 09:17 PM, Adam D. Ruppe via Digitalmars-d wrote:

On Monday, 6 June 2016 at 02:30:55 UTC, Pie? wrote:
Duh! The claim is made that D can work without the GC... but that's a 
red herring... If you take away the GC what do you have?


Like 90% of the language, still generally nicer than most of the 
competition.


Though, I wish D would just own its decision instead of bowing to 
Reddit pressure. GC is a proven success in the real world with a long 
and impressive track record. Yes, there are times when you need to 
optimize your code, but even then you aren't really worse off with it 
than without it.


Usually correct, but there are times when you want to suspend the 
garbage collection.  The problem is this should always be a scoped 
decision, because it's easy to accidentally leave it turned off, and 
then it's MUCH worse than not having it.


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Walter Bright via Digitalmars-d

On 6/7/2016 4:07 PM, Andrei Alexandrescu wrote:

It is my opinion that writing off formal proofs of safety is a mistake. Clearly
we don't have the capability on the core team to work on such. However, I am
very interested if you'd want to lead such an effort.


On the contrary, I think a formal proof would be very valuable. I am just 
skeptical of the notion that a proof is automatically correct. I've read about 
mistakes being found in many published mathematical proofs. I read somewhere 
that Hilbert made many mistakes in his proofs, even though the end results 
turned out correct.





Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Adam D. Ruppe via Digitalmars-d

On Tuesday, 7 June 2016 at 20:41:08 UTC, Walter Bright wrote:
The point being that a culture of "best practices" does arise 
and evolve over time, and professional programmers know it.


Sure, that's one of the big advantages C++ has over D: people 
have that institutional knowledge.


But, two important questions:

1) You criticize C++ because it isn't good enough - programmers 
are lazy and just because you can do it right doesn't mean they 
will. The right thing also needs to be the easy thing.


D's attribute spam is not the easy thing. And there's no apparent 
punishment for doing it "wrong" - everything works equally well 
for the author. It is only when a third party comes in and tries 
to slap the attribute on that it becomes an issue.


2) What makes you so sure @nogc will actually be part of the best 
practice? I haven't done a comprehensive study, but my impression 
so far is that it is very rarely used: the biggest win is being 
inferred in templates... which seems to imply that people aren't 
caring enough to actually write it out.


Would you want to use a library where the maintainers refuse to 
use @nogc even if they aren't using the gc?


I think you underestimate the cost of @nogc (and @safe, and pure, 
and nothrow) to the library author. It is littering their code 
with something they don't use themselves (and thus easy to forget 
to put there on new functions) and don't derive any direct 
benefit from.


Moreover, it limits their flexibility in the future: once a 
function is published with one of those attributes, it is part of 
the public interface so the implementation cannot change its mind 
in the future. That might be a good thing to the user begging for 
verified @nogc or whatever, but to the library author it is 
another cost for them to maintain.



Though, the apathy factor I think is bigger than the maintenance 
factor: I don't write it in my code because I just don't care. I 
have had only one user ever complain too... and he seems to have 
changed his mind by now and no longer cares either (probably 
because a critical mass of library authors still just don't care, 
so you can't realistically slap @nogc @safe on that much anyway).
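For reference, the inference asymmetry mentioned above, as a small 
sketch:

```d
// Template: attributes are inferred per instantiation, so the author
// writes nothing and @nogc/@safe callers still get through.
T twice(T)(T x) { return x + x; }

// Non-template: without explicit attributes, @nogc code cannot call it.
// The author must write them out -- and maintain them forever after.
int twiceExplicit(int x) @nogc @safe pure nothrow { return x + x; }

@nogc @safe unittest
{
    assert(twice(21) == 42);          // OK: inferred @nogc @safe
    assert(twiceExplicit(21) == 42);  // OK: explicitly annotated
}
```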


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Jonathan M Davis via Digitalmars-d
On Tuesday, June 07, 2016 16:04:05 Walter Bright via Digitalmars-d wrote:
> On 6/7/2016 1:48 PM, Jonathan M Davis via Digitalmars-d wrote:
> > So, while mass applying something like @safe temporarily to check stuff
> > makes some sense, I really don't think that it's a good idea to do it in
> > any code that you'd ever commit.
>
> The downsides you listed do not apply to @safe.

Sure they do. Regardless of the attribute, if it can be inferred, and
templates are involved, you can't mass apply it, because you almost
always need the attribute to be inferred. And regardless of whether an
attribute can be inferred, mass applying it tends to mean that it's harder
to figure out which attributes a function is actually marked with. It's
easier when it's just a label at the top of the file, but we've already had
PRs in Phobos where an attribute got applied locally as part of the PR, because
the person doing the work did not realize that it was already in effect. And
personally, it always throws me off when attribute labels or blocks are
used, because it looks like the attribute is not being applied to a function
when it actually is. I don't think that it matters what the attribute is.
All of the same downsides apply. The primary difference with @safe over some
of the others is that you can reverse it, whereas you can't with most of
them. But even then, you can't tell a template to infer @safe when you've
marked the whole file with @safe, so while you can change which level of
trust you're applying to a function, you can't remove the trust attributes
entirely once one of them has been applied.

Personally, I think that it's almost always a mistake to mass apply
attributes - especially those that can be inferred in templated code. It
does not play well with templates, and it causes maintenance problems.
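The template problem can be shown in a few lines (a sketch of the 
failure mode, not real Phobos code):

```d
@safe:   // mass-applied: everything below is *explicitly* @safe

// Explicit attributes disable inference, so this template is @safe
// unconditionally. Instantiating it with a @system predicate is now a
// compile error, where an unannotated template would simply have been
// inferred @system for that one instantiation.
bool check(alias pred, T)(T x) { return pred(x); }
```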

- Jonathan M Davis



Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Walter Bright via Digitalmars-d

On 6/7/2016 4:01 PM, Jonathan M Davis via Digitalmars-d wrote:

Yeah. I recall an article by Joel Spolsky where he talks about deciding
that proofs of correctness weren't worth much, because they were even harder
to get right than the software.

I do think that there are situations where proofs are valuable, but they do
tend to be very difficult to get right, and their application is ultimately
fairly limited.


My understanding is that academic researchers who need to prove a theory use a 
subset of Java, because the smaller the language, the more practical it is to 
write proofs about it. I also remember bearophile bringing up the Spec# language 
which was supposed to be able to formally prove things, but it turned out it 
couldn't prove much. I fed it some one-liners with bit masking, and it threw 
in the towel on them.


I suspect D has long since passed the point where it is too complicated for the 
rather limited ability of mathematicians to prove things about it.


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Steven Schveighoffer via Digitalmars-d

On 6/7/16 7:05 PM, Walter Bright wrote:

On 6/7/2016 2:28 PM, Steven Schveighoffer wrote:

I can attest that figuring out why something isn't inferred @safe
isn't always
easy, and the "slap a @safe: tag at the top" isn't always going to help.


Having a -safe compiler switch to make @safe the default won't improve
that in the slightest.



No, of course not. I don't think anyone has said this.

In my experience, finding the reasons something isn't inferred safe is 
an iterative process with the compiler and temporarily marking targeted 
code. I don't think grep helps here at all, and neither do global safe 
attributes.


-Steve


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Andrei Alexandrescu via Digitalmars-d

On 6/8/16 12:53 AM, Timon Gehr wrote:

On 08.06.2016 00:47, Walter Bright wrote:

On 6/7/2016 3:23 PM, Timon Gehr wrote:

Obviously they proved the virtual machine itself memory safe,


As I recall, the proof was broken, not the implementation.


Which time?


That is an old result that has essentially expired and should not be 
generalized. See 
http://www.seas.upenn.edu/~sweirich/types/archive/1999-2003/msg00849.html. 
I assume the matter has been long fixed by now, do you happen to know?



People do
make mistakes and overlook cases with proofs. There's nothing magical
about them.



Obviously, but there are reliable systems that check proofs automatically.


It is my opinion that writing off formal proofs of safety is a mistake. 
Clearly we don't have the capability on the core team to work on such. 
However, I am very interested if you'd want to lead such an effort.



Andrei



Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Walter Bright via Digitalmars-d

On 6/7/2016 3:10 PM, Timon Gehr wrote:

If you think progress on this matters, why are you arguing against it?


I don't believe it is worth the effort. You do. You need to make a better case 
for it, and the best way to do that is to actually write a spec. Demanding that 
someone (i.e. me) who doesn't believe in it, has a poor track record of writing 
specs, and has no academic background in writing proofs, means you aren't going 
to get what you want that route.


You've made some valuable contributions to D, in the form of finding problems. 
Why not contribute something more substantial, like a spec? You have a good idea 
in mind what it should be, just write it.


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Walter Bright via Digitalmars-d

On 6/7/2016 1:48 PM, Jonathan M Davis via Digitalmars-d wrote:

So, while mass applying something like @safe temporarily to check stuff
makes some sense, I really don't think that it's a good idea to do it in any
code that you'd ever commit.


The downsides you listed do not apply to @safe.



Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Jonathan M Davis via Digitalmars-d
On Tuesday, June 07, 2016 15:47:10 Walter Bright via Digitalmars-d wrote:
> On 6/7/2016 3:23 PM, Timon Gehr wrote:
> > Obviously they proved the virtual machine itself memory safe,
>
> As I recall, the proof was broken, not the implementation. People do make
> mistakes and overlook cases with proofs. There's nothing magical about them.

Yeah. I recall an article by Joel Spolsky where he talks about deciding
that proofs of correctness weren't worth much, because they were even harder
to get right than the software.

I do think that there are situations where proofs are valuable, but they do
tend to be very difficult to get right, and their application is ultimately
fairly limited.

- Jonathan M Davis



Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Timon Gehr via Digitalmars-d

On 08.06.2016 00:47, Walter Bright wrote:

On 6/7/2016 3:23 PM, Timon Gehr wrote:

Obviously they proved the virtual machine itself memory safe,


As I recall, the proof was broken, not the implementation.


Which time?


People do
make mistakes and overlook cases with proofs. There's nothing magical
about them.



Obviously, but there are reliable systems that check proofs automatically.


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Timon Gehr via Digitalmars-d

On 08.06.2016 00:44, Walter Bright wrote:

On 6/7/2016 11:32 AM, Timon Gehr wrote:

mixin("@tru"~"sted void foo(){ ... }");


So grep for mixin, too. Not hard.



Huge amounts of false positives.


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Walter Bright via Digitalmars-d

On 6/7/2016 3:23 PM, Timon Gehr wrote:

Obviously they proved the virtual machine itself memory safe,


As I recall, the proof was broken, not the implementation. People do make 
mistakes and overlook cases with proofs. There's nothing magical about them.




Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Walter Bright via Digitalmars-d

On 6/7/2016 11:32 AM, Timon Gehr wrote:

mixin("@tru"~"sted void foo(){ ... }");


So grep for mixin, too. Not hard.



Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Walter Bright via Digitalmars-d

On 6/7/2016 12:56 PM, Steven Schveighoffer wrote:

Bug? I would have expected @nogc: to permeate.


It did originally, but that was removed. It's deliberate.


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Timon Gehr via Digitalmars-d

On 07.06.2016 22:36, Walter Bright wrote:

...

BTW, it is a nice idea to require mathematical proofs of code
properties, but real world programming languages have turned out to be
remarkably resistant to construction of such proofs. As I recall, Java
had initially proven that Java was memory safe, until someone found a
hole in it. And so on and so forth for every malware attack vector
people find. We plug the problems as we find them.


Obviously they proved the virtual machine itself memory safe, not all of 
its implementations. If you have a mechanized proof of memory safety, 
then your language is memory safe.


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Timon Gehr via Digitalmars-d

On 07.06.2016 21:52, Walter Bright wrote:

On 6/7/2016 11:32 AM, Timon Gehr wrote:

The @safe subset should be specified and
implemented by inclusion, such that it is obvious that it does the
right thing.
I don't know what's 'unspecific' about this.
Closing holes one-by-one is not the
right approach here. You don't know when you are done and might never be.


I don't see how it is any different painting the fence from one
direction or the other.


The fence is infinitely long, your painting speed is finite and people 
will be looking at the fence mostly at the left end.



There are omissions possible either way.
...


In one way, an omission means you are potentially tracking down memory 
corruption inside a huge codebase by grepping for @trusted, until you 
notice that the issue is in @safe code.


In the other way, an omission means you are getting a spurious compile 
error that is easily worked around.



Another issue is implementing such a spec. The "disapproved" list is how
the compiler works,


It is how the compiler fails to work.


and makes it reasonably straightforward to check the
implementation against the list. It's quite a mess to try to tag
everything the compiler does with approved/disapproved, so you wind up
in exactly the same boat anyway.
...


The compiler should work by inclusion too.


In any case, writing such a large specification covering every semantic
action of the of the language is way, way beyond being a bugzilla issue.
...


Does not apply. The bugzilla issue can be fixed by disallowing all code 
in @safe. Also, why not just close the bugzilla issue _after_ there is a 
more adequate replacement?



If you want to take charge of writing such a specification DIP,
please do so.



If you think progress on this matters, why are you arguing against it?



Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Jonathan M Davis via Digitalmars-d
On Tuesday, June 07, 2016 17:28:27 Steven Schveighoffer via Digitalmars-d 
wrote:
> On 6/7/16 5:10 PM, Jonathan M Davis via Digitalmars-d wrote:
> > On Tuesday, June 07, 2016 20:52:15 Dave via Digitalmars-d wrote:
> >> On Tuesday, 7 June 2016 at 20:48:13 UTC, Jonathan M Davis wrote:
> >>> On Tuesday, June 07, 2016 18:33:01 Dave via Digitalmars-d wrote:
>  [...]
> >>>
> >>> IMHO, it's bad practice to mass apply attributes with labels or
> >>> blocks. It's far too easy to accidentally mark a function with
> >>> an attribute that you didn't mean to, and it makes it way
> >>> harder to figure out which attributes actually apply to a
> >>> function. And when you add templates into the mix, applying
> >>> attributes en masse doesn't work anyway, because pretty much
> >>> the only time that you want to mark a template function with an
> >>> attribute is when the template arguments have nothing to do
> >>> with whether the attribute is appropriate or not.
> >>>
> >>> [...]
> >>
> >> So we should not follow the advice of Walter?
> >
> > If he's arguing that you should slap an attribute on the top of your
> > module to apply to everything, then no, I don't think that we should
> > follow
> > his advice. He's a very smart guy, but he's not always right. And in my
> > experience, mass applying attributes is a mistake.
>
> The original(?) complaint was that it's hard to grep for @system because
> it's the default.
>
> I think the advice is to put the attribute at the top to see where your
> non-conforming code lies. Not as a permanent fixture.
>
> I can attest that figuring out why something isn't inferred @safe isn't
> always easy, and the "slap a @safe: tag at the top" isn't always going
> to help. But it can be a technique to find such things.

Yeah. It makes sense as a temporary solution to track down problems. It
makes a lot less sense as a way to write your code normally.

- Jonathan M Davis



Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Steven Schveighoffer via Digitalmars-d

On 6/7/16 5:10 PM, Jonathan M Davis via Digitalmars-d wrote:

On Tuesday, June 07, 2016 20:52:15 Dave via Digitalmars-d wrote:

On Tuesday, 7 June 2016 at 20:48:13 UTC, Jonathan M Davis wrote:

On Tuesday, June 07, 2016 18:33:01 Dave via Digitalmars-d wrote:

[...]


IMHO, it's bad practice to mass apply attributes with labels or
blocks. It's far too easy to accidentally mark a function with
an attribute that you didn't mean to, and it makes it way
harder to figure out which attributes actually apply to a
function. And when you add templates into the mix, applying
attributes en masse doesn't work anyway, because pretty much
the only time that you want to mark a template function with an
attribute is when the template arguments have nothing to do
with whether the attribute is appropriate or not.

[...]


So we should not follow the advice of Walter?


If he's arguing that you should slap an attribute on the top of your
module to apply to everything, then no, I don't think that we should follow
his advice. He's a very smart guy, but he's not always right. And in my
experience, mass applying attributes is a mistake.



The original(?) complaint was that it's hard to grep for @system because 
it's the default.


I think the advice is to put the attribute at the top to see where your 
non-conforming code lies. Not as a permanent fixture.


I can attest that figuring out why something isn't inferred @safe isn't 
always easy, and the "slap a @safe: tag at the top" isn't always going 
to help. But it can be a technique to find such things.


-Steve


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Jonathan M Davis via Digitalmars-d
On Tuesday, June 07, 2016 20:52:15 Dave via Digitalmars-d wrote:
> On Tuesday, 7 June 2016 at 20:48:13 UTC, Jonathan M Davis wrote:
> > On Tuesday, June 07, 2016 18:33:01 Dave via Digitalmars-d wrote:
> >> [...]
> >
> > IMHO, it's bad practice to mass apply attributes with labels or
> > blocks. It's far too easy to accidentally mark a function with
> > an attribute that you didn't mean to, and it makes it way
> > harder to figure out which attributes actually apply to a
> > function. And when you add templates into the mix, applying
> > attributes en masse doesn't work anyway, because pretty much
> > the only time that you want to mark a template function with an
> > attribute is when the template arguments have nothing to do
> > with whether the attribute is appropriate or not.
> >
> > [...]
>
> So we should not follow the advice of Walter?

If he's arguing that you should slap an attribute on the top of your
module to apply to everything, then no, I don't think that we should follow
his advice. He's a very smart guy, but he's not always right. And in my
experience, mass applying attributes is a mistake.

- Jonathan M Davis



Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Ola Fosheim Grøstad via Digitalmars-d
On Tuesday, 7 June 2016 at 20:55:12 UTC, Ola Fosheim Grøstad 
wrote:

repeat10:{
   N:<{ n:@int; do 10->n; inner; exit n};
   i:@int;
   do N -> iterate{ do inner; }
}

repeat99:repeat10{
  N:<{ do 99->n; inner; }
}

repeat99{ do i -> print; "bottles of wine" ->print }



Adding some comments, as the example was not clear on its own:

// repeat10 is a new pattern (class) inheriting from object by 
default

repeat10:{
   // N is a virtual pattern (function)
   N:<{ n:@int; do 10->n; inner; exit n};
   // do N -> iterate is a subclass of the previously defined iterate pattern
   do N -> iterate{ do inner; }
}

// repeat99 is a new pattern inheriting from repeat10 above
repeat99:repeat10{
   // N is a specialization of N in repeat10
   // N expands to {n:@int; do 10->n; 99->n; inner; exit n}
   N:<{ do 99->n; inner; }
}

// this is a subclass of repeat 99
repeat99{ do i -> print; "bottles of wine" ->print }


Give or take, I haven't used Beta in 20 years. Abstraction is 
not the problem; a very simple language can provide what most 
programmers need. Perhaps not with a familiar syntax, though.




Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Ola Fosheim Grøstad via Digitalmars-d

On Tuesday, 7 June 2016 at 19:52:47 UTC, Chris wrote:
But we agree that templates are a good idea in general, 
regardless of the actual implementation.


Having access to parametric abstractions is a good idea. How to 
best use them is not so obvious... in real projects where things 
changes. (Except for trivial stuff).


What do you mean by `memory management`? GC/RC built into the 
compiler?


Everything related to managing ownership and access, making the 
most out of static analysis. Putting things on the stack, or 
simply not allocating if not used.


What do you mean? Is it a good or a bad thing that the library 
is detached from the core language?


I mean that the standard library features that are closely 
related to the core language semantics are more stable than 
things like HTTP.


Believe me, features will be requested. Would you have an 
example of how such a language, or better still did you have 
time to design and test one? A proof of concept.


C++ with just member functions and wrappers around builtins is 
pretty close. Yes, I have used minimalistic languages, Beta for 
instance. I believe gBeta is closer. I guess dynamic languages 
like Self and even Javascript are there too. Probably also some 
of the dependently typed languages, but I have never used those.


If you stripped down C++ by taking out everything that can be 
expressed using another feature, then you would have the 
foundation.


Having to write the same for loop with slight variations all 
over again is not my definition of efficient programming. One 
of D's strengths is that it offers nice abstractions for data 
representation.


Hm? You can have templated functions in C++.

Not special, but handy. Before Java 8 (?) you had to use 
inner/anonymous classes to mimic lambdas. Not very efficient: 
boilerplate, repetition, the whole lot.


Well, that is a matter of implementation. C++ lambdas are exactly 
that, function objects, but there is no inherent performance 
penalty. A "lambda" is mostly syntactic sugar.


I was not talking about that. Read it again. I said that the D 
community actively demands features or improvements and uses 
them.


Are you talking about the language or the standard library? I 
honestly don't think the latter matter much. Except for memory 
management.


It goes without saying that existing features have to be 
improved. It's a question of manpower.


No, it is a matter of being willing to improve the semantics. 
Many of the improvements that are needed to best C++ are simple, 
but slightly breaking, changes.


D could change floats so that interval arithmetic can be 
implemented, which is difficult to do in clang/gcc. That would be 
a major selling point. But the usual reasoning is that this is 
not needed, because C/C++ fail to comply with the IEEE standard 
as well.


If the motivation is to trail C/C++, then there is no way to 
surpass C/C++, and thus no real motivation to switch.


There is a learning curve that cannot be made flatter. There 
are concepts that have to be grasped and understood. Any 
language (cf. Nim) that allows you to do sophisticated and 
low-level things is harder to learn than JS or Python.


Not sure what sophisticated things you are referring to.

(JS and Python have complexity issues as well; you just don't 
need to learn them to make good use of the languages.)


Go forces you to repeat yourself. The less features you have, 
the more you have to write the same type of code all over 
again. Look at all the for loops in a C program.


Loops and writing the same code over again are not the major 
hurdle; getting it right is. So having many loops is not bad, but 
having a way to express that you want to iterate from 1 to 10 in 
a less error-prone way matters.


But you can do that in minimalistic languages like Beta, which 
has only two core entities:


- a pattern (type/function/subclass)
- an instance of a pattern

Just define a pattern that iterates and prefix your body with it; 
that is the same as subclassing it.


Pseudo-code (not actual Beta syntax, but C-ish syntax for 
simplicity).


iterate:{
   N: @int;
   i: @int;
   enter N
   do
      i = 0
      while (i < N)
      do
         inner; // insert prefixed do-part here
         i++
}

10 -> iterate{ do i -> print; }


Or you could just do


repeat10:{
   N:<{ n:@int; do 10->n; inner; exit n};
   i:@int;
   do N -> iterate{ do inner; }
}

repeat99:repeat10{
  N:<{ do 99->n; inner; }
}

repeat99{ do i -> print; "bottles of wine" ->print }

etc...




Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Dave via Digitalmars-d

On Tuesday, 7 June 2016 at 20:48:13 UTC, Jonathan M Davis wrote:

On Tuesday, June 07, 2016 18:33:01 Dave via Digitalmars-d wrote:

[...]


IMHO, it's bad practice to mass apply attributes with labels or 
blocks. It's far too easy to accidentally mark a function with 
an attribute that you didn't mean to, and it makes it way 
harder to figure out which attributes actually apply to a 
function. And when you add templates into the mix, applying 
attributes en masse doesn't work anyway, because pretty much 
the only time that you want to mark a template function with an 
attribute is when the template arguments have nothing to do 
with whether the attribute is appropriate or not.


[...]


So we should not follow the advice of Walter?


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread ketmar via Digitalmars-d

On Tuesday, 7 June 2016 at 20:41:08 UTC, Walter Bright wrote:
Would you want to use a library where the maintainers refuse to 
use @nogc even if they aren't using the gc?


yes, i do. i'm actively using Adam's arsd libraries, and they 
don't have @nogc spam all over the place, even where functions 
don't use the gc. more than that: i tend to ignore @nogc in my 
own code too, almost never bothering to put it in. it just isn't 
worth the effort.


Re: Andrei's list of barriers to D adoption

2016-06-07 Thread Jonathan M Davis via Digitalmars-d
On Tuesday, June 07, 2016 18:33:01 Dave via Digitalmars-d wrote:
> On Tuesday, 7 June 2016 at 18:24:33 UTC, Walter Bright wrote:
> > On 6/7/2016 11:19 AM, Jack Stouffer wrote:
> >> On Tuesday, 7 June 2016 at 18:15:28 UTC, Walter Bright wrote:
> >>> [...]
> >>
> >> But you can't grep for @system because 99% of the time it's
> >> implicit. This
> >> problem becomes harder to find when using templates for
> >> everything, which I do.
> >
> > Add:
> >@safe:
> > at the top of your D module and you'll find the @system code.
> > The D compiler is the static analysis tool. It's true that
> > @safe should have been the default, but too much code would
> > break if that were changed. Adding one line to the top of a
> > module is very doable for those that are desirous of adding the
> > safety checks.
> >
> > You can also add:
> >@nogc:
> > at the top, too. It isn't necessary to tediously annotate every
> > function.
>
> Seems fair. But perhaps phobos should also follow this standard?
> Which might be why people get the mindset that they have to
> annotate everything...

IMHO, it's bad practice to mass apply attributes with labels or blocks. It's
far too easy to accidentally mark a function with an attribute that you
didn't mean to, and it makes it way harder to figure out which attributes
actually apply to a function. And when you add templates into the mix,
applying attributes en masse doesn't work anyway, because pretty much the
only time that you want to mark a template function with an attribute is
when the template arguments have nothing to do with whether the attribute is
appropriate or not.

So, while mass applying something like @safe temporarily to check stuff
makes some sense, I really don't think that it's a good idea to do it in any
code that you'd ever commit.

- Jonathan M Davis


