Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-08-03 Thread Abdulhaq via Digitalmars-d

On Tuesday, 31 July 2018 at 22:55:08 UTC, Laeeth Isharc wrote:

Dpp doesn't work with STL yet.  I asked Atila how long to 
#include vector and he thought maybe two months of full-time 
work.  That's not out of the question in time, but we have too 
much else to do right now.  I'm not sure if recent mangling 
improvements help and how much that changes things.  But DPP 
keeps improving, as does extern(C++), and one way or another it 
will probably work for quite a lot.  Calypso makes C++ classes 
work as both value and reference types.  I don't know 
the limit of what's possible without such changes - seems like 
C++ mangling is improving by leaps and bounds but I don't know 
when it will be dependable for templates.




Yes OK, thanks.

It's not that relevant what Andrei or Walter might think 
because it's a community-led project and we will make progress 
if somebody decides to spend their time working on it, or a 
company lends a resource for the same purpose.  I'm sure they 
are all in favour of greater C++ interoperability, but I don't 
think the binding constraint is will from the top, but rather 
people willing and able to do the work.




I think the DIP system has greatly improved the situation, but 
for anyone thinking of embarking on a lot of work for something 
like e.g. the GC, you do need to feel that there will be a good 
chance of it being adopted - otherwise all that work could go to 
waste.


And if one wants to see it go faster then one can logically 
find a way to help with the work or contribute financially.  I 
don't think anything else will make a difference.




Agreed entirely.

Same thing with Calypso.  It's not ready yet to be integrated 
in a production compiler so it's an academic question as to the 
leadership's view about it.


Where I'm coming from is that writing and maintaining something 
as large and complex as Calypso requires a great deal of 
motivation, and also encouragement from the sidelines - 
especially from Walter and/or Andrei. If someone starts to feel 
that the backing is not there, it is very hard to maintain 
motivation, particularly on infrastructure-related code that, if 
not integrated by Walter, will always be hard for people to use 
and therefore will not be widely adopted.


To be fair to Walter, though, this is a really intractable 
problem for him. He could adopt something like Calypso, only to 
find that the original maintainer loses interest. That would 
leave Walter either maintaining someone else's complex code, or 
trying to extricate himself from code he has already integrated. 
Also, there is no guarantee, in this particular case, that 
Calypso's strategy will remain viable as C++ evolves. Of course 
there are other very good reasons why adopting it is 
problematic. Still, it leaves the developer struggling, I 
expect, to maintain motivation.


Considering the above, knowing the general direction that 
Walter and Andrei want to take D would be a great help in 
deciding which larger projects are worth undertaking. It seems 
that way to me, anyway (big caveat).







Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-31 Thread Laeeth Isharc via Digitalmars-d

On Sunday, 29 July 2018 at 09:35:06 UTC, Abdulhaq wrote:
On Saturday, 28 July 2018 at 14:45:19 UTC, Paolo Invernizzi 
wrote:


I forgot the link... here it is:
https://www.quantamagazine.org/to-make-sense-of-the-present-brains-may-predict-the-future-20180710



An interesting article. I found that Dennett's Consciousness 
Explained, which is presumably debunked old hat by now, is full 
of interesting experiments and speculation about how we model 
things in our minds and how our perceptions feed into that. It's 
a long time since I read it, but if I remember correctly he 
shows how we seem to have a kind of mental theatre which has an 
expectation of what will come next from the senses, leading to 
interesting mistakes in perception. It's a useful model of how 
the mind works.


That website often carries good articles about new maths as 
well.




My colleague and I are pretty different in our approach to 
that kind of stuff...


Maybe I'll post on the Forum a 'Request for D Advocacy', a-la 
PostgreSQL, so the community can try to address some of his 
concerns about modern D, and lower his discomfort!


:-P


If you can explain to me what is the _direction_ of D in terms 
of interfacing with large C++ libraries it would be very much 
appreciated! I'd love to be using D for some of my projects but 
I have a perception that using e.g. VTK is still a difficult 
thing to do from D. Is that still true? What is the long term 
plan for D, is it extern(C++), a binding technology? Is there 
any interest in Calypso from the upper echelons? I want to know 
where D is trying to go, not just where it is now. I want to 
know if anyone has got their heart in it.


My CV says my main languages are Java, Python and D. That last 
one is mainly wishful thinking at the moment. I wish it wasn't! 
Make me believe, Paolo!


Well we are hiring D programmers in London and HK in case it's 
interesting.


Dpp doesn't work with STL yet.  I asked Atila how long to 
#include vector and he thought maybe two months of full-time 
work.  That's not out of the question in time, but we have too 
much else to do right now.  I'm not sure if recent mangling 
improvements help and how much that changes things.  But DPP 
keeps improving, as does extern(C++), and one way or another it 
will probably work for quite a lot.  Calypso makes C++ classes 
work as both value and reference types.  I don't know the limit 
of what's possible without such changes - seems like C++ mangling 
is improving by leaps and bounds but I don't know when it will be 
dependable for templates.
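To make the extern(C++) route concrete, here is a minimal 
sketch; the function name and file layout are invented for 
illustration, and dpp's approach (directly #including a header 
from a .dpp file processed by its d++ wrapper) would remove the 
need to hand-write the declaration:

```d
// square.cpp, compiled separately with a C++ compiler:
//     int square(int x) { return x * x; }

// main.d: declare the function with C++ name mangling so the
// linker can resolve it against the compiled square.o
extern(C++) int square(int x);

void main()
{
    import std.stdio : writeln;
    // Calls straight into the C++ object file; no wrapper layer.
    writeln(square(6));
}
```

Plain functions and simple types work well like this today; the 
hard part referred to above is templates and STL containers, 
where mangling and semantics diverge.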


It's not that relevant what Andrei or Walter might think because 
it's a community-led project and we will make progress if 
somebody decides to spend their time working on it, or a company 
lends a resource for the same purpose.  I'm sure they are all in 
favour of greater C++ interoperability, but I don't think the 
binding constraint is will from the top, but rather people 
willing and able to do the work.


And if one wants to see it go faster then one can logically find 
a way to help with the work or contribute financially.  I don't 
think anything else will make a difference.


Same thing with Calypso.  It's not ready yet to be integrated in 
a production compiler so it's an academic question as to the 
leadership's view about it.





Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-31 Thread jmh530 via Digitalmars-d

On Tuesday, 31 July 2018 at 12:02:55 UTC, Kagamin wrote:

On Saturday, 28 July 2018 at 19:55:56 UTC, bpr wrote:
Are the Mozilla engineers behind it deluded in that they 
eschew GC and exceptions? I doubt it.


They are trying to outcompete Chrome in bugs too. You're not 
Mozilla. And why do you mention exceptions, but not bounds 
checking?


Firefox has been complete garbage on my work computer ever since 
the Quantum update. Works fine at home though.


Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-31 Thread Kagamin via Digitalmars-d

On Saturday, 28 July 2018 at 19:55:56 UTC, bpr wrote:
Are the Mozilla engineers behind it deluded in that they eschew 
GC and exceptions? I doubt it.


They are trying to outcompete Chrome in bugs too. You're not 
Mozilla. And why do you mention exceptions, but not bounds 
checking?


Here we kind of agree. If D is going to support a GC, I want a 
state of the art precise GC like Go has.


Go's GC is far from state of the art; it trades everything 
for low latency and ease of configuration.


Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-30 Thread bpr via Digitalmars-d

On Saturday, 28 July 2018 at 21:44:10 UTC, Abdulhaq wrote:

On Saturday, 28 July 2018 at 21:27:12 UTC, bpr wrote:


I hear you. You're looking (roughly) for a better 
Java/Go/Scala, and I'm looking for a better C/C++/Rust, at 
least for what I work on now. I don't think D can be both 
right now, and that the language which can satisfy both of us 
doesn't exist yet, though D is close.


Yes, this. In the light of D's experience, is it even possible 
to have a language that satisfies both?


I believe that the tension between low and high level features 
makes it nearly impossible, that tracing GC is one of those 
difficult problems that rules out satisfying both sets of users 
optimally, and that the best D (and C++ and Nim) can do is to be 
"mediocre to good, but not great" at both the low level (C/Rust) 
domain and high level domains simultaneously. There are far fewer 
players in the low level space, which is why I see D more as a 
competitor there, and welcome DasBetterC and the noGC initiatives 
so that D can be a great low level and maybe just a good high 
level language.





Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-29 Thread Abdulhaq via Digitalmars-d

On Saturday, 28 July 2018 at 14:45:19 UTC, Paolo Invernizzi wrote:


I forgot the link... here it is:
https://www.quantamagazine.org/to-make-sense-of-the-present-brains-may-predict-the-future-20180710



An interesting article. I found that Dennett's Consciousness 
Explained, which is presumably debunked old hat by now, is full 
of interesting experiments and speculation about how we model 
things in our minds and how our perceptions feed into that. It's 
a long time since I read it, but if I remember correctly he 
shows how we seem to have a kind of mental theatre which has an 
expectation of what will come next from the senses, leading to 
interesting mistakes in perception. It's a useful model of how 
the mind works.


That website often carries good articles about new maths as well.



My colleague and I are pretty different in our approach to 
that kind of stuff...


Maybe I'll post on the Forum a 'Request for D Advocacy', a-la 
PostgreSQL, so the community can try to address some of his 
concerns about modern D, and lower his discomfort!


:-P


If you can explain to me what is the _direction_ of D in terms of 
interfacing with large C++ libraries it would be very much 
appreciated! I'd love to be using D for some of my projects but I 
have a perception that using e.g. VTK is still a difficult thing 
to do from D. Is that still true? What is the long term plan for 
D, is it extern(C++), a binding technology? Is there any interest 
in Calypso from the upper echelons? I want to know where D is 
trying to go, not just where it is now. I want to know if anyone 
has got their heart in it.


My CV says my main languages are Java, Python and D. That last 
one is mainly wishful thinking at the moment. I wish it wasn't! 
Make me believe, Paolo!





Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Ali Çehreli via Digitalmars-d

On 07/28/2018 05:43 AM, Laeeth Isharc wrote:

> It's not that bad calling D from Java.

Running D's GC in a thread that is started by an external runtime (like 
Java's) can be problematic. If a D function on another D-runtime thread 
needs to run a collection, then it will not know about this Java thread 
and won't stop it. One outcome is a crash if this thread continues to 
allocate while the other one is collecting.


The solution is to call thread_attachThis() upon entry to the D 
function and thread_detachThis() upon exit. However, there are bugs in 
these functions, for which I posted a pull request (and abandoned it 
because of 32-bit OS X test failures).


I think a better option would be to forget about all that and not do any 
GC in the D function that is called from Java. This simple function 
should just send a message to a D-runtime thread and return to Java.
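The attach/detach pattern described above might look like the 
following sketch. The JNI export name and signatures are 
hypothetical and simplified (a real JNI entry point takes 
JNIEnv* and jobject parameters, and druntime must already be 
initialised, e.g. from a JNI_OnLoad hook); thread_attachThis and 
thread_detachThis are the actual druntime calls from core.thread:

```d
import core.thread : thread_attachThis, thread_detachThis;

// Hypothetical entry point called on a Java-owned thread.
extern(C) int Java_com_example_Native_work(void* env, void* obj)
{
    // Register this foreign thread with the D GC so that a
    // collection triggered elsewhere will pause it as well.
    thread_attachThis();
    // Unregister on every exit path, so the GC never tries to
    // suspend a thread that has already returned to Java.
    scope(exit) thread_detachThis();

    int[] buf = new int[64]; // GC allocation is now safe here
    return cast(int) buf.length;
}
```

The alternative suggested above - doing no GC work at all in the 
entry function and merely forwarding a message to a thread that 
druntime already owns - avoids depending on the buggy 
attach/detach path altogether.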


Ali



Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Walter Bright via Digitalmars-d

On 7/28/2018 7:09 AM, Laeeth Isharc wrote:
Opportunities are 
abundant where people aren't looking because they don't want to.


My father told me I wasn't at all afraid of hard work. I could lie down right 
next to it and go to sleep.


Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Abdulhaq via Digitalmars-d

On Saturday, 28 July 2018 at 21:27:12 UTC, bpr wrote:


I hear you. You're looking (roughly) for a better 
Java/Go/Scala, and I'm looking for a better C/C++/Rust, at 
least for what I work on now. I don't think D can be both right 
now, and that the language which can satisfy both of us doesn't 
exist yet, though D is close.


Yes, this. In the light of D's experience, is it even possible to 
have a language that satisfies both?





Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread bpr via Digitalmars-d

On Saturday, 28 July 2018 at 20:34:37 UTC, Abdulhaq wrote:

On Saturday, 28 July 2018 at 19:55:56 UTC, bpr wrote:

On Saturday, 28 July 2018 at 15:36:43 UTC, Abdulhaq wrote:
I think that I no longer fall into the category of developer 
that D is after. D is targeting pedal-to-the-metal 
requirements, and I don't need that. TBH I think 99% of 
developers don't need it.


I'm 99% sure you just made that number up ;-)



Sure, I plucked it out of thin air. But I do think of the 
software development world as an inverted pyramid in terms of 
performance demands and headcount. At the bottom of my inverted 
pyramid I have Linux and Windows. This code needs to be as 
performant as possible and bug free as possible. C/C++/D shine 
at this stuff. However, I number those particular developers in 
the thousands.


The developers at Mozilla working on the browser internals, for 
example, are unaccounted for in your analysis. As are the 
developers where I work.


I think a great bulk of developers, though, sit at the 
application development layer. They are pumping out great 
swathes of Java etc. Users of Spring and dozens of other 
frameworks. C++ is usually the wrong choice for this type of 
work, but can be adopted in a mistaken bid for performance.


I don't know that the great bulk of developers work in Java.


And how many are churning out all that JavaScript and PHP code?

Hence I think that the number of developers who really need top 
performance is much smaller than the number who don't.


I'd be willing to accept that, but I have no idea what the actual 
numbers are.


If I had to write CFD code, and I'd love to have a crack, then 
I'd really be wanting to use D for its expressiveness and 
performance. But because of the domain that I do work in, I 
feel that I am no longer in D's target demographic.


If I had to write CFD code, and I wanted to scratch an itch to 
use a new language, I'd probably pick Julia, because that 
community is made up of scientific computing experts. D might be 
high on my list, but not likely the first choice. C++ would be 
in there too :-(.




I remember the subject of write barriers coming up in order (I 
think?) to improve the GC. Around that time Walter said he 
would not change D in any way that would reduce performance by 
even 1%.


Here we kind of agree. If D is going to support a GC, I want a 
state-of-the-art precise GC like Go has. That may rule out some 
D features, or incur some cost that high-performance programmers 
don't like, or even suggest two kinds of pointer (a la 
Modula-3/Nim), which Walter also dislikes.


Hence I feel that D is ruling itself out of the application 
developer market.


At this stage in its life, I don't think D should try to be all 
things to all programmers, but rather focus on doing a few things 
way better than the competition.


That's totally cool with me, but it took me a long time to 
realise that this was the case and that it was therefore less 
promising to me than it had seemed before.


I hear you. You're looking (roughly) for a better Java/Go/Scala, 
and I'm looking for a better C/C++/Rust, at least for what I work 
on now. I don't think D can be both right now, and that the 
language which can satisfy both of us doesn't exist yet, though D 
is close.







Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Abdulhaq via Digitalmars-d

On Saturday, 28 July 2018 at 19:55:56 UTC, bpr wrote:

On Saturday, 28 July 2018 at 15:36:43 UTC, Abdulhaq wrote:
I think that I no longer fall into the category of developer 
that D is after. D is targeting pedal-to-the-metal 
requirements, and I don't need that. TBH I think 99% of 
developers don't need it.


I'm 99% sure you just made that number up ;-)



Sure, I plucked it out of thin air. But I do think of the 
software development world as an inverted pyramid in terms of 
performance demands and headcount. At the bottom of my inverted 
pyramid I have Linux and Windows. This code needs to be as 
performant as possible and bug free as possible. C/C++/D shine at 
this stuff. However, I number those particular developers in the 
thousands.


Then we have driver writers. Performance is important here, but 
as a user I feel that I wish they would concentrate on the 
'bug-free' part a bit more. Especially those cowboys who develop 
printer and Bluetooth drivers. Of course, according to them it's 
the hardware that stinks. These guys and gals number in the tens 
of thousands. Yes, I made that up.


Then, a layer up, we have libc developers and co. Then platform 
developers - Unity, Lumberyard for games. Apache.


I think a great bulk of developers, though, sit at the 
application development layer. They are pumping out great swathes 
of Java etc. Users of Spring and dozens of other frameworks. C++ 
is usually the wrong choice for this type of work, but can be 
adopted in a mistaken bid for performance.


And how many are churning out all that JavaScript and PHP code?

Hence I think that the number of developers who really need top 
performance is much smaller than the number who don't.




For you, perhaps. I currently work mostly at a pretty low level 
and I'm pretty sure it's not just self delusion that causes us 
to use C++ at that low level. Perhaps you've noticed the rise 
of Rust lately? Are the Mozilla engineers behind it deluded in 
that they eschew GC and exceptions? I doubt it. I mostly prefer 
higher level languages with GCs, but nothing in life is free, 
and GC has significant costs.


If I had to write CFD code, and I'd love to have a crack, then 
I'd really be wanting to use D for its expressiveness and 
performance. But because of the domain that I do work in, I feel 
that I am no longer in D's target demographic.


I remember the subject of write barriers coming up in order (I 
think?) to improve the GC. Around that time Walter said he would 
not change D in any way that would reduce performance by even 1%. 
Hence I feel that D is ruling itself out of the application 
developer market. That's totally cool with me, but it took me a 
long time to realise that this was the case and that it was 
therefore less promising to me than it had seemed before.





Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread bpr via Digitalmars-d

On Saturday, 28 July 2018 at 15:36:43 UTC, Abdulhaq wrote:
I think that I no longer fall into the category of developer 
that D is after. D is targeting pedal-to-the-metal 
requirements, and I don't need that. TBH I think 99% of 
developers don't need it.


I'm 99% sure you just made that number up ;-)

For those developers who don't need the performance usually 
achieved with C or C++, and can tolerate GC overheads, there are, 
IMO, better languages than D. I'm not saying that here to be 
inflammatory, just that I believe performance is a very big part 
of the attractiveness of D.


If you're mostly working on Android, then Kotlin seems like your 
best option for a non-Java language. It seems OK, there's a 
Kotlin native in the works, the tooling is fine, there's a REPL, 
etc. I like it better than I like Go.


We like to think we do and we love to marvel at the speed of 
improved code, but like prediction, it's overrated ;-)


For you, perhaps. I currently work mostly at a pretty low level 
and I'm pretty sure it's not just self delusion that causes us to 
use C++ at that low level. Perhaps you've noticed the rise of 
Rust lately? Are the Mozilla engineers behind it deluded in that 
they eschew GC and exceptions? I doubt it. I mostly prefer higher 
level languages with GCs, but nothing in life is free, and GC has 
significant costs.





Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Abdulhaq via Digitalmars-d

On Saturday, 28 July 2018 at 12:43:55 UTC, Laeeth Isharc wrote:


It's tough when dealing with genuine - Knightian uncertainty or 
even more radical versions.  When one doesn't even know the 
structure of the problem then maximising expected utility 
doesn't work.  One can look at capacities - Choquet and the 
like - but then it's harder to say something useful about what 
you should do.




Sounds interesting, I'll look into it.



But it's a loop and one never takes a final decision to master 
D. Also habits, routines and structures _do_ shape perception.




In truth I avoid discussions that are really just arguing 
about definitions of words, but you made a couple of sweeping 
bumper-stickery comments


That's entertaining.  I've not been accused of that before!  
Bear in mind also I tend to write on my phone.




I think I was just in need of a decent conversation. I didn't 
mean it in an accusatory manner :-). TBH I read those comments as 
coming from a D advocate who was in a motivational mood. They 
triggered a debate in me that has been wanting to come out, but I 
rarely contribute to forums these days.


Yes I read Kahneman et al papers for the first time in 92 in 
the university library.  I speed-read his book, and I thought 
it was a bad book.  I work with a specialist in making 
decisions under uncertainty - she was the only person able to 
articulate to George Soros how he made money because he 
certainly couldn't, and she is mentioned in the preface to the 
revised version of Alchemy.  She has the same view as me - 
behavioural finance is largely a dead end.  One learns much 
more by going straight to the neuroeconomics and incorporating 
also the work of Dr Iain McGilchrist.


Kahneman makes a mistake in his choice of dimension.  There's 
analytic and intuitive/gestalt and in my experience people 
making high stakes decisions are much less purely analytical 
than a believer in the popular Kahneman might suggest.


What I said about prediction being overrated isn't 
controversial amongst a good number of the best traders and 
business people in finance.  You might read Nassim Taleb also.




You're way ahead of me here, obviously. I didn't read any Taleb 
until he made an appearance at the local bookshop. It was Black 
Swan and it didn't say anything that hadn't independently 
occurred to me already. However, for some reason it seemed to be 
a revelation to a lot of people.




Well it's a pity the D Android ecosystem isn't yet mature.  
Still I remain in awe of the stubborn accomplishment of the man 
(with help) who got LDC to run on Android.


It's not that bad calling D from Java.  Some day I will see if 
I can help automate that - Kai started working on it already I 
think.




D as a programming language has numerous benefits over Java, but 
trying to analyse why I would nevertheless choose Kotlin/Java for 
Android development:


* The Android work I do largely does not need high low-level 
performance. The important thinking that is done is the user 
interface, how communication with the servers should look for 
good performance, caching etc. Designing good algorithms.


* Having done the above, I want a low friction way of getting 
that into code. That requires a decent expressive language with a 
quality build system that can churn out an APK without me having 
to think too hard about it. Kotlin/JDK8 are good enough and 
Android Studio helps a lot.


* Given the above, choosing D to implement some of the code would 
just be a cognitive and time overhead. It's no reflection on D in 
any way, it's just that all the tooling is for Java and the 
platform API/ABI is totally designed to host Java.


* "The man who (with help) got LDC to run on Android". The team, 
with the best will in the world, is too small to answer all the 
questions that the world of pain known as Android can throw up. 
Why doesn't this build for me? Gradle is killing me... Dub 
doesn't seem to be working right after the upgrade to X.Y... it 
works on my LG but not my Samsung... I've upgraded this but now 
that doesn't work anymore...


* Will there be a functioning team in 5 years time? Will they 
support older versions of Android? Can I develop on Windows? Or 
Linux? Why not? Etc., etc.



Since you already know D you need to answer a different 
question.
 What's the chance the compiler will die on the relevant 
horizon, and how bad will it be for me if that happens?  
Personally I'm not worried.   If D should disappear in a few 
years, it wouldn't be the end of the world to port things.  I 
just don't think that's very likely.




I answered the Android question already. As for 
engineering/scientific work (I design/develop engineering 
frameworks/tools for wing designers), Python has bindings to 
numpy, Qt, CAD kernels, data visualisation tools. Python is fast 
enough to 
string those things together and run the overarching algorithms, 
GUIs, launch trade studies, scipy optimisations. It has even more 

Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Paolo Invernizzi via Digitalmars-d

On Saturday, 28 July 2018 at 14:09:44 UTC, Laeeth Isharc wrote:

On Saturday, 28 July 2018 at 13:55:31 UTC, Paolo Invernizzi


Perceptions, expectations, prediction...   an easy read I 
suggest on the latest trends [1], if someone is interested...


I forgot the link... here it is:
https://www.quantamagazine.org/to-make-sense-of-the-present-brains-may-predict-the-future-20180710

Yes - it's a competitive advantage, but opportunity often comes 
dressed in work clothes.


Curiosity is the salt of evolution... for example I'm now 
intrigued by The Master and His Emissary; I'll have to read it.


And another curiosity: I studied in the '90s in Milano - what 
were your thoughts on Hayek and von Mises in those times? 
Classical economics was so boring...


We're in an era when most people are not used to discomfort and 
have an inordinate distaste for it.  If you're fine with that 
and make decisions as best you can based on objective factors 
(objectivity being something quite different from 
'evidence-based' because of the drunk/lamppost issue) then 
there is treasure everywhere (to steal Andrei's talk title).  
Opportunities are abundant where people aren't looking because 
they don't want to.


My colleague and I are pretty different in our approach to that 
kind of stuff...


Maybe I'll post on the Forum a 'Request for D Advocacy', a-la 
PostgreSQL, so the community can try to address some of his 
concerns about modern D, and lower his discomfort!


:-P

/Paolo



Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Laeeth Isharc via Digitalmars-d

On Saturday, 28 July 2018 at 13:55:31 UTC, Paolo Invernizzi wrote:

On Saturday, 28 July 2018 at 12:43:55 UTC, Laeeth Isharc wrote:


each project I
start I give some very hard thought about which development 
environment I'm going to use, and D is often one of those 
options. The likely future of D on the different platforms is 
an important part of that assessment, hence 'predicting' the 
future of D, hard and very unreliable though that is, is an 
important element in some of my less trivial decisions.


Since you already know D you need to answer a different 
question.
 What's the chance the compiler will die on the relevant 
horizon, and how bad will it be for me if that happens?  
Personally I'm not worried.   If D should disappear in a few 
years, it wouldn't be the end of the world to port things.  I 
just don't think that's very likely.


Of course it depends on your context.  The people who use D at 
work seem to be more principals who have the right to take the 
best decision as they see it than agents who must persuade 
others who are the real decision-makers.  That's a recipe for 
quiet adoption that's dispersed across many industries 
initially and for the early adopters of D being highly 
interesting people.  Since, as the Wharton professor, Adam 
Grant observes, we are in an age where positive disruptors can 
achieve a lot within an organisation, that's also rather 
interesting.


A very interesting discussion... really.

Perceptions, expectations, prediction...   an easy read I 
suggest on the latest trends [1], if someone is interested...


BTW, Laeeth is right in the last two paragraphs. I was one of 
the 'principals' who took the decision to use D in production, 
14 years ago, and he described the reasoning of that era very 
well.


Today I'm still convinced that the adoption of D is a 
competitive advantage for a company, I definitely have to work 
to improve my bad temper (eheh) to persuade my current CTO to 
give it another chance.


/Paolo (btw, I'm the CEO...)


Thanks for the colour, Paolo.

Yes - it's a competitive advantage, but opportunity often comes 
dressed in work clothes.  We're in an era when most people are 
not used to discomfort and have an inordinate distaste for it.  
If you're fine with that and make decisions as best you can based 
on objective factors (objectivity being something quite different 
from 'evidence-based' because of the drunk/lamppost issue) then 
there is treasure everywhere (to steal Andrei's talk title).  
Opportunities are abundant where people aren't looking because 
they don't want to.


Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Paolo Invernizzi via Digitalmars-d

On Saturday, 28 July 2018 at 12:43:55 UTC, Laeeth Isharc wrote:


each project I
start I give some very hard thought about which development 
environment I'm going to use, and D is often one of those 
options. The likely future of D on the different platforms is 
an important part of that assessment, hence 'predicting' the 
future of D, hard and very unreliable though that is, is an 
important element in some of my less trivial decisions.


Since you already know D you need to answer a different 
question.
 What's the chance the compiler will die on the relevant 
horizon, and how bad will it be for me if that happens?  
Personally I'm not worried.   If D should disappear in a few 
years, it wouldn't be the end of the world to port things.  I 
just don't think that's very likely.


Of course it depends on your context.  The people who use D at 
work seem to be more principals who have the right to take the 
best decision as they see it than agents who must persuade 
others who are the real decision-makers.  That's a recipe for 
quiet adoption that's dispersed across many industries 
initially and for the early adopters of D being highly 
interesting people.  Since, as the Wharton professor, Adam 
Grant observes, we are in an age where positive disruptors can 
achieve a lot within an organisation, that's also rather 
interesting.


A very interesting discussion... really.

Perceptions, expectations, prediction...   an easy read I suggest 
on the latest trends [1], if someone is interested...


BTW, Laeeth is right in the last two paragraphs. I was one of 
the 'principals' who took the decision to use D in production, 14 
years ago, and he described the reasoning of that era very well.


Today I'm still convinced that the adoption of D is a competitive 
advantage for a company, I definitely have to work to improve my 
bad temper (eheh) to persuade my current CTO to give it another 
chance.


/Paolo (btw, I'm the CEO...)




Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Laeeth Isharc via Digitalmars-d

On Saturday, 28 July 2018 at 11:09:28 UTC, Abdulhaq wrote:

On Friday, 27 July 2018 at 23:42:47 UTC, Laeeth Isharc wrote:

For me, I think that managing money is about choosing to 
expose your capital intelligently to the market, balancing the 
risk of loss against the prospective gain and considering this 
in a portfolio sense.


Prediction doesn't really come into that



I think this apparent difference of opinion is down to 
different definitions of the word prediction. When I say 
prediction I mean the assessment of what are the possible 
futures for a scenario and how likely each one is. It can be 
conscious or unconscious. I think my understanding of the word 
is not an uncommon one.


When you balance the risk of loss (i.e. predict how likely you 
are to lose money) against the prospective gain (i.e. multiply 
the probability of each possible outcome by its reward and sum 
the total to get a prospective value), then you are, by my 
definition, making predictions.


It's tough when dealing with genuine Knightian uncertainty, or 
even more radical versions.  When one doesn't even know the 
structure of the problem, maximising expected utility doesn't 
work.  One can look at capacities - Choquet and the like - but 
then it's harder to say something useful about what you should do.
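For a concrete sense of what a capacity-based evaluation looks like, here is a minimal sketch of the discrete Choquet integral in Python. The outcomes, payoffs and capacity weights are all invented for illustration, and the sketch assumes nonnegative payoffs (negative values need the asymmetric variant):

```python
# Discrete Choquet integral: aggregate payoffs under a non-additive
# "capacity" (a monotone set function) instead of a probability measure.
def choquet(values, capacity):
    """values: dict mapping outcome -> nonnegative payoff.
    capacity: dict mapping frozenset of outcomes -> weight, with
    capacity[frozenset()] == 0 and monotone under set inclusion."""
    order = sorted(values, key=values.get)      # outcomes, ascending payoff
    total, prev = 0.0, 0.0
    for k, outcome in enumerate(order):
        upper = frozenset(order[k:])            # outcomes paying at least this much
        total += (values[outcome] - prev) * capacity[upper]
        prev = values[outcome]
    return total

# Made-up two-outcome example; note the capacity is non-additive:
# v({a}) + v({b}) = 0.8, but v({a, b}) = 1.0.
values = {"a": 10.0, "b": 4.0}
capacity = {
    frozenset(): 0.0,
    frozenset({"a"}): 0.5,
    frozenset({"b"}): 0.3,
    frozenset({"a", "b"}): 1.0,
}
result = choquet(values, capacity)
print(result)  # 4*1.0 + (10-4)*0.5 = 7.0
```

Because the capacity need not be additive, this lets you express pessimism or ambiguity aversion that a single probability measure can't - which is exactly why it's harder to extract a crisp recommendation from it.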


And I think when dealing with human action and institutions we 
are in a world of uncertainty more often than not.




It's not the prediction that matters but what you do.  It's 
habits, routines, perception, adaptation and actions that 
matter.


I agree they are integral to our behaviour, and habits and 
routines do not involve the element of prediction. Perceptions 
come before the decision process and actions take place after it 
(conscious or not), and so they don't factor into this 
discussion for me.


But it's a loop and one never takes a final decision to master D. 
Also habits, routines and structures _do_ shape perception.




In truth I avoid discussions that are really just arguing about 
definitions of words, but you made a couple of sweeping 
bumper-stickery comments


That's entertaining.  I've not been accused of that before!  Bear 
in mind also I tend to write on my phone.



that trying to predict things was
usually a waste of time and as an alternative we should 'be the 
change...'. I wholeheartedly agree we should 'be the change...' 
but it's not an alternative to making predictions, it goes hand 
in hand with it. I'm sure you've read Kahneman's Thinking, Fast 
and Slow. You made a generalisation that applies to the 'fast' 
part. I'm saying your universal rule is wrong because of the 
slow part.


Yes, I read papers by Kahneman et al. for the first time in '92 
in the university library.  I speed-read his book, and I thought 
it was a bad book.  I work with a specialist in making decisions 
under uncertainty - she was the only person able to articulate to 
George Soros how he made money, because he certainly couldn't, 
and she is mentioned in the preface to the revised version of 
Alchemy.  She has the same view as me - behavioural finance is 
largely a dead end.  One learns much more by going straight to 
the neuroeconomics and incorporating also the work of Dr Iain 
McGilchrist.


Kahneman makes a mistake in his choice of dimension.  There's 
analytic and there's intuitive/gestalt, and in my experience 
people making high-stakes decisions are much less purely 
analytical than a believer in the popular Kahneman might suggest.


What I said about prediction being overrated isn't controversial 
amongst a good number of the best traders and business people in 
finance.  You might read Nassim Taleb also.


I learnt D many years ago just after Andrei's book came out. I 
love it but it's on the shelf at the moment for me. I rarely 
get time for side projects these days but when I do I want them 
to run on Android with easy access to all the APIs and without 
too much ado in the build setup. They must continue to work and 
be supported with future versions of Android. At work, on 
Windows, JDK8/JavaFX/Eclipse/maven and 
python/numpy/Qt/OpenCascade/VTK hit the spot.


Well it's a pity the D Android ecosystem isn't yet mature.  Still 
I remain in awe of the stubborn accomplishment of the man (with 
help) who got LDC to run on Android.


It's not that bad calling D from Java.  Some day I will see if I 
can help automate that - Kai started working on it already I 
think.



Each project I
start I give some very hard thought about which development 
environment I'm going to use, and D is often one of those 
options. The likely future of D on the different platforms is 
an important part of that assessment, hence 'predicting' the 
future of D, hard and very unreliable though that is, is an 
important element in some of my less trivial decisions.


Since you already know D you need to answer a different question. 
 What's the chance the compiler will die on the relevant 

[OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Abdulhaq via Digitalmars-d

On Friday, 27 July 2018 at 23:42:47 UTC, Laeeth Isharc wrote:

For me, I think that managing money is about choosing to expose 
your capital intelligently to the market, balancing the risk of 
loss against the prospective gain and considering this in a 
portfolio sense.


Prediction doesn't really come into that



I think this apparent difference of opinion is down to different 
definitions of the word prediction. When I say prediction I mean 
the assessment of what are the possible futures for a scenario 
and how likely each one is. It can be conscious or unconscious. I 
think my understanding of the word is not an uncommon one.


When you balance the risk of loss (i.e. predict how likely you 
are to lose money) against the prospective gain (i.e. multiply 
the probability of each possible outcome by its reward and sum 
the total to get a prospective value), then you are, by my 
definition, making predictions.
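The probability-weighted sum described above can be written out in a few lines of Python. All the outcomes and numbers here are invented purely for illustration:

```python
# Toy expected-value calculation: weight each possible outcome's
# payoff by its probability and sum the results.
outcomes = [
    (0.6,  100.0),   # 60% chance: gain 100
    (0.3,  -50.0),   # 30% chance: lose 50
    (0.1, -200.0),   # 10% chance: lose 200
]

expected_value = sum(p * payoff for p, payoff in outcomes)
print(expected_value)  # 0.6*100 - 0.3*50 - 0.1*200 = 25.0
```

The computation is trivial; the prediction is in the inputs - someone had to assess the list of possible outcomes and assign each a probability before any balancing could happen.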




It's not the prediction that matters but what you do.  It's 
habits, routines, perception, adaptation and actions that 
matter.


I agree they are integral to our behaviour, and habits and 
routines do not involve the element of prediction. Perceptions 
come before the decision process and actions take place after it 
(conscious or not), and so they don't factor into this discussion 
for me.


In truth I avoid discussions that are really just arguing about 
definitions of words, but you made a couple of sweeping 
bumper-stickery comments that trying to predict things was 
usually a waste of time and as an alternative we should 'be the 
change...'. I wholeheartedly agree we should 'be the change...' 
but it's not an alternative to making predictions, it goes hand 
in hand with it. I'm sure you've read Kahneman's Thinking, Fast 
and Slow. You made a generalisation that applies to the 'fast' 
part. I'm saying your universal rule is wrong because of the slow 
part.


I learnt D many years ago just after Andrei's book came out. I 
love it but it's on the shelf at the moment for me. I rarely get 
time for side projects these days but when I do I want them to 
run on Android with easy access to all the APIs and without too 
much ado in the build setup. They must continue to work and be 
supported with future versions of Android. At work, on Windows, 
JDK8/JavaFX/Eclipse/maven and python/numpy/Qt/OpenCascade/VTK hit 
the spot. Each project I start I give some very hard thought 
about which development environment I'm going to use, and D is 
often one of those options. The likely future of D on the 
different platforms is an important part of that assessment, 
hence 'predicting' the future of D, hard and very unreliable 
though that is, is an important element in some of my less 
trivial decisions.