Re: You don't like GC? Do you?

2018-10-16 Thread Stanislav Blinov via Digitalmars-d

On Tuesday, 16 October 2018 at 11:42:55 UTC, Tony wrote:

On Monday, 15 October 2018 at 08:21:11 UTC, Eugene Wissner wrote:



He doesn't argue against garbage collection.


Thanks, Eugene, I was starting to lose hope in humanity.


Well, can you state what he does argue against?


I did state what I was arguing against. If you actually read the 
thread instead of only picking out select statements, I'm sure 
you'll find it.


Wouldn't C++ or Rust, with their smart pointers, be a better 
choice for someone who wants to use a compiles-to-object-code 
language, but can't suffer any garbage collector delays?


What is up with people and this thread? Who is talking about 
garbage collector delays? If you do use the GC, they're a given, 
and you work with them. *Just like you should with everything 
else*.


I'm talking about code that doesn't give a damn about utilizing 
machine resources correctly. Crap out "new" everywhere, it lets 
you write code fast. Is it actually a good idea to collect here? 
Hell if you know, you don't care, carry on! Crap out classes 
everywhere, it lets you write code fast. Pull in a zillion 
external dependencies, 90% of which you have no idea what they're 
for, what they do, or how much cruft they bring with them; they 
let you write code fast. Oh look, you have no threads! Maybe you 
should write a, a... a task system! Yes, full of classes and, and 
futures and... stuff. But no, nononono, writing one takes too 
long, let's take a ready-made one. Implemented by another awesome 
programmer just like you! And then spawn... ah, a second thread! 
Yes! Two threads are better than one! What for? It doesn't matter 
what for, don't think about it. Better yet! Spawn four! Six! 
Twelve! And then serialize them all with one mutex, because to 
hell with learning that task system you downloaded, you have code 
to write. What did you say? Pointers? Nah, you have twelve 
threads and a mutex. Surely you need reference-counted objects. 
Pointers are bad for you, they will have you think...
Then, after this wobbly jellyfish of a pile of crud starts to 
swell and smell, start "optimizing" it. Profile first though. 
Profile, measure! Only first write more cruft in order to measure 
what needs to be measured, otherwise you might accidentally 
measure all those libXXX you used and all those cache misses you 
wrote. And then fix it. By doing more of the above, as luck would 
have it.


"Don't be a computer..." What a joke.


Re: You don't like GC? Do you?

2018-10-16 Thread Tony via Digitalmars-d

On Monday, 15 October 2018 at 08:21:11 UTC, Eugene Wissner wrote:

On Monday, 15 October 2018 at 05:26:56 UTC, Tony wrote:




Ideally you wouldn’t have chosen to even try D. You (and 
others who spend so much time arguing against garbage 
collection on a forum for a language designed with garbage 
collection) would be better off using a non-garbage collected 
language.


He doesn't argue against garbage collection.


Well, can you state what he does argue against?

And D is one of the few languages that can be used without 
garbage collection, so it can be a non-garbage collected 
language and can be used as such.


Wouldn't C++ or Rust, with their smart pointers, be a better 
choice for someone who wants to use a compiles-to-object-code 
language, but can't suffer any garbage collector delays?




Re: You don't like GC? Do you?

2018-10-15 Thread 12345swordy via Digitalmars-d
On Monday, 15 October 2018 at 20:22:54 UTC, Stanislav Blinov 
wrote:

Neither are of any particular use.


Pot called, he wants to see Mr. Kettle.


Re: You don't like GC? Do you?

2018-10-15 Thread Stanislav Blinov via Digitalmars-d

On Monday, 15 October 2018 at 20:12:47 UTC, 12345swordy wrote:
On Monday, 15 October 2018 at 19:57:59 UTC, Stanislav Blinov 
wrote:
If you want to have an argument, I suggest you stop quote 
mining and start paying attention.


If you wanted an argument from me, then you need to stop with 
the "LOL YOU MAD BRO" rhetoric.


...and again he grabs a single quote.

Look, so far all you've contributed to this thread is one poor 
excuse and a ton of victim act. Neither are of any particular use.


Re: You don't like GC? Do you?

2018-10-15 Thread 12345swordy via Digitalmars-d
On Monday, 15 October 2018 at 19:57:59 UTC, Stanislav Blinov 
wrote:
If you want to have an argument, I suggest you stop quote 
mining and start paying attention.


If you wanted an argument from me, then you need to stop with the 
"LOL YOU MAD BRO" rhetoric.


Re: You don't like GC? Do you?

2018-10-15 Thread Stanislav Blinov via Digitalmars-d

On Monday, 15 October 2018 at 18:00:24 UTC, 12345swordy wrote:
On Monday, 15 October 2018 at 17:30:28 UTC, Stanislav Blinov 
wrote:

On Monday, 15 October 2018 at 16:46:45 UTC, 12345swordy wrote:
On Monday, 15 October 2018 at 00:02:31 UTC, Stanislav Blinov 
wrote:

I'm arrogant, huh?

When you make statements like this:

you don't give a flying duck about your impact on the 
industry.

It comes across as condescending and arrogant.


Yep, and everything else that's inconvenient you'd just cut 
out.
You mean the part where you straw-man me and resort to personal 
attacks? No need for me to address it.


Pfff... *I* am strawmanning you? That's just hilarious.

"Not everything needs to be fast" - I never said everything needs 
to be fast. I'm saying everything *doesn't need to be slow* due 
to lazy people doing lazy things because they "don't want to 
think about it". So who's strawmanning whom, exactly? Do you even 
understand the difference?


By saying that you're more interested in saving your development 
time as opposed to processing time *for web apps, no less*, 
you're admitting that you don't care about the consequences of 
your actions. You finding it "personal" only supports that 
assessment, so be my guest, be offended. Or be smart, and stop 
and think about what you're doing.



Did I hit a nerve?..


Case in point.


If you want to have an argument, I suggest you stop quote mining 
and start paying attention.


Re: You don't like GC? Do you?

2018-10-15 Thread 12345swordy via Digitalmars-d
On Monday, 15 October 2018 at 17:30:28 UTC, Stanislav Blinov 
wrote:

On Monday, 15 October 2018 at 16:46:45 UTC, 12345swordy wrote:
On Monday, 15 October 2018 at 00:02:31 UTC, Stanislav Blinov 
wrote:

I'm arrogant, huh?

When you make statements like this:

you don't give a flying duck about your impact on the 
industry.

It comes across as condescending and arrogant.


Yep, and everything else that's inconvenient you'd just cut out.
You mean the part where you straw-man me and resort to personal 
attacks? No need for me to address it.



Did I hit a nerve?..


Case in point.


Re: You don't like GC? Do you?

2018-10-15 Thread Stanislav Blinov via Digitalmars-d

On Monday, 15 October 2018 at 16:46:45 UTC, 12345swordy wrote:
On Monday, 15 October 2018 at 00:02:31 UTC, Stanislav Blinov 
wrote:

I'm arrogant, huh?

When you make statements like this:


you don't give a flying duck about your impact on the industry.

It comes across as condescending and arrogant.


Yep, and everything else that's inconvenient you'd just cut out. 
Did I hit a nerve?..


Re: You don't like GC? Do you?

2018-10-15 Thread 12345swordy via Digitalmars-d
On Monday, 15 October 2018 at 00:02:31 UTC, Stanislav Blinov 
wrote:

I'm arrogant, huh?

When you make statements like this:


you don't give a flying duck about your impact on the industry.

It comes across as condescending and arrogant.



Re: You don't like GC? Do you?

2018-10-15 Thread Stanislav Blinov via Digitalmars-d

On Monday, 15 October 2018 at 10:11:15 UTC, rjframe wrote:

On Sat, 13 Oct 2018 12:22:29 +, Stanislav Blinov wrote:


And?.. Would you now go around preaching how awesome the GC is 
and that everyone should use it?


For something like what I did, yes.

The article the OP links to may want GC for everything; the 
excerpt the OP actually quoted is talking about applications 
where memory management isn't the most important thing. I 
completely agree with that excerpt.


Yeah, well, what's the title of this thread, and what's the 
conclusion of that post?


Automation is what literally all of us do. But you should not 
automate something you don't understand.


Re: You don't like GC? Do you?

2018-10-15 Thread rjframe via Digitalmars-d
On Sat, 13 Oct 2018 12:22:29 +, Stanislav Blinov wrote:

> On Saturday, 13 October 2018 at 12:15:07 UTC, rjframe wrote:
> 
>> ...I didn't even keep the script; I'll never need it again. There are
>> times when the easy or simple solution really is the best one for the
>> task at hand.
> 
> And?.. Would you now go around preaching how awesome the GC is and that
> everyone should use it?

For something like what I did, yes.

The article the OP links to may want GC for everything; the excerpt the OP 
actually quoted is talking about applications where memory management 
isn't the most important thing. I completely agree with that excerpt.


Re: You don't like GC? Do you?

2018-10-15 Thread Eugene Wissner via Digitalmars-d

On Monday, 15 October 2018 at 05:26:56 UTC, Tony wrote:
On Sunday, 14 October 2018 at 07:51:09 UTC, Stanislav Blinov 
wrote:


That's the lamest excuse I've ever seen. If you can't be 
bothered to acquire one of the most relevant skills for 
writing code for modern systems, then:


a) Ideally, you shouldn't be writing code
b) At the very least, you're not qualified to give any advice 
pertaining to writing code


PS. "Correctness" also includes correct use of the machine and 
its resources.


Ideally you wouldn’t have chosen to even try D. You (and others 
who spend so much time arguing against garbage collection on a 
forum for a language designed with garbage collection) would be 
better off using a non-garbage collected language.


He doesn't argue against garbage collection. And D is one of the 
few languages that can be used without a garbage collector, so 
it can be a non-garbage-collected language and can be used as 
such.


Re: You don't like GC? Do you?

2018-10-14 Thread Tony via Digitalmars-d
On Sunday, 14 October 2018 at 07:51:09 UTC, Stanislav Blinov 
wrote:


That's the lamest excuse I've ever seen. If you can't be 
bothered to acquire one of the most relevant skills for writing 
code for modern systems, then:


a) Ideally, you shouldn't be writing code
b) At the very least, you're not qualified to give any advice 
pertaining to writing code


PS. "Correctness" also includes correct use of the machine and 
its resources.


Ideally you wouldn’t have chosen to even try D. You (and others 
who spend so much time arguing against garbage collection on a 
forum for a language designed with garbage collection) would be 
better off using a non-garbage collected language.


Re: You don't like GC? Do you?

2018-10-14 Thread Stanislav Blinov via Digitalmars-d

On Sunday, 14 October 2018 at 20:26:10 UTC, 12345swordy wrote:

It's not an excuse, it's reality. The D language has multiple 
issues; having built-in support for GC is NOT one of them.


Read this thread again then, carefully. You *have to* understand 
D's GC in order to use it correctly, efficiently, and safely. And 
to do that, you *have to* understand your data and what you're 
doing with it. And to do that you *have to* work with the 
machine, not in spite of it. At which point you may well 
reconsider using the GC in the first place. Or you may not. But 
at least that will be an informed decision based on actual value, 
not this "save time" fallacy.


We develop our software using C# and the GC is a huge time 
saver for us as we are developing web apps.


So you're in this for a quick buck, and to hell with everything 
else. Got it. And C#, so likely also everything is an "object", 
and screw the heap wholesale, right?.. Save time writing code, 
waste time processing data. Cool choice.



I find your side remarks to be very arrogant and condescending.


I'm arrogant, huh? It's people like you who think that "the" way 
to program is to produce crappy code fast.


It's so funny how all of you guys seem to think that I'm against 
the GC. I'm not. I'm against stupid "advice" like the one given 
in the OP. Almost all of you seem like you're in the same boat: 
you don't give a flying duck about your impact on the industry.


Re: You don't like GC? Do you?

2018-10-14 Thread 12345swordy via Digitalmars-d
On Sunday, 14 October 2018 at 07:51:09 UTC, Stanislav Blinov 
wrote:

On Saturday, 13 October 2018 at 21:44:45 UTC, 12345swordy wrote:

Not everyone has the time or the skills for manual memory 
management. Even more so when correctness is way more 
important than speed.


Not everything needs to be fast.


That's the lamest excuse I've ever seen.


It's not an excuse, it's reality. The D language has multiple 
issues; having built-in support for GC is NOT one of them.
We develop our software using C# and the GC is a huge time saver 
for us as we are developing web apps.


I find your side remarks to be very arrogant and condescending.


Re: You don't like GC? Do you?

2018-10-14 Thread Stanislav Blinov via Digitalmars-d

On Saturday, 13 October 2018 at 21:44:45 UTC, 12345swordy wrote:

Not everyone has the time or the skills for manual memory 
management. Even more so when correctness is way more important 
than speed.


Not everything needs to be fast.


That's the lamest excuse I've ever seen. If you can't be 
bothered to acquire one of the most relevant skills for writing 
code for modern systems, then:


a) Ideally, you shouldn't be writing code
b) At the very least, you're not qualified to give any advice 
pertaining to writing code


PS. "Correctness" also includes correct use of the machine and 
its resources.


Re: You don't like GC? Do you?

2018-10-13 Thread 12345swordy via Digitalmars-d
On Saturday, 13 October 2018 at 14:43:22 UTC, Stanislav Blinov 
wrote:

On Saturday, 13 October 2018 at 13:17:41 UTC, Atila Neves wrote:


[...]


It rarely does indeed. Usually it's someone else that has to 
sift through your code and fix your bugs years later. Because 
by that time you're long gone on another job, happily writing 
more code without thinking about it.


[...]


Not everyone has the time or the skills for manual memory 
management. Even more so when correctness is way more important 
than speed.


Not everything needs to be fast.


Re: You don't like GC? Do you?

2018-10-13 Thread Stanislav Blinov via Digitalmars-d

On Saturday, 13 October 2018 at 13:17:41 UTC, Atila Neves wrote:

Then five years later, try and hunt down that mysterious heap 
corruption. Caused by some destructor calling into buggy 
third-party code. Didn't want to think about that one either?


That hasn't happened to me.


It rarely does indeed. Usually it's someone else that has to sift 
through your code and fix your bugs years later. Because by that 
time you're long gone on another job, happily writing more code 
without thinking about it.


There is no "sometimes" here. You're writing programs for 
specific machines. All. The. Time.


I am not. The last time I wrote code for a specific machine it 
was for my 386, probably around 1995.


Yes you are. Or what, you're running your executables on a 1990 
issue calculator? :P Somehow I doubt that.


Precisely where in memory your data is, how it got there and 
how it's laid out should be bread and butter of any D 
programmer.


Of any D programmer writing code that's performance sensitive.


All code is performance sensitive.


If that were true, nobody would write code in Python. And yet...


Nobody would write code in Python if Python didn't exist. That it 
exists means there's a demand. Because there are an awful lot of 
folks who just "don't want to think about it".
Remember the 2000s? Everybody and their momma was a developer. 
Web developer, Python, Java, take your pick. Not that they knew 
what they were doing, but it was a good time to peddle crap.
Now, Python in and of itself is not a terrible language. But 
people write *system* tools and scripts with it. WTF? I mean, if 
you couldn't care less how the machine works, you have *no* 
business developing *anything* for an OS.



If it's not speed, it's power consumption. Or memory. Or I/O.


Not if it's good enough as it is. Which, in my experience, 
is frequently the case. YMMV.


That is not a reason to intentionally write *less* efficient code.

"Not thinking" about any of that means you're treating your 
power champion horse as if it was a one-legged pony.


Yes. I'd rather the computer spend its time than I mine. I 
value the latter far more than the former.


And what if your code wastes someone else's time at some later 
point? Hell with it, not your problem, right?


Advocating the "not thinking" approach makes you an outright 
evil person.


Is there a meetup for evil people, now that I qualify? :P


Any gathering of like-minded programmers will do.


Re: You don't like GC? Do you?

2018-10-13 Thread Stanislav Blinov via Digitalmars-d

On Saturday, 13 October 2018 at 13:08:30 UTC, Atila Neves wrote:

On Friday, 12 October 2018 at 23:24:56 UTC, Stanislav Blinov


Funny. Now for real, in a throwaway script, what is there to 
gain from a GC? Allocate away and forget about it.


In case you run out of memory, the GC scans. That's the gain.


Correction, in case the GC runs out of memory, it scans.
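The correction being made can be sketched as follows (assuming the default druntime collector; `GC.stats` requires a reasonably recent druntime): D's GC doesn't run on a timer, it scans only when an allocation can't be satisfied from its existing pools, or when a collection is requested explicitly.

```d
import core.memory : GC;

void main()
{
    // Churn through garbage; collections happen only when the
    // GC cannot satisfy a request from its existing pools.
    foreach (i; 0 .. 100_000)
        cast(void) new ubyte[](64);

    auto s = GC.stats(); // bytes in use vs. free in the GC's pools
    assert(s.usedSize + s.freeSize > 0);

    GC.collect(); // the other way a scan happens: an explicit request
}
```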


In fact, the GC runtime will only detract from performance.

Demonstrably untrue. It puzzles me why this myth persists.

Myth, is it now?

Yes.


Please demonstrate.

Unless all you do is allocate memory, which isn't any kind of 
useful application, pretty much on each sweep run the GC's 
metadata is *cold*.


*If* the GC scans.


"If"? So... ahem... what, exactly, is the point of a GC that 
doesn't scan? What are you even arguing here? That you can 
allocate and never free? You can do that without GC just as well.
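That last point, sketched minimally: allocating and never freeing needs no GC at all. For a genuinely throwaway process the OS reclaims the whole address space at exit.

```d
import core.stdc.stdlib : malloc;

void main()
{
    // Allocate and never free: fine for a one-shot run,
    // since the OS reclaims everything at process exit.
    auto p = cast(int*) malloc(1024 * int.sizeof);
    if (p is null) return;
    p[0] = 42;
    assert(p[0] == 42);
    // no free(p), no GC, no scanning
}
```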


There are trade-offs, and one should pick whatever is best 
for the situation at hand.


Exactly. Which is *not at all* what the OP is encouraging to 
do.


I disagree. What I got from the OP was that for most code, the 
GC helps. I agree with that sentiment.


Helps write code faster? Yes, I'm sure it does. It also helps 
write slower unsafe code faster, unless you're paying attention, 
which, judging by your comments, you're not and aren't inclined 
to.


Alright, from one non-native English speaker to another, well 
done, I salute you.


The only way I'd qualify as a non-native English speaker would 
be to pedantically assert that I can't be due to not having 
learned it first. In any case, I'd never make fun of somebody's 
English if they're non-native, and that's most definitely not 
what I was trying to do here - I assume the words "simple" and 
"easy" exist in most languages. I was arguing about semantics.


Just FYI, they're the same word in my native language :P

To the point: *that* is a myth. The bugs you're referring to 
are not *solved* by the GC, they're swept under a rug.


Not in my experience. They've literally disappeared from the 
code I write.


Right. At the expense of introducing unpredictable behavior in 
your code. Unless you thought about that.


Because the bugs themselves are in the heads, stemming from 
that proverbial programmer laziness. It's like everyone is 
Scarlett O'Hara with a keyboard.



IMHO, lazy programmers are good programmers.


Yes, but not at the expense of users and other programmers who'd 
use their code.


For most applications, you *do* know how much memory you'll 
need, either exactly or an estimation.


I don't, maybe you do. I don't even care unless I have to. See 
my comment above about being lazy.


Too bad. You really, really should.

Well, I guess either of those does take more arguments than a 
"new", so yup, you do indeed write "less" code. Only that you 
have no clue how much more code is hiding behind that "new",


I have a clue. I could even look at the druntime code if I 
really cared. But I don't.


You should.

how many indirections, DLL calls, syscalls with libc's 
wonderful poison that is errno... You don't want to think 
about that.


That's right, I don't.


You should. Everybody should.

Then two people start using your script. Then ten, a hundred, 
a thousand. Then it becomes a part of an OS distribution. And 
no one wants to "think about that".


Meh. There are so many executables that are part of 
distributions that are written in Python, Ruby or JavaScript.


Exactly my point. That's why we *must not* pile more crap on top 
of that. That's why we *must* think about the code we write. Just 
because your neighbour sh*ts in a public square, doesn't mean 
that you must do that too.


For me, the power of tracing GC is that I don't need to think 
about ownership, lifetimes, or manual memory management.


Yes you do, don't delude yourself.


No, I don't. I used to in C++, and now I don't.


Yes you do, you say as much below.

Pretty much the only way you don't is if you're writing purely 
functional code.


I write pure functional code by default. I only use 
side-effects when I have to and I isolate the code that does.



But we're talking about D here.
Reassigned a reference? You thought about that. If you didn't, 
you just wrote a nasty bug. How much more hypocrisy can we 
reach here?


I probably didn't write a nasty bug if the pointer that was 
reassigned was to GC allocated memory. It lives as long as it 
has to, I don't think about it.


In other words, you knew what you were doing, at which point I'd 
ask, what's the problem with freeing the no-longer-used memory 
there and then? There's nothing to "think" about.


"Fun" fact: it's not @safe to "new" anything in D if your 
program uses any classes. Thing is, it does unconditionally 
thanks to DRuntime.


I hardly ever use classes in D, but I'd like to know more about 
why it's not @safe.


rikki's example isn't exactly the one I was talking about, so 
here goes:


module mycode;

import std.stdio;

import 

Re: You don't like GC? Do you?

2018-10-13 Thread rikki cattermole via Digitalmars-d

On 14/10/2018 2:08 AM, Atila Neves wrote:
"Fun" fact: it's not @safe to "new" anything in D if your program uses 
any classes. Thing is, it does unconditionally thanks to DRuntime.


I hardly ever use classes in D, but I'd like to know more about why it's 
not @safe.


void main() @safe {
    Foo foo = new Foo(8);
    foo.print();
}

class Foo {
    int x;

    this(int x) @safe {
        this.x = x;
    }

    void print() @safe {
        import std.stdio;

        try {
            writeln(x);
        } catch (Exception) {
        }
    }
}


Re: You don't like GC? Do you?

2018-10-13 Thread Atila Neves via Digitalmars-d
On Friday, 12 October 2018 at 23:35:19 UTC, Stanislav Blinov 
wrote:

On Friday, 12 October 2018 at 21:39:13 UTC, Atila Neves wrote:

D isn't Java. If you can, put your data on the stack. If you 
can't, `new` away and don't think about it.


Then five years later, try and hunt down that mysterious heap 
corruption. Caused by some destructor calling into buggy 
third-party code. Didn't want to think about that one either?


That hasn't happened to me.

I mean come on, it's 2018. We're writing code for multi-core 
and multi-processor systems with complex memory interaction.


Sometimes we are. Other times it's a 50 line script.


There is no "sometimes" here. You're writing programs for 
specific machines. All. The. Time.


I am not. The last time I wrote code for a specific machine it 
was for my 386, probably around 1995.


Precisely where in memory your data is, how it got there and 
how it's laid out should be bread and butter of any D 
programmer.


Of any D programmer writing code that's performance sensitive.


All code is performance sensitive.


If that were true, nobody would write code in Python. And yet...


If it's not speed, it's power consumption. Or memory. Or I/O.


Not if it's good enough as it is. Which, in my experience, is 
frequently the case. YMMV.


"Not thinking" about any of that means you're treating your 
power champion horse as if it was a one-legged pony.


Yes. I'd rather the computer spend its time than I mine. I value 
the latter far more than the former.


Advocating the "not thinking" approach makes you an outright 
evil person.


Is there a meetup for evil people, now that I qualify? :P

https://www.youtube.com/watch?v=FVAD3LQmxbw&t=42


Re: You don't like GC? Do you?

2018-10-13 Thread Atila Neves via Digitalmars-d
On Friday, 12 October 2018 at 23:24:56 UTC, Stanislav Blinov 
wrote:

On Friday, 12 October 2018 at 21:34:35 UTC, Atila Neves wrote:


---
When writing a throwaway script...


...there's absolutely no need for a GC.


True. There's also absolutely no need for computer languages 
either, machine code is sufficient.


Funny. Now for real, in a throwaway script, what is there to 
gain from a GC? Allocate away and forget about it.


In case you run out of memory, the GC scans. That's the gain.


In fact, the GC runtime will only detract from performance.



Demonstrably untrue. It puzzles me why this myth persists.


Myth, is it now?


Yes.

Unless all you do is allocate memory, which isn't any kind of 
useful application, pretty much on each sweep run the GC's 
metadata is *cold*.


*If* the GC scans.

There are trade-offs, and one should pick whatever is best for 
the situation at hand.


Exactly. Which is *not at all* what the OP is encouraging to do.


I disagree. What I got from the OP was that for most code, the GC 
helps. I agree with that sentiment.


Alright, from one non-native English speaker to another, well 
done, I salute you.


The only way I'd qualify as a non-native English speaker would be 
to pedantically assert that I can't be due to not having learned 
it first. In any case, I'd never make fun of somebody's English 
if they're non-native, and that's most definitely not what I was 
trying to do here - I assume the words "simple" and "easy" exist 
in most languages. I was arguing about semantics.


To the point: *that* is a myth. The bugs you're referring to 
are not *solved* by the GC, they're swept under a rug.


Not in my experience. They've literally disappeared from the code 
I write.


Because the bugs themselves are in the heads, stemming from 
that proverbial programmer laziness. It's like everyone is 
Scarlett O'Hara with a keyboard.


IMHO, lazy programmers are good programmers.

For most applications, you *do* know how much memory you'll 
need, either exactly or an estimation.


I don't, maybe you do. I don't even care unless I have to. See my 
comment above about being lazy.


Well, I guess either of those does take more arguments than a 
"new", so yup, you do indeed write "less" code. Only that you 
have no clue how much more code is hiding behind that "new",


I have a clue. I could even look at the druntime code if I really 
cared. But I don't.


how many indirections, DLL calls, syscalls with libc's 
wonderful poison that is errno... You don't want to think about 
that.


That's right, I don't.

Then two people start using your script. Then ten, a hundred, a 
thousand. Then it becomes a part of an OS distribution. And no 
one wants to "think about that".


Meh. There are so many executables that are part of distributions 
that are written in Python, Ruby or JavaScript.


For me, the power of tracing GC is that I don't need to think 
about ownership, lifetimes, or manual memory management.


Yes you do, don't delude yourself.


No, I don't. I used to in C++, and now I don't.

Pretty much the only way you don't is if you're writing purely 
functional code.


I write pure functional code by default. I only use side-effects 
when I have to and I isolate the code that does.



But we're talking about D here.
Reassigned a reference? You thought about that. If you didn't, 
you just wrote a nasty bug. How much more hypocrisy can we 
reach here?


I probably didn't write a nasty bug if the pointer that was 
reassigned was to GC allocated memory. It lives as long as it has 
to, I don't think about it.


"Fun" fact: it's not @safe to "new" anything in D if your 
program uses any classes. Thing is, it does unconditionally 
thanks to DRuntime.


I hardly ever use classes in D, but I'd like to know more about 
why it's not @safe.


Yes, there are other resources to manage. RAII nearly always 
manages that, I don't need to think about that either.


Yes you do. You do need to write those destructors or scoped 
finalizers, don't you? Or so help me use a third-party library 
that implements those? There's fundamentally *no* difference 
from memory management here. None, zero, zip.


I write a destructor once, then I never think about it again. 
It's a lot different from worrying about closing resources all 
the time. I don't write `scope(exit)` unless it's only once as 
well, otherwise I wrap the code in an RAII struct.
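A hedged sketch of that RAII-struct pattern (`ScopedFile` and the file path are illustrative names, not from the thread): the cleanup is written once in a destructor, and call sites never repeat it.

```d
import std.stdio : File;

// Write the cleanup once, then stop thinking about it at call sites.
struct ScopedFile
{
    File f;

    this(string path) { f = File(path, "w"); }

    ~this()
    {
        if (f.isOpen)
            f.close(); // deterministic cleanup, not GC-driven
    }

    @disable this(this); // single owner: no accidental handle copies
}

void main()
{
    {
        auto log = ScopedFile("example.log"); // illustrative path
        log.f.writeln("hello");
    } // destructor runs here; the file is closed
}
```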


Why is Socket a class, blown up from a puny 32-bit value to a 
bloated who-knows-how-many-bytes monstrosity? Will that socket 
close if you rely on the GC? Yes? No? Maybe? Why?


I don't know. I don't think Socket should even have been a class. 
I assume it was written in the D1 days.


Can I deploy the compiler on a remote machine with limited RAM 
and expect it to always successfully build my projects and not 
run out of memory?


If the compiler had the GC turned on, yes. That's not a point 
about GC, it's a point about dmd.





Re: You don't like GC? Do you?

2018-10-13 Thread Stanislav Blinov via Digitalmars-d

On Saturday, 13 October 2018 at 12:15:07 UTC, rjframe wrote:

...I didn't even keep the script; I'll never need it again. 
There are times when the easy or simple solution really is the 
best one for the task at hand.


And?.. Would you now go around preaching how awesome the GC is 
and that everyone should use it?


Re: You don't like GC? Do you?

2018-10-13 Thread rjframe via Digitalmars-d
On Fri, 12 Oct 2018 23:35:19 +, Stanislav Blinov wrote:

>>> Precisely where in memory your data is, how it got there and how it's
>>> laid out should be bread and butter of any D programmer.
>>
>> Of any D programmer writing code that's performance sensitive.
> 
> All code is performance sensitive. Whoever invented that distinction
> should be publicly humiliated. If it's not speed, it's power
> consumption. Or memory. Or I/O. "Not thinking" about any of that means
> you're treating your power champion horse as if it was a one-legged
> pony.

And sometimes it's programmer performance. Last year I had a malformed CSV 
I needed to manipulate; Excel couldn't handle it, and I couldn't (or don't 
know how to) trust a Vim macro to do it, so I wrote a small script in D. 
My design wasn't even close to high-performance, but it was easy to test 
(which was probably my biggest requirement); I probably could have spent 
another 30 minutes writing something that would have run two minutes 
faster, but that would have been inefficient.

I didn't even keep the script; I'll never need it again. There are times 
when the easy or simple solution really is the best one for the task at 
hand.


Re: You don't like GC? Do you?

2018-10-12 Thread Stanislav Blinov via Digitalmars-d

On Friday, 12 October 2018 at 23:32:34 UTC, Nicholas Wilson wrote:

On Friday, 12 October 2018 at 20:12:26 UTC, Stanislav Blinov


That's done first and foremost by stripping out unnecessary 
allocations, not by writing "new" every other line and closing 
your eyes.


If you need perf in your _scripts_, a) use LDC and b) pass -O3, 
which among many other improvements over baseline will promote 
unnecessary GC allocations to the stack.
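A sketch of the kind of allocation that optimization targets (hedged: whether the promotion actually fires depends on the LDC version and its inlining decisions; the file name is illustrative):

```d
// Compile with: ldc2 -O3 app.d
// The slice never escapes `sum`, so the optimizer may promote
// the GC allocation to the stack and elide the runtime call.
int sum()
{
    auto a = new int[](4); // candidate for stack promotion
    a[] = 1;
    int total = 0;
    foreach (x; a)
        total += x;
    return total;
}

void main()
{
    assert(sum() == 4);
}
```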


If you *need* perf, you write performant code. If you don't need 
perf, the least you can do is *not write* lazy-ass pessimized 
crap.


I mean come on, it's 2018. We're writing code for multi-core 
and multi-processor systems with complex memory interaction.


We might be sometimes. I suspect that is less likely for a 
script to fall in that category.


Jesus guys. *All* code falls in that category. Because it is 
being executed by those machines. Yet we all oh so like to 
pretend that doesn't happen, for some bizarre reason.


Precisely where in memory your data is, how it got there and 
how it's laid out should be bread and butter of any D 
programmer. It's true that it isn't critical for one-off 
scripts, but so is deallocation.


Saying stuff like "do more with GC" is just outright harmful.


That is certainly not an unqualified truth. Yes one shouldn't 
`new` stuff just for fun, but speed of executable is often not 
what one is trying to optimise when writing code, e.g. when 
writing a script one is probably trying to minimise 
development/debugging time.


That's fine so long as it doesn't unnecessarily *pessimize* 
execution. Unfortunately, when you advertise the GC for its 
awesomeness in your experience with "throwaway" scripts, you're 
sending a very, *very* wrong message.



Kids are reading, for crying out loud.
Oi, you think that's bad? Try reading what some of the other 
Aussies post, *cough* e.g. a frustrated Manu *cough*


:)


Re: You don't like GC? Do you?

2018-10-12 Thread Stanislav Blinov via Digitalmars-d

On Friday, 12 October 2018 at 21:39:13 UTC, Atila Neves wrote:

D isn't Java. If you can, put your data on the stack. If you 
can't, `new` away and don't think about it.


Then five years later, try and hunt down that mysterious heap 
corruption. Caused by some destructor calling into buggy 
third-party code. Didn't want to think about that one either?


The chances you'll have to optimise the code are not high. If 
you do, the chances that the GC allocations are the problem are 
also not high. If the profiler shows they are... then remove 
those allocations.


I mean come on, it's 2018. We're writing code for multi-core 
and multi-processor systems with complex memory interaction.


Sometimes we are. Other times it's a 50 line script.


There is no "sometimes" here. You're writing programs for 
specific machines. All. The. Time.


Precisely where in memory your data is, how it got there and 
how it's laid out should be bread and butter of any D 
programmer.


Of any D programmer writing code that's performance sensitive.


All code is performance sensitive. Whoever invented that 
distinction should be publicly humiliated. If it's not speed, 
it's power consumption. Or memory. Or I/O. "Not thinking" about 
any of that means you're treating your power champion horse as if 
it was a one-legged pony.
Advocating the "not thinking" approach makes you an outright evil 
person.


Re: You don't like GC? Do you?

2018-10-12 Thread Nicholas Wilson via Digitalmars-d
On Friday, 12 October 2018 at 20:12:26 UTC, Stanislav Blinov 
wrote:
On Friday, 12 October 2018 at 19:55:02 UTC, Nicholas Wilson 
wrote:


Freeing your mind and the codebase of having to deal with 
memory leaves it in an easier place to deal with the less 
common higher impact leaks: file descriptors, sockets, 
database handles etc. (this is like chopping down the forest 
so you can see the trees you care about ;) ).


That's done first and foremost by stripping out unnecessary 
allocations, not by writing "new" every other line and closing 
your eyes.


If you need perf in your _scripts_, a) use LDC and b) pass -O3, 
which among many other improvements over baseline will promote 
unnecessary garbage-collector allocations to the stack.
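
A minimal sketch of what that promotion amounts to (illustrative; 
the fixed-size array and `scope` local are my own examples, not 
from this thread):

```d
// Two ways data stays off the GC heap in D. With LDC's -O3, even a
// plain `new` whose reference provably never escapes the function
// may be promoted to a stack allocation automatically.
void main()
{
    int[256] buf;           // fixed-size array: lives on the stack, no GC
    buf[0] = 42;
    assert(buf[0] == 42);

    scope obj = new Object; // `scope` local: eligible for stack allocation,
                            // destroyed deterministically at scope exit
    assert(obj !is null);
}
```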


I mean come on, it's 2018. We're writing code for multi-core 
and multi-processor systems with complex memory interaction.


We might be sometimes. I suspect that is less likely for a script 
to fall in that category.


Precisely where in memory your data is, how it got there and 
how it's laid out should be bread and butter of any D 
programmer. It's true that it isn't critical for one-off 
scripts, but so is deallocation.


Saying stuff like "do more with GC" is just outright harmful.


That is certainly not an unqualified truth. Yes one shouldn't 
`new` stuff just for fun, but speed of executable is often not 
what one is trying to optimise when writing code, e.g. when 
writing a script one is probably trying to minimise 
development/debugging time.



Kids are reading, for crying out loud.


Oi, you think that's bad? Try reading what some of the other 
Aussies post, *cough* e.g. a frustrated Manu *cough*




Re: You don't like GC? Do you?

2018-10-12 Thread Stanislav Blinov via Digitalmars-d

On Friday, 12 October 2018 at 21:34:35 UTC, Atila Neves wrote:


---
When writing a throwaway script...


...there's absolutely no need for a GC.


True. There's also absolutely no need for computer languages 
either, machine code is sufficient.


Funny. Now for real, in a throwaway script, what is there to gain 
from a GC? Allocate away and forget about it.



In fact, the GC runtime will only detract from performance.



Demonstrably untrue. It puzzles me why this myth persists.


Myth, is it now? Unless all you do is allocate memory, which 
isn't any kind of useful application, pretty much on each sweep 
run the GC's metadata is *cold*. What's worse, you don't control 
how much data there is and where it is. Need I say more? If you 
disagree, please do the demonstration then.


There are trade-offs, and one should pick whatever is best for 
the situation at hand.


Exactly. Which is *not at all* what the OP is encouraging to do.

What this means is that whenever I have disregarded a block 
of information, say removed an index from an array, then that 
memory is automatically cleared and freed back up on the next 
sweep. While the process of collection and actually checking


Which is just as easily achieved with just one additional line 
of code: free the memory.


*Simply* achieved, not *easily*. Decades of bugs have shown 
emphatically that it's not easy.


Alright, from one non-native English speaker to another, well 
done, I salute you. I also used the term "dangling pointer" 
previously, where I should've used "non-null". Strange you didn't 
catch that.
To the point: *that* is a myth. The bugs you're referring to are 
not *solved* by the GC, they're swept under a rug. Because the 
bugs themselves are in the heads, stemming from that proverbial 
programmer laziness. It's like everyone is Scarlett O'Hara with a 
keyboard.


For most applications, you *do* know how much memory you'll need, 
either exactly or as an estimate. Garbage collection is useful for 
the cases when you don't or can't estimate, and even then only for 
a limited subset of those.



Don't be a computer. Do more with GC.


Writing a throwaway script there's nothing stopping you from 
using mmap or VirtualAlloc.


There is: writing less code to achieve the same result.


Well, I guess either of those does take more arguments than a 
"new", so yup, you do indeed write "less" code. Only that you 
have no clue how much more code is hiding behind that "new", how 
many indirections, DLL calls, syscalls with libc's wonderful 
poison that is errno... You don't want to think about that. Then 
two people start using your script. Then ten, a hundred, a 
thousand. Then it becomes a part of an OS distribution. And no 
one wants to "think about that".


The "power" of GC is in the language support for non-trivial 
types, such as strings and associative arrays. Plain old 
arrays don't benefit from it in the slightest.


For me, the power of tracing GC is that I don't need to think 
about ownership, lifetimes, or manual memory management.


Yes you do, don't delude yourself. Pretty much the only way you 
don't is if you're writing purely functional code. But we're 
talking about D here.
Reassigned a reference? You thought about that. If you didn't, 
you just wrote a nasty bug. How much more hypocrisy can we reach 
here?


"Fun" fact: it's not @safe to "new" anything in D if your program 
uses any classes. Thing is, it does unconditionally thanks to 
DRuntime.



I also don't have to please the borrow checker gods.


Yeah, that's another extremum. I guess "Rustaceans" or whatever 
the hell they call themselves are pushing that one, don't they? 
"Let's not go for a GC, let's straight up cut out whole paradigms 
for safety's sake..."


Yes, there are other resources to manage. RAII nearly always 
manages that, I don't need to think about that either.


Yes you do. You do need to write those destructors or scoped 
finalizers, don't you? Or so help me use a third-party library 
that implements those? There's fundamentally *no* difference from 
memory management here. None, zero, zip.
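
A minimal sketch of the kind of scoped finalizer being discussed 
(illustrative only; `UniqueFile` and the paths are my own names, 
not a Phobos type):

```d
import core.stdc.stdio : FILE, fopen, fclose;

// Minimal RAII wrapper: the destructor runs deterministically at
// scope exit, exactly like a memory-owning type would.
struct UniqueFile
{
    FILE* fp;
    this(const(char)* path, const(char)* mode) { fp = fopen(path, mode); }
    ~this() { if (fp) fclose(fp); fp = null; }
    @disable this(this); // forbid copies, so no accidental double-close
}

void main()
{
    auto f = UniqueFile("/tmp/example.txt", "w");
    // ... use f.fp ...
} // fclose runs here, whether or not we remembered to call it
```

Someone still had to write that destructor and the `@disable` 
line; the effort is the same kind as manual memory management.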


Sad thing is, you're not alone. Look at all the major OSs today. 
How long does it take to, I don't know, open a project in the 
Visual Studio on Windows? Or do a search in a huge file opened in 
'less' on Unix? On an octa-core 4GHz machine with 32GB of 3GHz 
memory? Should just instantly pop up on the screen, shouldn't it? 
Why doesn't it then? Because most programmers think the way you 
do: "oh it doesn't matter here, I don't need to think about 
that". And then proceed to advocate those "awesome" laid-back 
solutions that oh so help them save so much time coding. Of 
course they do, at everyone else's expense. Decades later, we're 
now trying to solve problems that shouldn't have existed in the 
first place. You'd think that joke was just that, a joke...


But let's get back to D. Look at Phobos. Why does stdout.writefln 
need to allocate? How many times does 

Re: You don't like GC? Do you?

2018-10-12 Thread Adam Wilson via Digitalmars-d

On 10/11/18 11:20 PM, JN wrote:

On Thursday, 11 October 2018 at 21:22:19 UTC, aberba wrote:

[snip]
That is fine, if you want to position yourself as competition to 
languages like Go, Java or C#. D wants to be a viable competitor to 
languages like C, C++ and Rust; as a result, there are use cases where GC 
might not be enough.


Does it though? The way I see it is that people who want to do what 
C/C++ does are going to use ... C/C++. The same goes for Java/C#. People 
who want to do what Java/C# do are pretty much just going to use 
Java/C#. And nothing D does is going to convince them that D is truly 
better.


For the C/C++ crowd, D's more involved semantics for non-GC code are 
ALWAYS going to be a turnoff. And for Java/C# people, D's less evolved 
standard library (and library ecosystem) is ALWAYS going to be a turnoff.


Where D shines is in its balance between the two extremes. If I want to 
attempt what C# can do using C++, I'm going to spend the next ten years 
writing code to replace what ships OOB in .NET. If I want to use C# as a 
systems language, I have to reinvent everything that C# relies on from 
the ground up, which will cost me about 10 years (see MSR's Singularity).


IMHO D should focus on being the best possible D it can be. If we take 
care of D, the rest will attend to itself.


--
Adam Wilson
IRC: LightBender
import quiet.dlang.dev;


Re: You don't like GC? Do you?

2018-10-12 Thread Stanislav Blinov via Digitalmars-d

On Friday, 12 October 2018 at 21:15:04 UTC, welkam wrote:

People in this thread mostly said that for some things GC is 
just awesome. When you need to get shit done fast and dirty GC 
saves time and mental capacity. Not all code deals with 
sockets, DB, bank transactions, multithreading, etc.


Read the OP again then. What message does it send? What broad 
conclusion does it draw from a niche use case?


Re: You don't like GC? Do you?

2018-10-12 Thread Atila Neves via Digitalmars-d
On Friday, 12 October 2018 at 20:12:26 UTC, Stanislav Blinov 
wrote:
On Friday, 12 October 2018 at 19:55:02 UTC, Nicholas Wilson 
wrote:


Freeing your mind and the codebase of having to deal with 
memory leaves it in an easier place to deal with the less 
common higher impact leaks: file descriptors, sockets, 
database handles etc. (this is like chopping down the forest 
so you can see the trees you care about ;) ).


That's done first and foremost by stripping out unnecessary 
allocations, not by writing "new" every other line and closing 
your eyes.


D isn't Java. If you can, put your data on the stack. If you 
can't, `new` away and don't think about it. The chances you'll 
have to optimise the code are not high. If you do, the chances 
that the GC allocations are the problem are also not high. If the 
profiler shows they are... then remove those allocations.


I mean come on, it's 2018. We're writing code for multi-core 
and multi-processor systems with complex memory interaction.


Sometimes we are. Other times it's a 50 line script.

Precisely where in memory your data is, how it got there and 
how it's laid out should be bread and butter of any D 
programmer.


Of any D programmer writing code that's performance sensitive.

It's true that it isn't critical for one-off  scripts, but so 
is deallocation.


We'll agree to disagree.


Saying stuff like "do more with GC" is just outright harmful.


Disagreement yet again.





Re: You don't like GC? Do you?

2018-10-12 Thread Atila Neves via Digitalmars-d
On Friday, 12 October 2018 at 16:26:49 UTC, Stanislav Blinov 
wrote:

On Thursday, 11 October 2018 at 21:22:19 UTC, aberba wrote:

"It takes care of itself
---
When writing a throwaway script...


...there's absolutely no need for a GC.


True. There's also absolutely no need for computer languages 
either, machine code is sufficient.



In fact, the GC runtime will only detract from performance.


Demonstrably untrue. It puzzles me why this myth persists. There 
are trade-offs, and one should pick whatever is best for the 
situation at hand.


What this means is that whenever I have disregarded a block of 
information, say removed an index from an array, then that 
memory is automatically cleared and freed back up on the next 
sweep. While the process of collection and actually checking


Which is just as easily achieved with just one additional line 
of code: free the memory.


*Simply* achieved, not *easily*. Decades of bugs have shown 
emphatically that it's not easy.



Don't be a computer. Do more with GC.


Writing a throwaway script there's nothing stopping you from 
using mmap or VirtualAlloc.


There is: writing less code to achieve the same result.

The "power" of GC is in the language support for non-trivial 
types, such as strings and associative arrays. Plain old arrays 
don't benefit from it in the slightest.


For me, the power of tracing GC is that I don't need to think 
about ownership, lifetimes, or manual memory management. I also 
don't have to please the borrow checker gods.


Yes, there are other resources to manage. RAII nearly always 
manages that, I don't need to think about that either.





Re: You don't like GC? Do you?

2018-10-12 Thread welkam via Digitalmars-d
On Friday, 12 October 2018 at 20:12:26 UTC, Stanislav Blinov 
wrote:
Saying stuff like "do more with GC" is just outright harmful. 
Kids are reading, for crying out loud.


People in this thread mostly said that for some things GC is just 
awesome. When you need to get shit done fast and dirty GC saves 
time and mental capacity. Not all code deals with sockets, DB, 
bank transactions, multithreading, etc.


Re: You don't like GC? Do you?

2018-10-12 Thread Stanislav Blinov via Digitalmars-d

On Friday, 12 October 2018 at 19:55:02 UTC, Nicholas Wilson wrote:

Freeing your mind and the codebase of having to deal with 
memory leaves it in an easier place to deal with the less 
common higher impact leaks: file descriptors, sockets, database 
handles ect. (this is like chopping down the forest so you can 
see the trees you care about ;) ).


That's done first and foremost by stripping out unnecessary 
allocations, not by writing "new" every other line and closing 
your eyes.


I mean come on, it's 2018. We're writing code for multi-core and 
multi-processor systems with complex memory interaction. 
Precisely where in memory your data is, how it got there and how 
it's laid out should be bread and butter of any D programmer. 
It's true that it isn't critical for one-off scripts, but so is 
deallocation.


Saying stuff like "do more with GC" is just outright harmful. 
Kids are reading, for crying out loud.


Re: You don't like GC? Do you?

2018-10-12 Thread Nicholas Wilson via Digitalmars-d
On Friday, 12 October 2018 at 19:43:02 UTC, Stanislav Blinov 
wrote:
On Friday, 12 October 2018 at 18:50:26 UTC, Neia Neutuladh 
wrote:


Over the lifetime of the script, it processed more memory than 
my computer had. That means I needed a memory management 
strategy other than "allocate everything". The GC made that 
quite easy.


Now *that* is a good point. Then again, until you run out of 
address space you're still fine with just plain old 
allocate-and-forget. Not that it's a good thing for production 
code, but for one-off scripts? Sure.


People demonstrably have trouble doing that. We can do it 
most of the time, but everyone occasionally forgets.


The GC isn't a cure for forgetfulness. One can also forget to 
close a file or a socket, or I dunno, cancel a financial 
transaction.


By lines of code, programs allocate memory much more often 
than they deal with files or sockets or financial 
transactions. So anything that requires less discipline when 
dealing with memory will reduce bugs a lot, compared with a 
similar system dealing with sockets or files.


My point is it's irrelevant whether it's memory allocation or 
something else. If you allow yourself to slack on important 
problems, that habit *will* bite you in the butt in the future.


Freeing your mind and the codebase of having to deal with memory 
leaves it in an easier place to deal with the less common higher 
impact leaks: file descriptors, sockets, database handles etc. 
(this is like chopping down the forest so you can see the trees 
you care about ;) ).


Re: You don't like GC? Do you?

2018-10-12 Thread Stanislav Blinov via Digitalmars-d

On Friday, 12 October 2018 at 19:06:36 UTC, Dejan Lekic wrote:

What a bunch of nonsense! I used to talk like this some 20 
years ago when all I saw in the computing world was C and C++...


Sure garbage collection is not for every project, depends what 
industry you are in I guess... In my case (business 
applications/services) I have never had the need to turn off 
garbage collection!


However, someone in the gaming industry, embedded or realtime 
systems would indeed need to turn off the GC...


Who said anything about turning it off? I'm pointing out that 
using the GC for the sake of simplicity is precisely the wrong 
reason to do so, that's it. Bunch of nonsense, right. Have fun 
writing sloppy code then.


Re: You don't like GC? Do you?

2018-10-12 Thread bachmeier via Digitalmars-d
On Friday, 12 October 2018 at 16:26:49 UTC, Stanislav Blinov 
wrote:

On Thursday, 11 October 2018 at 21:22:19 UTC, aberba wrote:

"It takes care of itself
---
When writing a throwaway script...


...there's absolutely no need for a GC. In fact, the GC runtime 
will only detract from performance.


For me, at least, spending an extra two weeks optimizing a 
program to eliminate that last 0.1 seconds of running time is not 
a good decision.


Re: You don't like GC? Do you?

2018-10-12 Thread Stanislav Blinov via Digitalmars-d

On Friday, 12 October 2018 at 18:50:26 UTC, Neia Neutuladh wrote:

Over the lifetime of the script, it processed more memory than 
my computer had. That means I needed a memory management 
strategy other than "allocate everything". The GC made that 
quite easy.


Now *that* is a good point. Then again, until you run out of 
address space you're still fine with just plain old 
allocate-and-forget. Not that it's a good thing for production 
code, but for one-off scripts? Sure.


People demonstrably have trouble doing that. We can do it 
most of the time, but everyone occasionally forgets.


The GC isn't a cure for forgetfulness. One can also forget to 
close a file or a socket, or I dunno, cancel a financial 
transaction.


By lines of code, programs allocate memory much more often than 
they deal with files or sockets or financial transactions. So 
anything that requires less discipline when dealing with memory 
will reduce bugs a lot, compared with a similar system dealing 
with sockets or files.


My point is it's irrelevant whether it's memory allocation or 
something else. If you allow yourself to slack on important 
problems, that habit *will* bite you in the butt in the future.
But the other end of the spectrum is also harmful. That's how we 
get those "good" APIs such as XCB that fragment the hell out of 
your heap, force libc on you and make you collect their garbage.


It's good enough for a lot of people most of the time without 
thinking about things much.


That's precisely the line of thinking that gave us Java, C#, 
Python and other bastard languages that didn't want to concern 
themselves with the hardware all that much. 30 years of 
"progress" down the drain.


It reduces the frequency of problems and it eliminates 
use-after-free


Not in D it doesn't. Unless you only ever write @safe code, in 
which case you're not in the "without thinking about things much" 
camp.


and double-free, which are sources of data corruption, which is 
hard to track down.


Agreed.

And in the context of a one-off script, I'm probably not going 
to worry about using the GC efficiently as long as I'm not 
running out of memory.


Sure, *that's* the appropriate message. Not the "use the GC, it's 
not as bad as you think".


If you "forget" who owns the data, you may as well "forget" 
who writes it and when. Would GC help then as well? You need 
to expend pretty much the same effort to track that.


That's why we have the const system.


Oh please, really? Const in D? And you're still talking about 
people that don't like to think about things much?


Re: You don't like GC? Do you?

2018-10-12 Thread Dejan Lekic via Digitalmars-d
On Friday, 12 October 2018 at 16:26:49 UTC, Stanislav Blinov 
wrote:

On Thursday, 11 October 2018 at 21:22:19 UTC, aberba wrote:

"It takes care of itself
---
When writing a throwaway script...


...there's absolutely no need for a GC. In fact, the GC runtime 
will only detract from performance.


What this means is that whenever I have disregarded a block of 
information, say removed an index from an array, then that 
memory is automatically cleared and freed back up on the next 
sweep. While the process of collection and actually checking


Which is just as easily achieved with just one additional line 
of code: free the memory.



Don't be a computer. Do more with GC.


Writing a throwaway script there's nothing stopping you from 
using mmap or VirtualAlloc. The "power" of GC is in the 
language support for non-trivial types, such as strings and 
associative arrays. Plain old arrays don't benefit from it in 
the slightest.


What a bunch of nonsense! I used to talk like this some 20 years 
ago when all I saw in the computing world was C and C++...


Sure garbage collection is not for every project, depends what 
industry you are in I guess... In my case (business 
applications/services) I have never had the need to turn off 
garbage collection!


However, someone in the gaming industry, embedded or realtime 
systems would indeed need to turn off the GC...


Re: You don't like GC? Do you?

2018-10-12 Thread Neia Neutuladh via Digitalmars-d

On 10/12/2018 11:14 AM, Stanislav Blinov wrote:

On Friday, 12 October 2018 at 17:31:30 UTC, Neia Neutuladh wrote:

Throwaway scripts can allocate a lot of memory and have nontrivial 
running times. It's less common for scripts than for long-running 
processes, granted, but I've written scripts to go through gigabytes 
of data.


Your point being?.. It's not like you need a GC to allocate gigabytes of 
storage. With D it's super easy to just allocate a huge hunk and simply 
(literally) slice through it.


Over the lifetime of the script, it processed more memory than my 
computer had. That means I needed a memory management strategy other 
than "allocate everything". The GC made that quite easy.


People demonstrably have trouble doing that. We can do it most of the 
time, but everyone occasionally forgets.


The GC isn't a cure for forgetfulness. One can also forget to close a 
file or a socket, or I dunno, cancel a financial transaction.


By lines of code, programs allocate memory much more often than they 
deal with files or sockets or financial transactions. So anything that 
requires less discipline when dealing with memory will reduce bugs a 
lot, compared with a similar system dealing with sockets or files.


GC isn't magic. In fact, to use it correctly you need to pay *more* 
attention than when managing memory manually. Don't leave dangling 
pointers. Nurse uninitialized data. Massage it to not sweep in hot 
paths... People seem to forget that and advertise it as some sort of 
magic wand that does all you want without you having to think.


It's good enough for a lot of people most of the time without thinking 
about things much. It reduces the frequency of problems and it 
eliminates use-after-free and double-free, which are sources of data 
corruption, which is hard to track down.


And in the context of a one-off script, I'm probably not going to worry 
about using the GC efficiently as long as I'm not running out of memory.


Beyond that, the concept you're failing to mention here is ownership. 
You need to use your own mental effort to figure out what memory is 
owned by what part of the code. The GC lets you ignore that.


Nope, it doesn't. If you "forget" who owns the data, you may as well 
"forget" who writes it and when. Would GC help then as well? You need to 
expend pretty much the same effort to track that.


That's why we have the const system.


Re: You don't like GC? Do you?

2018-10-12 Thread Stanislav Blinov via Digitalmars-d

On Friday, 12 October 2018 at 17:31:30 UTC, Neia Neutuladh wrote:

Throwaway scripts can allocate a lot of memory and have 
nontrivial running times. It's less common for scripts than for 
long-running processes, granted, but I've written scripts to go 
through gigabytes of data.


Your point being?.. It's not like you need a GC to allocate 
gigabytes of storage. With D it's super easy to just allocate a 
huge hunk and simply (literally) slice through it.
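
A sketch of that "huge hunk" approach (sizes and names are mine, 
purely illustrative):

```d
void main()
{
    // One big allocation up front...
    auto pool = new ubyte[](64 * 1024 * 1024);

    // ...then carve it up by slicing. A D slice is just a
    // (pointer, length) pair into the pool, so this allocates nothing.
    ubyte[] header  = pool[0 .. 4096];
    ubyte[] payload = pool[4096 .. 1_048_576];

    assert(header.ptr == pool.ptr);
    assert(payload.length == 1_048_576 - 4096);
}
```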


People demonstrably have trouble doing that. We can do it most 
of the time, but everyone occasionally forgets.


The GC isn't a cure for forgetfulness. One can also forget to 
close a file or a socket, or I dunno, cancel a financial 
transaction.
GC isn't magic. In fact, to use it correctly you need to pay 
*more* attention than when managing memory manually. Don't leave 
dangling pointers. Nurse uninitialized data. Massage it to not 
sweep in hot paths... People seem to forget that and advertise it 
as some sort of magic wand that does all you want without you 
having to think.


Beyond that, the concept you're failing to mention here is 
ownership. You need to use your own mental effort to figure out 
what memory is owned by what part of the code. The GC lets you 
ignore that.


Nope, it doesn't. If you "forget" who owns the data, you may as 
well "forget" who writes it and when. Would GC help then as well? 
You need to expend pretty much the same effort to track that.


Writing a throwaway script there's nothing stopping you from 
using mmap or VirtualAlloc. The "power" of GC is in the 
language support for non-trivial types, such as strings and 
associative arrays. Plain old arrays don't benefit from it in 
the slightest.


A string is a plain old array.


An ASCII string, perhaps. Not a Unicode one. Count 
statically-typed compiled languages with native strings, please.


 and languages with manual memory management also support 
associative arrays.


Of course they do. But again, are those built-in types?


Re: You don't like GC? Do you?

2018-10-12 Thread Neia Neutuladh via Digitalmars-d

On 10/12/2018 09:26 AM, Stanislav Blinov wrote:

On Thursday, 11 October 2018 at 21:22:19 UTC, aberba wrote:

"It takes care of itself
---
When writing a throwaway script...


...there's absolutely no need for a GC. In fact, the GC runtime will 
only detract from performance.


Throwaway scripts can allocate a lot of memory and have nontrivial 
running times. It's less common for scripts than for long-running 
processes, granted, but I've written scripts to go through gigabytes of 
data.


What this means is that whenever I have disregarded a block of 
information, say removed an index from an array, then that memory is 
automatically cleared and freed back up on the next sweep. While the 
process of collection and actually checking


Which is just as easily achieved with just one additional line of code: 
free the memory.


People demonstrably have trouble doing that. We can do it most of the 
time, but everyone occasionally forgets.


Beyond that, the concept you're failing to mention here is ownership. 
You need to use your own mental effort to figure out what memory is 
owned by what part of the code. The GC lets you ignore that.



Don't be a computer. Do more with GC.


Writing a throwaway script there's nothing stopping you from using mmap 
or VirtualAlloc. The "power" of GC is in the language support for 
non-trivial types, such as strings and associative arrays. Plain old 
arrays don't benefit from it in the slightest.


A string is a plain old array, and languages with manual memory 
management also support associative arrays.


Re: You don't like GC? Do you?

2018-10-12 Thread Stanislav Blinov via Digitalmars-d

On Thursday, 11 October 2018 at 21:22:19 UTC, aberba wrote:

"It takes care of itself
---
When writing a throwaway script...


...there's absolutely no need for a GC. In fact, the GC runtime 
will only detract from performance.


What this means is that whenever I have disregarded a block of 
information, say removed an index from an array, then that 
memory is automatically cleared and freed back up on the next 
sweep. While the process of collection and actually checking


Which is just as easily achieved with just one additional line of 
code: free the memory.
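
For what it's worth, the "one additional line" in D can look like 
this (illustrative sketch; only valid if nothing else still 
references the block):

```d
import core.memory : GC;

void main()
{
    auto data = new int[](1_000_000); // GC heap allocation
    data[] = 1;
    // ... use data ...
    GC.free(data.ptr); // the one extra line; `data` is dead after this
}
```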



Don't be a computer. Do more with GC.


Writing a throwaway script there's nothing stopping you from 
using mmap or VirtualAlloc. The "power" of GC is in the language 
support for non-trivial types, such as strings and associative 
arrays. Plain old arrays don't benefit from it in the slightest.
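
A POSIX-only sketch of the mmap route (assumes a Linux-style 
druntime where `MAP_ANON` is defined; the naming varies by 
platform):

```d
import core.sys.posix.sys.mman;

void main()
{
    enum size = 1 << 20; // 1 MiB of anonymous pages, no GC involved
    void* p = mmap(null, size, PROT_READ | PROT_WRITE,
                   MAP_PRIVATE | MAP_ANON, -1, 0);
    assert(p != MAP_FAILED);

    auto buf = (cast(ubyte*) p)[0 .. size]; // wrap the raw pages in a slice
    buf[0] = 42;

    munmap(p, size);
}
```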




Re: You don't like GC? Do you?

2018-10-12 Thread JN via Digitalmars-d

On Thursday, 11 October 2018 at 21:22:19 UTC, aberba wrote:
When writing a throwaway script that I will only use a handful 
of times, optimising that code isn’t necessarily high on my 
priority list. The priority is to get it written, and get it 
running. That’s where the V8 (C++) engine that NodeJS is 
compiled into throws you a bone.




That is fine, if you want to position yourself as competition to 
languages like Go, Java or C#. D wants to be a viable competitor 
to languages like C, C++ and Rust; as a result, there are 
use cases where GC might not be enough. Also, the quoted part 
mentions throwaway scripts, which D can be used for, but most 
people would use Python or Node.JS like in the article instead.


You don't like GC? Do you?

2018-10-11 Thread aberba via Digitalmars-d

"It takes care of itself
---
When writing a throwaway script that I will only use a handful of 
times, optimising that code isn’t necessarily high on my priority 
list. The priority is to get it written, and get it running. 
That’s where the V8 (C++) engine that NodeJS is compiled into 
throws you a bone.


When you have no choice but to call arrays into memory and 
manipulate them, sometimes very very large arrays, you can begin 
to worry about the state of your machine and the amount of memory 
that is being used. Luckily, V8 handles automatic garbage 
collection.


What this means is that whenever I have disregarded a block of 
information, say removed an index from an array, then that memory 
is automatically cleared and freed back up on the next sweep. 
While the process of collection and actually checking can be a 
bit intensive, it means when I am quickly iterating through code 
I don’t need to pay a tremendous amount of attention to my memory 
management, and I can entrust V8 to handle all the little 
nuances."


Don't be a computer. Do more with GC.

https://medium.com/@kieranmaher13/why-i-use-nodejs-for-basically-everything-i-do-e0a627787ecc