On Thursday, 10 March 2016 at 10:58:41 UTC, thedeemon wrote:
On Wednesday, 9 March 2016 at 15:14:02 UTC, Gerald Jansen wrote:

  enum n = 100_000_000; // some big number
  auto a = new ulong[](n);
  auto b = new char[8][](n);
  struct S { ulong x; char[8] y; }
  auto c = new S[](n);

will the large memory blocks allocated for a, b and/or c actually be scanned for pointers to GC-allocated memory during a garbage collection? If so, why?

I've just tested it with my GC tracker ( https://bitbucket.org/infognition/dstuff ): all 3 allocations are made with the flags APPENDABLE | NO_SCAN, which means these blocks will not be scanned.

But if you define S as
struct S { ulong x; char[] y; }
so that there is a pointer inside (a dynamic array is a length/pointer pair), then it gets allocated with just the APPENDABLE flag, i.e. it will be scanned.
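You can also verify this at runtime without an external tracker, using GC.getAttr from core.memory to inspect a block's attributes. A small sketch (the struct names here are just illustrations):

```d
import core.memory;
import std.stdio;

struct NoPtr  { ulong x; char[8] y; } // fixed-size array: no pointers
struct HasPtr { ulong x; char[]  y; } // dynamic array: contains a pointer

void main()
{
    auto a = new NoPtr[](1000);
    auto b = new HasPtr[](1000);

    // NO_SCAN set => the GC will not scan this block for pointers.
    writeln("NoPtr  NO_SCAN: ", (GC.getAttr(a.ptr) & GC.BlkAttr.NO_SCAN) != 0);
    writeln("HasPtr NO_SCAN: ", (GC.getAttr(b.ptr) & GC.BlkAttr.NO_SCAN) != 0);
}
```

The first line should report true and the second false, matching the tracker's output.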

Thanks for the very clear answer, and to Adam as well. This alleviates much of my fear of GC performance issues when processing largish datasets in memory with traditional loops, even with multiple threads. Of course, it means wasting some memory to avoid char[] fields, but that is often a reasonable trade-off for the kind of data I need to process.
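For anyone taking the fixed-size-array route, the conversion from string data is a one-liner worth wrapping. A minimal sketch (toFixed is a hypothetical helper name, not a library function):

```d
import std.algorithm : min;

// Copy a string into a fixed-size char buffer, truncating or
// zero-padding as needed, so the enclosing struct stays pointer-free.
char[8] toFixed(string s)
{
    char[8] buf = '\0';                 // zero-fill the whole buffer
    auto n = min(s.length, buf.length); // truncate overly long input
    buf[0 .. n] = s[0 .. n];
    return buf;
}
```

The padding wastes up to 7 bytes per field, but in exchange the whole array stays out of the GC's scan set.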
