Re: Subtyping of an enum
On Monday, 15 April 2019 at 14:20:57 UTC, Alex wrote: On Monday, 15 April 2019 at 08:39:24 UTC, Anton Fediushin wrote: Hello! I am currently trying to add a custom `toString` method to an enum so that:

1. Enum members still have numeric values and can easily be compared (things like `enum a { foo = "FOO", bar = "BAR" }` won't do, I want `a.foo < a.bar`)
2. More custom methods can be implemented in the future

The obvious solution is to wrap the enum in a struct and utilize 'alias this' for subtyping like this:

```
struct Enum {
    private {
        enum internal { foo, bar }
        internal m_enum;
    }
    this(internal i) { m_enum = i; }
    alias m_enum this;
    string toString() {
        // custom implementation of toString
    }
}
```

This seems to work just fine for assignment and comparison, but passing Enum as a function argument does not work:

```
void fun(Enum e) {}
fun(Enum.foo);
---
Error: function fun(Enum e) is not callable using argument types (internal)
Cannot pass argument foo of type internal to parameter Enum e.
```

Of course, I could just define a bunch of functions that accept my enum as the first argument and call them using UFCS, but then I'd have to call those functions explicitly instead of D taking care of it (toString() for structs is called automagically by functions like writeln), and those functions would hang around here and there, totally unorganized. I prefer to keep functions inside structs and classes. If there are other ways of achieving the same *and* keeping code clean and organized, please share. Thank you in advance, Anton.
yes,

```
import std.stdio, std.meta, std.traits, std.conv;

enum _MyEnum : int { a, b, c }

struct _Enum(T) {
    T value;
    alias value this;

    // generate static field members
    static foreach(e, v; EnumMembers!T) {
        pragma(msg, "static MyEnum "~to!string(v)~" = MyEnum(T."~to!string(v)~");");
        mixin("static MyEnum "~to!string(v)~" = cast(MyEnum)(T."~to!string(v)~");");
    }
}

alias _Enum!_MyEnum MyEnum;

void foo(MyEnum e) { writeln(to!int(e)); }

void main() {
    foo(MyEnum.a);
    foo(MyEnum.b);
    foo(MyEnum.c);
}
```

https://run.dlang.io/is/WOcLrZ

Note that `value` is never used; it just makes the cast work and treats the struct as an enum. Not sure if there is a way around that.

Thank you, this is the solution I have been looking for!
Re: Subtyping of an enum
On Monday, 15 April 2019 at 14:11:05 UTC, diniz wrote: On 15/04/2019 at 10:39, Anton Fediushin via Digitalmars-d-learn wrote: [snip]

I don't understand why you just don't call fun with an Enum (struct) param, since that is how fun is defined. This works for me (see the call in main):

```
struct Enum {
    private {
        enum internal { foo, bar }
        internal m_enum;
    }
    this(internal i) { m_enum = i; }
    alias m_enum this;
    string toString() {
        switch (this.m_enum) {
            case internal.foo: return "FOO";
            case internal.bar: return "BAR";
            default: assert(0);
        }
    }
}

void fun(Enum e) { writeln(e); }

void main() {
    auto e = Enum(Enum.foo);
    fun(e); // -> "FOO"
}
```

[And I wouldn't make the enum (little e) private, this just risks complicating code, also e.g. in testing; I would just not export it.]

`fun(Enum(Enum.foo));` would obviously work, but `fun(Enum.foo);` would not. `Enum(Enum` is just redundant; I was looking for a solution that would make the code cleaner. I don't see a problem with marking the internal enum as private, because it isn't accessed outside of the struct.
Re: Subtyping of an enum
On Monday, 15 April 2019 at 12:25:38 UTC, XavierAP wrote: On Monday, 15 April 2019 at 10:34:42 UTC, Anton Fediushin wrote: On Monday, 15 April 2019 at 10:06:30 UTC, XavierAP wrote: [snip]

Isn't this how subtyping works for integers and other types? For example, you could subtype an integer and add some new methods to it?

Yes (leaving aside whether stuff is private or nested), but you are using the types' relationship the other way around. You have `static assert(is(Enum : internal));`, but you are defining and calling fun() as if it were the other way around (`internal : Enum`).

Thank you, I understand now! I think I'll have to stick with the UFCS method.
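The direction XavierAP describes can be checked directly with `is()` expressions; here is a minimal, self-contained sketch (the type names `Base` and `Wrapper` are illustrative, not from the thread):

```d
// A struct with `alias this` implicitly converts *to* the aliased member's
// type (Wrapper : Base), but the reverse conversion does not exist.
enum Base { foo, bar }

struct Wrapper {
    Base value;
    alias value this;
}

// Wrapper implicitly converts to Base...
static assert(is(Wrapper : Base));
// ...but Base does not implicitly convert to Wrapper.
static assert(!is(Base : Wrapper));

void takesBase(Base b) {}

void main() {
    auto w = Wrapper(Base.foo);
    takesBase(w); // fine: Wrapper -> Base via alias this
    // void takesWrapper(Wrapper w) {}
    // takesWrapper(Base.foo); // would not compile: Base -> Wrapper is not implicit
}
```

This is exactly why `fun(Enum e)` rejects `Enum.foo`: the member is an `internal`, and `internal` has no implicit path up to `Enum`.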
Re: Subtyping of an enum
On Monday, 15 April 2019 at 10:45:26 UTC, Alex wrote: On Monday, 15 April 2019 at 10:15:50 UTC, Anton Fediushin wrote: On Monday, 15 April 2019 at 10:00:36 UTC, Alex wrote: [snip]

This would:

```
struct Enum {
    private {
        enum internal { foo, bar }
        internal m_enum;
    }
    this(internal i) { m_enum = i; }
    alias m_enum this;
    string toString() {
        if(m_enum == internal.foo) return "FOO";
        else return "BAR";
    }
}

void fun(Enum e) {}

void main() {
    import std.stdio;
    fun(Enum.init);
    Enum a = Enum.foo;
    Enum b = Enum.bar;
    assert(a == Enum.foo);
    assert(a < b);
    assert(a.toString == "FOO");
    assert(b.toString == "BAR");
    writeln(a); // FOO
    writeln(b); // BAR
}
```

Assuming that automatic generation of "FOO" from foo was not part of your question :-p

This does work, unless I want to use it like this:

```
fun(Enum.foo);
---
Error: function fun(Enum e) is not callable using argument types (internal)
cannot pass argument foo of type internal to parameter Enum e
```
Re: Subtyping of an enum
On Monday, 15 April 2019 at 10:06:30 UTC, XavierAP wrote: On Monday, 15 April 2019 at 08:39:24 UTC, Anton Fediushin wrote: Hello! I am currently trying to add a custom `toString` method

Several remarks... First of all, strings can be compared (alphabetically) just as well as integers, e.g. `assert("foo" > "bar")`. Perhaps not your use case, but worth noting.

I already know that, but defining the enum with strings would break my code: `assert(Enum.foo < Enum.bar);` would never succeed.

You have defined your subtyping the opposite way from how you wanted it to work: every `Enum` is an `internal`, but the other way around, an `internal` may not work as an `Enum`. Your `fun` would in principle work if it were defined with an `internal` but passed an `Enum`... Of course you have defined `internal` as nested and private, so no... But then how did you want anything to work if no one outside Enum knows the super-type?

Isn't this how subtyping works for integers and other types? For example, you could subtype an integer and add some new methods to it?

You obviously need to re-think your problem and your design :)

The problem here is that I want to keep the methods that are related to an enum inside that enum, for purely aesthetic and organizational purposes.

The obvious solution is to wrap an enum in a structure and utilize 'alias this' for subtyping like this:

Actually the obvious solution (not sure if it otherwise works for you) would be to take advantage of D's Uniform Function Call Syntax [1] and define toString as a global function that can be called as a method:

```
enum Fubar { foo, bar }

string toString(Fubar fb) { return "It works."; }

void main() {
    import std.stdio;
    writeln(Fubar.foo.toString);
}
```

[1] https://tour.dlang.org/tour/en/gems/uniform-function-call-syntax-ufcs

This is what I am doing now; I was just curious if there was a better solution. These global functions pollute the global namespace.
I know that I could put them into their own module and 'public import' just my enum, because these methods would rarely be used by the other components of my application. If I did want to use them, though, I'd have to import that ugly module and never forget to call `writeln(a.toString)` instead of `writeln(a)`, or else it won't do what I want. And once more, what I want to achieve is purely about overall code organization. I mean, if structs (data) can have functions (methods) that are performed on them, why can't an enum (a single value) have its own methods performed on it?
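For what it's worth, the "own module + UFCS" approach described above can be sketched as follows (the file/module name and identifiers are hypothetical; a name other than `toString` is used to sidestep writeln's special handling of `toString`):

```d
// myenum.d -- hypothetical module keeping the enum and its UFCS helpers together
enum Color { red, green, blue }

// Called as c.label thanks to UFCS; an ordinary free function otherwise.
string label(Color c) {
    final switch (c) {
        case Color.red:   return "RED";
        case Color.green: return "GREEN";
        case Color.blue:  return "BLUE";
    }
}

unittest {
    assert(Color.red.label == "RED");    // UFCS call, reads like a method
    assert(label(Color.blue) == "BLUE"); // plain call, same function
}
```

Client code then imports this one module and writes `Color.red.label`; the drawback noted above remains, namely that `writeln(c)` still prints the plain member name rather than the custom string.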
Re: Subtyping of an enum
On Monday, 15 April 2019 at 10:00:36 UTC, Alex wrote: On Monday, 15 April 2019 at 08:39:24 UTC, Anton Fediushin wrote: [snip]

Otherwise, you could always define fun as

```
void fun(Enum.internal e) {}
```

but I assume you want to avoid exactly this. Another point in favor of my first proposition is that Enum.foo is somewhat awkward w.r.t. your question, as you treat the internal enum as a static member. Was this intended?

Enum.internal is private to make it inaccessible from any other place. All I want is a way to have an enum that I can extend with my own methods. Something to make the following code work:

```
Enum a = Enum.foo;
Enum b = Enum.bar;
assert(a == Enum.foo);
assert(a < b);
assert(a.toString == "FOO");
assert(b.toString == "BAR");
writeln(a); // FOO
writeln(b); // BAR
```
Subtyping of an enum
Hello! I am currently trying to add a custom `toString` method to an enum so that:

1. Enum members still have numeric values and can easily be compared (things like `enum a { foo = "FOO", bar = "BAR" }` won't do, I want `a.foo < a.bar`)
2. More custom methods can be implemented in the future

The obvious solution is to wrap the enum in a struct and utilize 'alias this' for subtyping like this:

```
struct Enum {
    private {
        enum internal { foo, bar }
        internal m_enum;
    }
    this(internal i) { m_enum = i; }
    alias m_enum this;
    string toString() {
        // custom implementation of toString
    }
}
```

This seems to work just fine for assignment and comparison, but passing Enum as a function argument does not work:

```
void fun(Enum e) {}
fun(Enum.foo);
---
Error: function fun(Enum e) is not callable using argument types (internal)
Cannot pass argument foo of type internal to parameter Enum e.
```

Of course, I could just define a bunch of functions that accept my enum as the first argument and call them using UFCS, but then I'd have to call those functions explicitly instead of D taking care of it (toString() for structs is called automagically by functions like writeln), and those functions would hang around here and there, totally unorganized. I prefer to keep functions inside structs and classes. If there are other ways of achieving the same *and* keeping code clean and organized, please share. Thank you in advance, Anton.
Re: High memory usage in vibe.d application
On Sunday, 1 July 2018 at 20:15:02 UTC, crimaniak wrote: On Sunday, 1 July 2018 at 13:44:23 UTC, Anton Fediushin wrote: I reduced the test case to _one_ line:

```
1.seconds.setTimer(() => "http://google.com".requestHTTP((scope req) {}, (scope res) {res.disconnect;}), true);
```

What happens is `res.disconnect` doesn't free all of the internal buffers, causing leakage. One way to avoid that is to call `res.dropBody`, but it isn't always wanted (just like in my example).

The problem is known and mentioned in the documentation: http://vibed.org/api/vibe.http.client/requestHTTP "Note that it is highly recommended to use one of the overloads that take a responder callback, as they can avoid some memory allocations and are safe against accidentally leaving stale response objects (objects whose response body wasn't fully read). For the returning overloads of the function it is recommended to put a scope(exit) right after the call in which HTTPClientResponse.dropBody is called to avoid this." As I understand the situation, the request object will reside in memory until you fully read, or do something with, the response body.

It says so "for the returning overloads". The callback-based ones should be "safe against accidentally leaving stale response objects". Actually, in this example I don't 'accidentally' leave objects; I do that on purpose and call `res.disconnect` to forcefully close everything. Yet it still doesn't free the memory. There's not much one can do with the response body - it can be either read and destroyed or just destroyed, and `res.disconnect` should do this.
Re: High memory usage in vibe.d application
On Sunday, 1 July 2018 at 12:32:25 UTC, Jacob Shtokolov wrote: On Sunday, 1 July 2018 at 05:20:17 UTC, Anton Fediushin wrote: Now I tried it and indeed, it's vibe.d's fault. I'm not quite sure what causes it and if this problem is known; I'll look into that later and open an issue if it doesn't exist already.

Yes, please do this when you have time. That would be really helpful for further vibe.d improvement. I remember a pretty old (and closed) HTTP client bug here: https://github.com/vibe-d/vibe.d/issues/1321 So it might be somehow related to this one. Probably something wrong with the HTTP client or TLS/SSL-related logic. Your example code is very good and I was able to reproduce the same issue with the latest stable compiler, so I hope the guys will find the problem. Thanks!

I reduced the test case to _one_ line:

```
1.seconds.setTimer(() => "http://google.com".requestHTTP((scope req) {}, (scope res) {res.disconnect;}), true);
```

What happens is `res.disconnect` doesn't free all of the internal buffers, causing leakage. One way to avoid that is to call `res.dropBody`, but it isn't always wanted (just like in my example). I submitted an issue: https://github.com/vibe-d/vibe.d/issues/2179
Re: High memory usage in vibe.d application
On Saturday, 30 June 2018 at 22:06:50 UTC, Jacob Shtokolov wrote: On Friday, 29 June 2018 at 17:40:07 UTC, Anton Fediushin wrote: So, long story short:

- Usage of Mallocator instead of theAllocator made it a little bit better
- VibeManualMemoryManagement had no (or little) effect
- Manually calling GC.collect had no (or little) effect

You could try to call GC.minimize in tandem with GC.collect:

```
GC.collect();
GC.minimize();
```

to return all freed memory back to the OS.

With vibe.d this had no effect either.

Not sure that leakage of this type is possible, because if you're running your program on 64-bit Linux the probability of it is very low. AFAIK the GC is launched (almost) every time you allocate memory, and if it finds "dead" pointers, it definitely must clean them out. Vibe.d may also leak. Have you tried to run the same code without Vibe.d, say, using https://github.com/ikod/dlang-requests as an HTTP client?

Now I tried it and indeed, it's vibe.d's fault. I'm not quite sure what causes it and if this problem is known; I'll look into that later and open an issue if it doesn't exist already.
Re: High memory usage in vibe.d application
On Saturday, 30 June 2018 at 05:00:35 UTC, rikki cattermole wrote: On 30/06/2018 4:49 AM, Bauss wrote: I wouldn't really blame the GC. There is a higher chance you're just not using it how it's meant to be, especially since it looks like you're mixing manual memory management with GC memory. Let's be honest, I don't think it was meant to live in a container with 64mb of ram. I just don't think it is kicking in to collect. It doesn't, I'm experimenting with different GC configurations [1]. By default [2] `maxPoolSize` is set to 64MB, so maybe program gets killed by docker right before GC decides to collect. [1] https://dlang.org/spec/garbage.html#gc_config [2] https://github.com/dlang/druntime/blob/master/src/gc/config.d#L23
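For reference, the druntime GC parameters discussed above (including `maxPoolSize` from [1] and [2]) can be tuned without rebuilding druntime. A sketch of the two documented mechanisms; the values here are illustrative, not tuned recommendations:

```d
// 1. Baked into the program: declare rt_options, read by druntime before main.
extern(C) __gshared string[] rt_options = [
    "gcopt=minPoolSize:1 maxPoolSize:16 incPoolSize:1"
];

void main() {
    // 2. Alternatively, pass the same option at run time without recompiling:
    //    ./app --DRT-gcopt=maxPoolSize:16
    //
    // With a smaller maxPoolSize the GC should be forced to collect well
    // before a memory-limited container (e.g. a 64MB docker cap) kills
    // the process.
}
```

Whether shrinking the pool size actually prevents the OOM kill here is an open question; it only changes when collections are triggered, not whether vibe.d releases its buffers.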
Re: High memory usage in vibe.d application
On Friday, 29 June 2018 at 16:49:41 UTC, Bauss wrote: On Friday, 29 June 2018 at 16:07:00 UTC, Anton Fediushin wrote: On Friday, 29 June 2018 at 11:11:57 UTC, rikki cattermole wrote: On 29/06/2018 11:09 PM, Anton Fediushin wrote: It is GC's fault for sure, I built my program with profile-gc and it allocated a lot there. Question is, why doesn't it free this memory?

Probably doesn't know that it should deallocate so eagerly. A GC.collect(); call may help.

That's a good idea. The GC really needs to be kicked once in a while, because it did _nothing_ in 8 hours, even though my application is just a couple of timers - it isn't a hard task for CPU or memory, and there's plenty of time to collect some garbage. Now I finally understand why GC is not a great thing. I was writing apps utilizing GC for a long time and never had problems with it, but when it came down to this simple program it stabbed me in the back.

I wouldn't really blame the GC. There is a higher chance you're just not using it how it's meant to be used, especially since it looks like you're mixing manual memory management with GC memory.

I am not quite sure what I should blame now, because even if I use malloc for memory allocation, memory goes... somewhere? So, long story short:

- Usage of Mallocator instead of theAllocator made it a little bit better
- VibeManualMemoryManagement had no (or little) effect
- Manually calling GC.collect had no (or little) effect

It makes me think the error is somewhere else. I made a code snippet of my testing program: https://gitlab.com/snippets/1729304 There are some changes to it:

- It uses a different stream with a metaint of 32KB
- It calls nowPlaying() every second

Now I will take a break from this; dealing with this kind of nonsense annoys me.
Re: High memory usage in vibe.d application
On Friday, 29 June 2018 at 16:19:39 UTC, 12345swordy wrote: On Friday, 29 June 2018 at 16:07:00 UTC, Anton Fediushin wrote: Now I finally understand why GC is not a great thing. I was writing apps utilizing GC for a long time and never had problems with it, but when it came down to this simple program it stabbed me in the back.

Which languages did you write GC-utilizing apps in? Java? C#? You shouldn't treat D's GC the same as other languages' GCs. Alexander

Talking about D here. GC can be the best option for some languages and environments, but it doesn't fit D that well. Writing programs in D I always know where stack-allocated structs get deleted and such, but I have no idea what's going on with the GC. Does it collect anything at all? Why doesn't it collect this? How do I force it to collect this?
Re: High memory usage in vibe.d application
On Friday, 29 June 2018 at 11:11:57 UTC, rikki cattermole wrote: On 29/06/2018 11:09 PM, Anton Fediushin wrote: It is GC's fault for sure, I built my program with profile-gc and it allocated a lot there. Question is, why doesn't it free this memory?

Probably doesn't know that it should deallocate so eagerly. A GC.collect(); call may help.

That's a good idea. The GC really needs to be kicked once in a while, because it did _nothing_ in 8 hours, even though my application is just a couple of timers - it isn't a hard task for CPU or memory, and there's plenty of time to collect some garbage. Now I finally understand why GC is not a great thing. I was writing apps utilizing GC for a long time and never had problems with it, but when it came down to this simple program it stabbed me in the back.
Re: High memory usage in vibe.d application
On Friday, 29 June 2018 at 14:10:26 UTC, Daniel Kozak wrote: Have you tried using VibeManualMemoryManagement? https://github.com/TechEmpower/FrameworkBenchmarks/blob/3b24d0a21463edc536b30e2cea647fd425915401/frameworks/D/vibed/dub.json#L22

I'll try; not quite sure it'll help much.
Re: High memory usage in vibe.d application
On Friday, 29 June 2018 at 11:42:18 UTC, bauss wrote: On Friday, 29 June 2018 at 11:24:14 UTC, Anton Fediushin wrote: On Friday, 29 June 2018 at 11:01:41 UTC, Anton Fediushin wrote: On Friday, 29 June 2018 at 10:21:24 UTC, Radu wrote: On Friday, 29 June 2018 at 09:44:27 UTC, Anton Fediushin wrote: Almost forgot, there are two timers which call this function for two different streams. The value of `metaint` is 16000, which means that only 16KB of memory are allocated for the `buffer`; then it reads another byte which contains the length of the metadata / 16, and then it reads the metadata, which is 100-200 bytes long. This gives us... 16KiB per one nowPlaying() call. Why doesn't it free the memory?

Maybe use https://dlang.org/phobos/std_experimental_allocator_mallocator.html instead of theAllocator, as it defaults to GC.

Thanks, I'll try that. ... I will deploy that and see if it changes anything.

It did! Memory usage went down to 7MiB, yet it still grows slightly. I'll monitor whether it changes in a couple of hours, but it is much better. Thank you a lot, Radu. It turns out that theAllocator is tricky.

Again, you could use @nogc and see what memory is possibly allocated by the GC, and perhaps that way you can see what memory the GC is holding on to.

@nogc tells nothing new, just an error on every single line, because neither `res.bodyReader.read` nor Mallocator's functions are marked @nogc. Compiling with dmd's `-vgc` flag shows nothing but the last line. Non-GC memory should be freed right away, so there shouldn't be a leak from that. Using Mallocator instead of theAllocator improved the situation, but it still leaks for some reason. After 2 hours it went from 7MiB to 18MiB. I will compile it with profile-gc again and look for the possible cause; maybe I'll try valgrind too.
Re: High memory usage in vibe.d application
On Friday, 29 June 2018 at 11:01:41 UTC, Anton Fediushin wrote: On Friday, 29 June 2018 at 10:21:24 UTC, Radu wrote: On Friday, 29 June 2018 at 09:44:27 UTC, Anton Fediushin wrote: Almost forgot, there are two timers which call this function for two different streams. The value of `metaint` is 16000, which means that only 16KB of memory are allocated for the `buffer`; then it reads another byte which contains the length of the metadata / 16, and then it reads the metadata, which is 100-200 bytes long. This gives us... 16KiB per one nowPlaying() call. Why doesn't it free the memory?

Maybe use https://dlang.org/phobos/std_experimental_allocator_mallocator.html instead of theAllocator, as it defaults to GC.

Thanks, I'll try that. ... I will deploy that and see if it changes anything.

It did! Memory usage went down to 7MiB, yet it still grows slightly. I'll monitor whether it changes in a couple of hours, but it is much better. Thank you a lot, Radu. It turns out that theAllocator is tricky.
Re: High memory usage in vibe.d application
On Friday, 29 June 2018 at 10:31:14 UTC, bauss wrote: On Friday, 29 June 2018 at 10:21:24 UTC, Radu wrote: On Friday, 29 June 2018 at 09:44:27 UTC, Anton Fediushin wrote: Almost forgot, there are two timers which call this function for two different streams. The value of `metaint` is 16000, which means that only 16KB of memory are allocated for the `buffer`; then it reads another byte which contains the length of the metadata / 16, and then it reads the metadata, which is 100-200 bytes long. This gives us... 16KiB per one nowPlaying() call. Why doesn't it free the memory?

Maybe use https://dlang.org/phobos/std_experimental_allocator_mallocator.html instead of theAllocator, as it defaults to GC. Also, why do you .idup the array? .array already creates a new one on the heap.

This. Which kind of makes the usage of theAllocator useless.

Indeed; because it uses GC by default, my `theAllocator.dispose` did nothing, which basically made these two samples of code equal.

I was going to suggest using @nogc too, because it would most likely be GC-allocated memory that is taking up space.

It is GC's fault for sure, I built my program with profile-gc and it allocated a lot there. Question is, why doesn't it free this memory?

I run multiple vibe.d applications and I have no issues with memory (even with GC).

Me neither, my other vibe.d project uses 7.5MB and that's it.
Re: High memory usage in vibe.d application
On Friday, 29 June 2018 at 10:21:24 UTC, Radu wrote: On Friday, 29 June 2018 at 09:44:27 UTC, Anton Fediushin wrote: Almost forgot, there are two timers which call this function for two different streams. The value of `metaint` is 16000, which means that only 16KB of memory are allocated for the `buffer`; then it reads another byte which contains the length of the metadata / 16, and then it reads the metadata, which is 100-200 bytes long. This gives us... 16KiB per one nowPlaying() call. Why doesn't it free the memory?

Maybe use https://dlang.org/phobos/std_experimental_allocator_mallocator.html instead of theAllocator, as it defaults to GC.

Thanks, I'll try that.

Also, why do you .idup the array? .array already creates a new one on the heap.

It does, but it creates a char[] and I need a string. I changed the code a little bit to remove the unnecessary `map`, and `idup` too. Code now:

```
@safe string nowPlaying(string url) {
    import vibe.core.stream;
    import std.experimental.allocator;
    import std.experimental.allocator.mallocator;
    import std.string;

    string r;
    url.requestHTTP(
        (scope req) {
            req.headers.addField("Icy-MetaData", "1");
        },
        (scope res) {
            RCIAllocator a = allocatorObject(Mallocator.instance);
            auto metaint = res.headers.get("icy-metaint").to!int;

            auto buffer = a.makeArray!ubyte(metaint);
            scope(exit) a.dispose(buffer);
            res.bodyReader.read(buffer, IOMode.all);

            auto lengthBuffer = a.makeArray!ubyte(1);
            scope(exit) a.dispose(lengthBuffer);
            res.bodyReader.read(lengthBuffer, IOMode.all);

            auto dataBuffer = a.makeArray!ubyte(lengthBuffer[0] * 16);
            scope(exit) a.dispose(dataBuffer);
            res.bodyReader.read(dataBuffer, IOMode.all);

            r = dataBuffer.split('\'').drop(1).front.array.assumeUTF;
            res.disconnect;
        }
    );
    return r;
}
```

I will deploy that and see if it changes anything.
Re: High memory usage in vibe.d application
Almost forgot, there are two timers which call this function for two different streams. Value of `metaint` is 16000, which means that only 16KB of memory are allocated for the `buffer`, then it reads another byte which contains length of the metadata / 16 and then it reads the metadata which is 100-200 bytes long. This gives us... 16KiB per one nowPlaying() call. Why doesn't it free the memory?
High memory usage in vibe.d application
Hello, I'm looking for advice on what I am doing wrong. I have a vibe.d-based program which connects to an audio stream and gets the name of the song currently playing. For that, I wrote the following code:

```
@safe string nowPlaying(string url) {
    import vibe.core.stream;

    string r;
    url.requestHTTP(
        (scope req) {
            req.headers.addField("Icy-MetaData", "1");
        },
        (scope res) {
            auto metaint = res.headers.get("icy-metaint").to!int;
            auto buffer = new ubyte[metaint];
            res.bodyReader.read(buffer, IOMode.all);

            auto lengthBuff = new ubyte[1];
            res.bodyReader.read(lengthBuff, IOMode.all);

            auto dataBuffer = new ubyte[lengthBuff[0] * 16];
            res.bodyReader.read(dataBuffer, IOMode.all);

            r = dataBuffer.map!(a => a.to!char).split('\'').drop(1).front.array.idup;
        }
    );
    return r;
}
```

And I call it with a timer every 10 seconds:

```
string now_playing;
10.seconds.setTimer(() { now_playing = nowPlaying(stream); }, true);
```

This code worked fine for 8 or so hours and then got killed by docker because of a limit of 64MB of RAM. I executed the same code on my machine and saw the resident set size growing in real time.
Blaming the GC (as people usually do), I changed the code to use std.experimental.allocator instead:

```
@safe string nowPlaying(string url) {
    import vibe.core.stream;
    import std.experimental.allocator;

    string r;
    url.requestHTTP(
        (scope req) {
            req.headers.addField("Icy-MetaData", "1");
        },
        (scope res) {
            auto metaint = res.headers.get("icy-metaint").to!int;

            auto buffer = theAllocator.makeArray!ubyte(metaint);
            scope(exit) theAllocator.dispose(buffer);
            res.bodyReader.read(buffer, IOMode.all);

            auto lengthBuffer = theAllocator.makeArray!ubyte(1);
            scope(exit) theAllocator.dispose(lengthBuffer);
            res.bodyReader.read(lengthBuffer, IOMode.all);

            auto dataBuffer = theAllocator.makeArray!ubyte(lengthBuffer[0] * 16);
            scope(exit) theAllocator.dispose(dataBuffer);
            res.bodyReader.read(dataBuffer, IOMode.all);

            r = dataBuffer.map!(a => a.to!char).split('\'').drop(1).front.array.idup;
        }
    );
    return r;
}
```

And somehow, it got *worse*. Now my program gets killed every 3 hours. How is that possible? Am I missing something?

Some screenshots of CPU/memory usage:

1. These are metrics of the whole cluster; the program is started at around 8:00 and gets killed after 16:00: https://imgur.com/a/IhHvOt4
2. These are metrics of the updated program which uses std.experimental.allocator: https://imgur.com/a/XBchJ7C
Re: Array Printing
On Tuesday, 12 September 2017 at 13:15:01 UTC, Vino.B wrote: Hi, Sorry, it didn't work, the genrated out is as below Oops, sorry. It should look like this: writefln("%-(%s\n%)", array);
Re: Array Printing
On Tuesday, 12 September 2017 at 06:29:53 UTC, Vino.B wrote: Hi All, Request your help in printing the below array output as per the below required output.

Array output:

```
["C:\\Temp\\TEST2\\BACKUP\\dir1", "34", "C:\\Temp\\TEST2\\BACKUP\\dir2", "36", "C:\\Temp\\TEST3\\BACKUP\\dir1", "69"]
["C:\\Temp\\TEST2\\PROD_TEAM\\dir1", "34", "C:\\Temp\\TEST2\\PROD_TEAM\\DND1", "34"]
["C:\\Temp\\TEST2\\TEAM\\DND1", "34"]
```

Required output:

```
C:\Temp\TEST2\BACKUP\dir1 34
C:\Temp\TEST2\BACKUP\dir2 36
C:\Temp\TEST3\BACKUP\dir1 69
C:\Temp\TEST2\PROD_TEAM\dir1 34
C:\Temp\TEST2\PROD_TEAM\DND1 34
C:\Temp\TEST2\TEAM\DND1 34
```

From, Vino.B

Try this: `writefln("%(%s\n%)", array);` See std.format's documentation for more.
Re: Dub documentation with an initial ddoc file
On Sunday, 3 September 2017 at 23:14:15 UTC, Conor O'Brien wrote: I've been trying to figure out how to generate documentation for my project using dub. I have found this link[1], which told me how I could use dub to generate docs:

```
dub build --build=docs
```

However, I wish to have a set of macros that are present in every documentation file, which would define how the resultant HTML document is rendered. I tried:

```
dub build --build=docs html.ddoc
```

But I got the following error: "Expected one or zero arguments. Run "dub build -h" for more information about the "build" command." How might I include `html.ddoc` with every file that has documentation?

Add this to your dub.json:

```
"configurations": [
    {
        "name": "docs",
        "buildOptions": ["syntaxOnly"],
        "dflags": ["-Dddocs"],
        "sourceFiles": ["html.ddoc"]
    }
]
```

Or, if you use dub.sdl:

```
configuration "docs" {
    buildOptions "syntaxOnly"
    dflags "-Dddocs"
    sourceFiles "html.ddoc"
}
```

This adds a new configuration named "docs", which can be used like this:

```
$ dub -c docs
```
Re: Questions about dmd source
On Sunday, 30 July 2017 at 06:18:16 UTC, Francis Nixon wrote: I have two completely unrelated questions about the dmd source code. 2. I've noticed there are some rather long methods in the dmd source, involving more than one goto; parse.d is particularly bad. Is there a reason for this/is it being fixed?

It is impossible to write a short parser, and goto statements are quite useful in such code. Also, there is no need to rewrite anything unless it is slow or buggy, and parse.d probably isn't.
Re: Idiomatic way of writing nested loops?
On Tuesday, 18 July 2017 at 03:36:04 UTC, Nicholas Wilson wrote: With regards to parallel, only use it on the outermost loop. Assuming you have more items in the outermost loop than you do threads parallelising more than one loop won't net you any speed. Thank you! Yes, `parallel` runs only 4 threads on my machine, so there is no reason to use it in nested loops.
Re: Idiomatic way of writing nested loops?
On Monday, 17 July 2017 at 11:32:45 UTC, Sebastiaan Koppe wrote: On Monday, 17 July 2017 at 11:07:35 UTC, Anton Fediushin wrote: Hello! What is the best way of rewriting this code in idiomatic D manner?

https://dlang.org/phobos/std_algorithm_setops.html#.cartesianProduct

Thank you! I knew it was in the library! So, `parallel` will work just fine with this function, won't it?
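A sketch of how the two pieces of advice combine: `cartesianProduct` flattens the nesting into one range of tuples, and a single `parallel` on that flattened range replaces per-loop parallelism (the `.array` call is an assumption here, to give `parallel` a random-access range to slice into work units):

```d
import std.algorithm.setops : cartesianProduct;
import std.array : array;
import std.parallelism : parallel;
import std.stdio : writeln;

void main() {
    auto as = ["foo", "bar"];
    auto bs = ["baz", "foz", "bof"];
    auto cs = ["FOO", "BAR"];

    // One flat range of tuples replaces the three nested loops, so a
    // single `parallel` covers all 2*3*2 = 12 combinations without
    // nesting tasks inside tasks.
    foreach (t; cartesianProduct(as, bs, cs).array.parallel) {
        writeln(t[0], " ", t[1], " ", t[2]);
    }
}
```

Adding or removing a "nested loop" then means adding or removing one argument to `cartesianProduct`, and the thread count stays bounded by the task pool regardless of how many dimensions there are.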
Idiomatic way of writing nested loops?
Hello! What is the best way of rewriting this code in idiomatic D manner?

--
foreach(a; ["foo", "bar"]) {
    foreach(b; ["baz", "foz", "bof"]) {
        foreach(c; ["FOO", "BAR"]) {
            // Some operations on a, b and c
        }
    }
}
--

Every array has at least 1 element, and adding/removing new "nested loops" should be as easy as possible. Also, I have a question about running this in parallel: if I want to use nested loops with `parallel` from `std.parallelism`, should I add `parallel` to every loop like this?

--
foreach(a; ["foo", "bar"].parallel) {
    foreach(b; ["baz", "foz", "bof"].parallel) {
        foreach(c; ["FOO", "BAR"].parallel) {
            // Some operations on a, b and c
        }
    }
}
--

I am worried about running thousands of threads, because in this case the first `parallel` runs 2 tasks, every task runs 3 tasks, and every task run inside a task runs 2 more tasks. So, how do I write this in an idiomatic D manner and run it _if possible_ in parallel?
Re: sorting a string
On Friday, 14 July 2017 at 17:23:41 UTC, Steven Schveighoffer wrote: Don't do this, because it's not what you think. It's not actually calling std.algorithm.sort, but the built-in array sort property. This will be going away soon.

This sucks. I knew that `.sort` would be removed, but I thought it wouldn't break any code.

With 2.075, it won't compile even without the parentheses, because a char[] is not an array according to std.algorithm...

But why? This should be true for `char[]`, shouldn't it?

```
if ((ss == SwapStrategy.unstable &&
     (hasSwappableElements!Range || hasAssignableElements!Range) ||
     ss != SwapStrategy.unstable && hasAssignableElements!Range) &&
    isRandomAccessRange!Range && hasSlicing!Range && hasLength!Range)
```

(It's from https://dlang.org/phobos/std_algorithm_sorting.html#sort)
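The likely reason the constraint fails: std.algorithm auto-decodes `char[]` into a range of `dchar`, which is neither random-access nor sliceable, so `isRandomAccessRange` and `hasSlicing` are false for it. A sketch of two common workarounds that bypass auto-decoding (fine for ASCII-only input):

```d
import std.algorithm.sorting : sort;
import std.string : representation;
import std.utf : byCodeUnit;

void main() {
    // 1. Sort the raw bytes: representation() views a char[] as ubyte[],
    //    which std.algorithm happily treats as a random-access range.
    char[] s = "cabA".dup;
    s.representation.sort();
    assert(s == "Aabc");

    // 2. byCodeUnit wraps the char[] in a random-access char range,
    //    likewise skipping auto-decoding.
    char[] t = "cabA".dup;
    t.byCodeUnit.sort();
    assert(t == "Aabc");
}
```

Both mutate the original buffer in place; for non-ASCII text, sorting code units would scramble multi-byte UTF-8 sequences, so these are byte-level tools only.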
Re: std.container.array of struct inside a struct fails
On Friday, 14 July 2017 at 16:42:59 UTC, drug wrote: It's because Array!T is a value type and needs the element type's size to define itself, so you get the expected forward-reference error. But T[] is a reference type whose size is known in advance: it doesn't depend on T and is always pointer.sizeof + length.sizeof, i.e. 16 bytes on 64-bit architectures, so in that case the issue doesn't arise. It's not a bug at all. Thank you!
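A small sketch of the point above: a slice member compiles even though the struct refers to itself, because a slice's size is fixed regardless of the element type:

```d
struct Test
{
    Test[] t; // fine: a slice is just (length, pointer), so its size
              // is known without knowing Test's full layout
}

void main()
{
    Test a;
    a.t ~= Test(); // append a nested Test
    assert(a.t.length == 1);
    // The struct is exactly one slice: length + pointer.
    assert(Test.sizeof == size_t.sizeof + (void*).sizeof);
}
```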
Re: sorting a string
On Friday, 14 July 2017 at 15:56:49 UTC, Namal wrote: Thx Steve! By sorting a string I mean a function or series of functions that sorts a string by ASCII code, "cabA" to "Aabc" for instance. import std.algorithm : sort; import std.stdio : writeln; "cabA".dup.sort.writeln; `dup` is used because a string cannot be modified in place, so a sorted copy is produced instead.
std.container.array of struct inside a struct fails
This code:
-
import std.container.array;

struct Test {
    Array!Test t;
}
-
Fails with an error:
-
/usr/include/dlang/dmd/std/traits.d(2404): Error: struct arrayissue.Test no size because of forward reference
/usr/include/dlang/dmd/std/traits.d(3462): Error: template instance std.traits.FieldTypeTuple!(Test) error instantiating
/usr/include/dlang/dmd/std/container/array.d(276): instantiated from here: hasElaborateDestructor!(Test)
arrayissue.d(4): instantiated from here: Array!(Test)
/usr/include/dlang/dmd/std/container/array.d(280): Error: template instance std.traits.hasIndirections!(Test) error instantiating
arrayissue.d(4): instantiated from here: Array!(Test)
/usr/include/dlang/dmd/std/traits.d(2613): Error: template instance std.traits.RepresentationTypeTuple!(Test) error instantiating
/usr/include/dlang/dmd/std/traits.d(2934): instantiated from here: hasRawAliasing!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1362): instantiated from here: hasAliasing!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1206): instantiated from here: moveEmplace!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1200): ... (3 instantiations, -v to show) ...
/usr/include/dlang/dmd/std/container/array.d(487): instantiated from here: RangeT!(Array!(Test))
arrayissue.d(4): instantiated from here: Array!(Test)
/usr/include/dlang/dmd/std/traits.d(2934): Error: template instance std.traits.hasObjects!(Test) error instantiating
/usr/include/dlang/dmd/std/algorithm/mutation.d(1362): instantiated from here: hasAliasing!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1206): instantiated from here: moveEmplace!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1200): instantiated from here: moveImpl!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1162): ... (2 instantiations, -v to show) ...
/usr/include/dlang/dmd/std/container/array.d(487): instantiated from here: RangeT!(Array!(Test))
arrayissue.d(4): instantiated from here: Array!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1372): Error: template instance std.traits.hasElaborateAssign!(Test) error instantiating
/usr/include/dlang/dmd/std/algorithm/mutation.d(1206): instantiated from here: moveEmplace!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1200): instantiated from here: moveImpl!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1162): instantiated from here: trustedMoveImpl!(Test)
/usr/include/dlang/dmd/std/container/array.d(148): ... (1 instantiations, -v to show) ...
/usr/include/dlang/dmd/std/container/array.d(487): instantiated from here: RangeT!(Array!(Test))
arrayissue.d(4): instantiated from here: Array!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1379): Error: template instance std.traits.hasElaborateCopyConstructor!(Test) error instantiating
/usr/include/dlang/dmd/std/algorithm/mutation.d(1206): instantiated from here: moveEmplace!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1200): instantiated from here: moveImpl!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1162): instantiated from here: trustedMoveImpl!(Test)
/usr/include/dlang/dmd/std/container/array.d(148): ... (1 instantiations, -v to show) ...
/usr/include/dlang/dmd/std/container/array.d(487): instantiated from here: RangeT!(Array!(Test))
arrayissue.d(4): instantiated from here: Array!(Test)
-
But if I use `Test[] t;` instead, everything is fine.
Also, the same code with `class` instead of `struct` works fine, and using `union` produces this error message:
-
/usr/include/dlang/dmd/std/traits.d(2404): Error: union arrayissue.Test no size because of forward reference
/usr/include/dlang/dmd/std/traits.d(3025): Error: template instance std.traits.FieldTypeTuple!(Test) error instantiating
/usr/include/dlang/dmd/std/container/array.d(280): instantiated from here: hasIndirections!(Test)
arrayissue.d(4): instantiated from here: Array!(Test)
/usr/include/dlang/dmd/std/traits.d(2613): Error: template instance std.traits.RepresentationTypeTuple!(Test) error instantiating
/usr/include/dlang/dmd/std/traits.d(2934): instantiated from here: hasRawAliasing!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1362): instantiated from here: hasAliasing!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1206): instantiated from here: moveEmplace!(Test)
/usr/include/dlang/dmd/std/algorithm/mutation.d(1200): ... (3 instantiations, -v to show) ...
/usr/include/dlang/dmd/std/container/array.d(487): instantiated from here: RangeT!(Array!(Test))
arrayissue.d(4): instantiated from here: Array!(Test)
-
So, is this a bug that should be fixed, or can it not be implemented at all? If it's the latter, it should be documented.
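A sketch of why the `class` version compiles: a class variable is only a reference, so `Test`'s size is pointer-sized and known before the full class layout is resolved. This assumes `Array` of class references behaves as with any other fixed-size element type:

```d
import std.container.array : Array;

class Test
{
    Array!Test t; // fine: each element is a pointer-sized class reference,
                  // so no forward-reference problem arises
}

void main()
{
    auto a = new Test;
    a.t.insertBack(new Test);
    assert(a.t.length == 1);
}
```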