On Thu, Oct 25, 2012 at 05:34:05AM +0200, Mehrdad wrote:
> Funny, I was actually building a (GLR) parser, too.
>
> Keeping track of duplicate item sets was what tripped me up.
Oh. You should've said so earlier -- for such specific applications, you don't _need_ a fully generalized solution to the AA hash problem. You can just implement your own .toHash methods on the structs/objects your parser uses, and be done with it. No need to try to accommodate every single type D has or can have -- that's the kind of thing Phobos developers have to worry about, not people who have a specific application in mind.

I've learned from hard experience that premature generalization is just as evil as premature optimization. (Believe me, I'm a sucker for completely general solutions too -- I like killing ants with nuclear warheads, proverbially speaking, just for the feeling of confidence that it would've also worked in the hypothetical scenario of levelling an entire city.) You usually end up completely frustrated that the system/language you're using just doesn't have that one last bit of homogeneity for your generalization to fully work, and eventually the project never gets off the ground.

But in many cases, I've learned, this is all needless pain. Write the case-specific code first, and then refactor and generalize it afterwards. Works much better, I find, and you have a tangible product to speak of from the get-go, instead of spending countless hours writing a generic framework that eventually never gets used. Plus, when you actually have a *working* codebase, exactly which generalizations will work, and which are just castles in the air, tends to be a lot more obvious than when you're just sketching the product on the drawing board.

Just my slightly more than $0.02. :)


T

-- 
English has the lovely word "defenestrate", meaning "to execute by throwing someone out a window", or more recently "to remove Windows from a computer and replace it with something useful". :-) -- John Cowan
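P.S. In case it helps, here's a minimal sketch of the kind of case-specific .toHash I mean -- assuming your item sets are stored in a struct wrapping an array of encoded items (the names and the encoding here are purely illustrative, not from your code):

```d
import std.stdio;

// Hypothetical representation of an LR item set; "items" could be
// (rule, dot-position) pairs packed into ints -- whatever your parser uses.
struct ItemSet
{
    int[] items;

    // Custom hash so ItemSet works as an AA key by value, instead of
    // relying on the default hashing of the struct's fields.
    size_t toHash() const nothrow @safe
    {
        size_t h = 0;
        foreach (i; items)
            h = h * 31 + cast(size_t) i;  // simple polynomial hash
        return h;
    }

    // AA keys also need opEquals consistent with toHash: equal sets
    // must hash equally.
    bool opEquals(const ItemSet rhs) const nothrow @safe
    {
        return items == rhs.items;
    }
}

void main()
{
    // Map each distinct item set to its state number; a freshly
    // constructed duplicate is found by value, which is exactly the
    // "have I seen this item set before?" check a GLR/LALR builder needs.
    int[ItemSet] stateOf;
    stateOf[ItemSet([1, 2, 3])] = 0;
    assert(ItemSet([1, 2, 3]) in stateOf);
    writeln("duplicate item set maps to state ", stateOf[ItemSet([1, 2, 3])]);
}
```

No generality needed beyond the one key type your parser actually uses.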