We wouldn't _need_ to change the algorithm, but if we do generate constants from, say, a flat file (or something more sophisticated, like the Windows header files), programs are likely to use far more constants. A better algorithm won't change a program's run time, but it will improve startup times rather than making them worse as large numbers of constants are added.
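
Just to illustrate what I mean, here's a rough sketch (in Python only to keep it short - the file format, the comment syntax and the names are all my own assumptions, not anything proposed in this thread) of loading a flat file of constants into a hash table in a single pass, so that lookup cost stays flat however many constants are added:

    # Hypothetical sketch: read "NAME = value" lines into a dict so that
    # lookups stay O(1) even with thousands of header-derived constants.
    def load_constants(path):
        constants = {}
        with open(path) as f:
            for line in f:
                line = line.split(';', 1)[0].strip()  # drop comments/blanks
                if not line:
                    continue
                name, _, value = line.partition('=')
                constants[name.strip()] = int(value.strip(), 0)  # 0x... works too
        return constants

    # One pass at startup builds the table; every later lookup is constant time.
    consts = load_constants('winuser_constants.txt')
    print(consts.get('WM_CLOSE'))

The point being that the one-off cost at startup is a single pass over the file, and after that adding a large header-derived set of constants doesn't make each lookup any slower.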

It should be changed, but the focus shouldn't be on that per se so much as on the bigger picture. If a solution to constants is to be found, it'll be a compromise between performance, memory and ease of use - and the devil will be in the details :)

In my case, my main app comes in at 5.32 MB as the final exe, and once it's started it uses 50 MB - before it's even done anything! Increasing the size of GUI.dll, or adding a constants dll of, say, 200 KB would be nothing for this application. Would the same be true for a simple script? And even if we're talking about a much larger increase in size, does it really matter if the solution is efficient and works well?

Personally, I would always sacrifice memory for a tangible increase in performance - but I know other people who say otherwise :)

Cheers,

jez.
