On 10/09/2010 19:20, Walter Bright wrote:
http://www.reddit.com/r/programming/comments/dc6ir/overlooked_essentials_for_optimizing_code/


Generally, an interesting article. However, there are a few points I would like to counter:

"Nope, it isn't avoiding premature optimization. It isn't replacing bubble sort with quicksort (i.e. algorithmic improvements). It's not what language used, nor is it how good the compiler is. It isn't writing i<<2 instead of i*4.

It is:

   1. Using a profiler
   2. Looking at the assembly code being executed
"

There is a bit of confusion here. The first items (algorithmic improvements, a better language or compiler, "i << 2" micro-optimizations, etc.) are not of the same nature as items 1 & 2 (using a profiler and looking at the assembly). The former are techniques for optimizing particular code; the latter are strategies for figuring out *which* code is worth optimizing. It doesn't make sense to oppose the two, because the second group actually requires the first to be useful. I mean, no code has ever improved in performance *strictly just* by running a profiler or looking at the assembly being executed. :)


But more importantly, 1 and 2 (especially 1: "using a profiler") are crucial elements of "avoiding premature optimization". I mean, I learnt "avoid premature optimization" as meaning "don't optimize code until you *know* that code is a bottleneck", and something like 80% of the books/articles/college courses that taught about premature optimization also explicitly mentioned that running a profiler is by far the best way to determine which code is the bottleneck, and that the alternative, guessing, is bad because it is often wrong.
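To make that concrete, here is a minimal sketch (Python, with made-up function names of my own) of how a profiler run replaces guessing: you run the program under the profiler and let the cumulative-time ranking tell you where the bottleneck actually is.

```python
import cProfile
import io
import pstats

def slow_sum(n):
    # Deliberately quadratic work, standing in for the real bottleneck.
    total = 0
    for i in range(n):
        for j in range(n):
            total += i * j
    return total

def fast_path(n):
    # Cheap work that intuition might wrongly blame.
    return sum(range(n))

def main():
    slow_sum(300)
    fast_path(300)

# Profile one run of main() and dump the stats sorted by cumulative time.
profiler = cProfile.Profile()
profiler.enable()
main()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats()
report = stream.getvalue()
# slow_sum should rank above fast_path in the cumulative-time listing,
# which is exactly the "know, don't guess" part of the advice.
```

The point of the sketch is not the profiler API itself but the workflow: measure first, and only then apply the optimization techniques from the first list to the functions the report actually names.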



"Though that is undeniably true, there are two caveats that don't get taught in schools. First and most importantly, choosing the best algorithm for a part of the program that has no participation to the performance profile has a negative effect on optimization because it wastes your time that could be better invested in making actual progress, and diverts attention from the parts that matter."

I don't think it's true that this "doesn't get taught in schools", at least in CS university degrees. In my degree this was explained in detail in at least two core curriculum courses, and a few more times in other non-core (optional/specialization) courses. Also, as I mentioned above, the majority of other learning material (articles, books) that talks about optimization does mention the importance of profiling.

"Second, algorithms' performance always varies with the statistics of the data they operate on. Even bubble sort, the butt of all jokes, is still the best on almost-sorted data that has only a few unordered items. So worrying about using good algorithms without measuring where they matter is a waste of time - your's and computer's."

Also, the notion that an intrinsic part of algorithm design is understanding what kind of data you are going to work on was also mentioned in one of the core curriculum courses in my degree (although in less detail than point 1 above).
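As a small illustration of the almost-sorted case the article brings up, here is a sketch (names my own) of bubble sort with the standard early-exit: stop when a full pass makes no swaps. On almost-sorted input it finishes in a couple of passes, while on reversed input it degrades to roughly n passes.

```python
def bubble_sort(items):
    """Bubble sort with early exit; returns (sorted list, pass count)."""
    items = list(items)
    n = len(items)
    passes = 0
    while True:
        passes += 1
        swapped = False
        for i in range(n - 1):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:
            break  # a swap-free pass means the list is sorted
    return items, passes

# Almost sorted: one adjacent pair out of order.
nearly = list(range(100))
nearly[10], nearly[11] = nearly[11], nearly[10]
nearly_sorted, nearly_passes = bubble_sort(nearly)   # 2 passes: fix, then confirm

# Worst case: fully reversed input takes n passes.
reversed_sorted, reversed_passes = bubble_sort(list(range(20))[::-1])
```

This is the statistics-of-the-data point in miniature: the same algorithm is near-linear on one input distribution and quadratic on another, which is why measuring on representative data matters.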

I don't mean to offend anyone, but if your CS degree (at least from the last decade or so) doesn't teach points 1 and 2 above as part of the core curriculum, then it's a pretty crappy CS degree. The same is probably also true for other related degrees (*-engineering, maths), at least with regard to point 1.

--
Bruno Medeiros - Software Engineer
