On 20/10/2010 16:17, dsimcha wrote:
== Quote from Bruno Medeiros (brunodomedeiros+s...@com.gmail)'s article
Also, the notion that an intrinsic part of algorithm design is to
understand what kind of data you are going to work with was also
mentioned in one of the core curriculum courses in my degree (although
with less detail than in case 1, above).
I don't mean to offend anyone, but if your CS degree (at least for the
last decade or so) doesn't teach points 1 and 2 above as part of the
core curriculum, then it's a pretty crappy CS degree. The same is
probably also true for other related degrees (*-engineering, maths), at
least with regard to point 1.

You have a point to some degree, but I've noticed two related themes from my
experience with my Ph.D. research and with the D community.  Both of these are
things Walter seems to understand and exploit exceptionally well.

1.  There's a big difference between "it was covered" and "most people
actually understood it".  Of course the first is a necessary condition for
the second, but it's not a sufficient condition.  Any CS/software
engineering program worth its salt will recommend using a profiler, but the
question is: does it sink in for most students, or was it mentioned in maybe
one lecture on a Friday morning when half the class was hung over and the
other half was still sleeping?


If a student is not paying attention, didn't go to class, or didn't study a topic, that's the student's fault, and there is little that the university can (or even should) do to change that. But that's beside the point. I wasn't arguing that a lot of students or professionals understand the issues around premature optimization and profilers. I was just arguing against "[it] doesn't get taught in schools". I do agree, however, that a topic can be covered with varying degrees of detail and importance.

As for premature optimization and profiling, it was well covered in my degree. I clearly remember it being mentioned in the very first "Introduction to Programming" course (based on "Structure and Interpretation of Computer Programs", the MIT one), where the Pareto principle as applied to programs was explained (aka the 90/10 law). The topic was studied again during the "Software Engineering" course, as part of the larger key topic of how best to manage and optimize the time of the programmers and team members in a real-world project. It was also mentioned (in the context of Agile methodologies) that even if you know for sure that the code you are writing is indeed part of the 10% bottleneck, you likely should not optimize it yet, as changing requirements may make that code unused, or no longer part of the bottleneck.
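To make the 90/10 point concrete: the standard advice is to measure before optimizing, because the hot spot is rarely where you guess. A minimal sketch of that workflow in Python's built-in cProfile (the `hot`/`cold` functions are made-up stand-ins, not from the original discussion):

```python
import cProfile
import io
import pstats

def hot(n):
    # Quadratic work: this is where nearly all the time goes.
    total = 0
    for i in range(n):
        for j in range(n):
            total += i * j
    return total

def cold(n):
    # Linear work: cheap by comparison, not worth optimizing first.
    return sum(range(n))

def main():
    hot(300)
    cold(300)

# Profile the whole run, then print the top entries by cumulative time.
profiler = cProfile.Profile()
profiler.enable()
main()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()
print(report)
```

The report lists both functions with call counts and timings; `hot` dominates, which is exactly the 10% of the code worth attention.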

As for the importance of analyzing the inputs of algorithms, that one was also mentioned, but definitely not covered as well. It came up in the context of hashing keys, but mostly only in passing; many of the nuances of the topic were not discussed. It was only recently that I learned more about this, as I watched the MIT OCW lectures of the "Introduction to Algorithms" course, as well as reading part of the respective book. For example, the issue of deterministic hashing vs. randomized hashing: if you have a deterministic hash function which does distribute your input keys well in the average case, that may actually not be good enough! Because if people have access to how your hash function works, an adversary can force the worst case to happen! (Universal, i.e. randomized, hashing is the solution to that.) I guess that with the rise of the Internet, scenarios where this matters are more common.
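A toy sketch of that attack and defense (the hash functions below are illustrative inventions, not any real library's): a fixed, publicly known hash lets an adversary craft keys that all land in one bucket, degrading a hash table to a linked list; picking hash parameters at random at runtime denies the adversary that knowledge.

```python
import random

M = 64  # number of buckets (illustrative)

def naive_hash(key):
    # Deterministic hash: sum of character codes mod M.
    # Anyone who knows this can precompute colliding keys.
    return sum(map(ord, key)) % M

# Adversarial input: permutations of the same characters have the
# same character sum, so they all collide into a single bucket.
adversarial = ["abc", "acb", "bac", "bca", "cab", "cba"]
buckets = {naive_hash(k) for k in adversarial}
print(len(buckets))  # 1 bucket: worst-case O(n) lookups forced

# Randomized hashing in the spirit of universal hashing: the
# multiplier and offset are drawn at runtime, so the adversary
# cannot precompute a colliding key set.
P = 2_147_483_647  # a large prime modulus
a = random.randrange(1, P)
b = random.randrange(0, P)

def randomized_hash(key):
    h = 0
    for ch in key:
        h = (h * a + ord(ch) + b) % P
    return h % M

spread = {randomized_hash(k) for k in adversarial}
print(len(spread))  # almost certainly spread over several buckets
```

This is the same reason modern language runtimes (Python, for one, since hash randomization was enabled by default) seed their string hashes per process: internet-facing services hash attacker-supplied keys all day.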

2.  "It's been done before" is not a good reason not to do something if
you're going to up-level it.  If something's been done before at a
proof-of-concept level, it's often rewarding to tweak/improve it to the point
where it's practical.  If it's practical, it's often rewarding to try to
tweak/improve it so it's usable by mere mortals and is well integrated into
whatever other stuff might be related.  If it's usable by mere mortals, it's
often rewarding to figure out what improvements need to be made to break
barriers to widespread adoption.

??
I have no idea what you mean by this...

--
Bruno Medeiros - Software Engineer
