Yeah, I support Kent. Brian's code is obviously C style: define a variable, give it an initial value, then use it, modify its value, and so on. If you choose Python, you should adapt to it: a variable needn't be declared up front before being used!

    import datetime

    def is_leap_year(year):
        is_leap = True
        try:
            datetime.date(year, 2, 29)
        except ValueError:
            is_leap = False
        return is_leap
I would write:

    def is_leap_year(year):
        try:
            datetime.date(year, 2, 29)
            return True
        except ValueError:
            return False
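For what it's worth, both versions give the same answers, and the century years are the usual trap. A quick sanity check at the interactive prompt (assuming one of the functions above is already defined):

    >>> is_leap_year(2000)   # century year divisible by 400: leap
    True
    >>> is_leap_year(1900)   # century year not divisible by 400: not leap
    False
    >>> is_leap_year(2004)   # ordinary year divisible by 4: leap
    True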
I far prefer Brian's version, because it lets me set a single breakpoint while I'm debugging, and I can look at the return value before it's returned. With multiple returns I have to set n breakpoints (or, usually, n-1, because I've overlooked the one that's actually being executed) and look at what's being returned on each line. (Yes, I do the same thing in C and C++, but I originally started doing it in Java, and after a few debugging sessions it makes a lot of sense.) Having only one return point from a function is a long-standing convention that is supposed to make programs easier to read/debug/optimize/prove correct.
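To make the breakpoint argument concrete, here's a minimal sketch (my own illustration, not code from the thread) using pdb.set_trace() in place of a debugger breakpoint on the line before the return. Every call path funnels through the one exit, and the eventual return value is already sitting in a local you can inspect:

    import datetime
    import pdb

    def is_leap_year(year):
        is_leap = True
        try:
            datetime.date(year, 2, 29)
        except ValueError:
            is_leap = False
        # One exit point means one place to stop; at the (Pdb) prompt,
        # "p is_leap" shows what is about to be returned, whichever
        # branch ran above.
        pdb.set_trace()
        return is_leap

    print(is_leap_year(1900))   # stops once, with is_leap == False in scope
    print(is_leap_year(2000))   # stops again, with is_leap == True in scope

With the early-return version you'd need a breakpoint on each return statement to get the same coverage.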
Later, Blake.