On Mon, 13 May 2013, Greg Ewing wrote:

Wayne Werner wrote:
On Fri, 10 May 2013, Gregory Ewing wrote:

  f = open("myfile.dat")
  f.close()
  data = f.read()

To clarify - you don't want a class that has functions that need to be called in a certain order with *valid input* in order to not crash.

Which is exactly what does happen - a ValueError is raised, because you're passing self (a closed file) into the file.read() function, and that input is invalid.
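A minimal sketch of that behaviour (the file name comes from the example above; the initial write is only there to make the script self-contained):

```python
# Create the file first so open() succeeds, then reproduce the bug:
# calling read() after close() raises ValueError, not a mystery crash.
with open("myfile.dat", "w") as setup:
    setup.write("some data")

f = open("myfile.dat")
f.close()
try:
    data = f.read()
except ValueError as e:
    print("ValueError:", e)
```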

The same argument can be applied to:

  foo = Foo()
  foo.do_something()
  foo.enable() # should have done this first

You're passing an invalid input to Foo.do_something,
namely a Foo that hasn't been enabled yet.
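For instance, a hypothetical Foo along those lines (the names are mine, purely illustrative) that rejects the invalid input explicitly instead of crashing somewhere deeper:

```python
class Foo:
    def __init__(self):
        # the constructor leaves the object in a known, stable state
        self._enabled = False

    def enable(self):
        self._enabled = True

    def do_something(self):
        # an un-enabled Foo is invalid input here, so say so up front
        if not self._enabled:
            raise ValueError("Foo must be enabled before do_something()")
        return "done"

foo = Foo()
try:
    foo.do_something()  # should have called enable() first
except ValueError as e:
    print(e)

foo.enable()
print(foo.do_something())  # done
```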

That is the crux of the argument - as designer of the class *you* need to ensure that when your constructor finishes, your class is in a stable state, and that every other state transition (with valid input) leaves it in a stable state.


If anything, the stronger argument is that `file.close()` is not a well designed function because it leaves your object in an unstable state.

Which I would be inclined to agree with - but I couldn't tell you what would make it better, because the answer is the best one you can get in computer science: it depends.


The reason that it depends, is because it depends on what you want to do. Do you want a program that seems purely functional? Do you want a program that's easy to maintain? Do you want a program that more accurately models the "real world"?

Personally, I think the file object API in Python is about as good as it can get - but that's because it's working with "physical" things (i.e. files - bits on a platter, or a flash/SSD drive...) which necessarily have a temporal nature. And it's much less bad to blow up on a call to `read` than it would be to remove the `read` method and die with an AttributeError when the underlying file is in a closed state.
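For what it's worth, the stdlib already offers a way to make the close ordering impossible to get wrong - the with statement (a standard idiom, not something proposed in the post above; the initial write just makes the script self-contained):

```python
# set up the file from the example so the script runs on its own
with open("myfile.dat", "w") as setup:
    setup.write("some data")

# the with block closes the file automatically, even if an exception
# escapes, so there is no window in which to read after close
with open("myfile.dat") as f:
    data = f.read()

print(f.closed)  # True once the block has exited
```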


At least in my opinion ;)
-W
--
http://mail.python.org/mailman/listinfo/python-list
