Knut Anders Hatlen wrote:
Kristian Waagan <[EMAIL PROTECTED]> writes:

db/derby/code/trunk/java/testing/org/apache/derbyTesting/functionTests/tests/jdbc4/CallableStatementTest.java
Sun May 27 11:30:45 2007
@@ -71,9 +71,9 @@
      */
     protected void tearDown()
         throws Exception {
-        if (cStmt != null && !cStmt.isClosed()) {
-            cStmt.close();
-        }
+
+        cStmt.close();
+        cStmt = null;
Just out of curiosity:
At first look the existing code looks sound - calling close only if
the statement is not null and not already closed.
Why is that inadequate?
A bit quick on the send button...
I understand why you nullify the reference, what I was wondering about
was if there was a reason for changing the if-block.
As far as I can tell, it removes complexity, but increases the chance
of an NPE. Unless there is some other reason I don't see?

Since none of the test cases in that test closes cStmt or sets it to
null, I didn't see any reason to check for it. As to the increased
chance of NPE, since the variable is initialized in setUp() and
shouldn't be changed afterwards, a NPE would (correctly) have been
thrown much earlier if prepareCall() had returned null. It seems like
it's the common practice in the tearDown() methods just to call close
with no checks, but feel free to change it back.

I was not thinking about prepareCall() returning null. I think the if was added because some people liked to do the cleanup themselves, nullifying the reference in the test method. That's the NPE I was thinking about, but these things are probably best handled by code reviews (i.e. don't nullify references yourself).
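
The NPE scenario above can be sketched as follows. This is an illustrative stand-alone example, not the actual Derby test code: FakeStatement is a hypothetical stand-in for java.sql.CallableStatement so the behavior can be shown without a database connection, and the method names are made up for the sketch.

```java
public class TearDownSketch {
    // Hypothetical stand-in for java.sql.CallableStatement;
    // only the close()/isClosed() behavior matters here.
    static class FakeStatement {
        private boolean closed = false;
        boolean isClosed() { return closed; }
        void close() { closed = true; }
    }

    FakeStatement cStmt = new FakeStatement();

    // Defensive variant (before the patch): tolerates a test method
    // that closed the statement or nulled the reference itself.
    void tearDownDefensive() {
        if (cStmt != null && !cStmt.isClosed()) {
            cStmt.close();
        }
    }

    // Simplified variant (after the patch): assumes setUp() assigned a
    // non-null statement and no test nulled it. If a test did nullify
    // cStmt, this throws NullPointerException.
    void tearDownSimple() {
        cStmt.close();
        cStmt = null;   // drop the reference for garbage collection
    }

    public static void main(String[] args) {
        TearDownSketch a = new TearDownSketch();
        a.cStmt = null;          // a test "cleaned up" on its own
        a.tearDownDefensive();   // safe: the null check guards it

        TearDownSketch b = new TearDownSketch();
        b.cStmt = null;
        try {
            b.tearDownSimple();  // NPE: no guard in the simple variant
        } catch (NullPointerException e) {
            System.out.println("NPE from simple tearDown");
        }
    }
}
```

Note that calling close() on an already closed statement is harmless (the JDBC javadoc specifies it as a no-op), so the isClosed() part of the old guard was redundant; only the null check made a practical difference.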

The change is okay for me, was just wondering if there was something fishy with isClosed() so that it couldn't be trusted.


--
Kristian
