juliuszsompolski commented on code in PR #43985:
URL: https://github.com/apache/spark/pull/43985#discussion_r1409865055


##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/SessionHolder.scala:
##########
@@ -205,14 +220,26 @@ case class SessionHolder(userId: String, sessionId: String, session: SparkSessio
    * Called only by SparkConnectSessionManager.
    */
   private[connect] def close(): Unit = {
+    // It is called only by SparkConnectSessionManager.closeSession, and only after the session
+    // has been removed from the sessionStore there, which guarantees that it is called only once.
+    if (closedTime.isDefined) {
+      throw new IllegalStateException(s"Session $key is already closed.")
+    }
+
     logInfo(s"Closing session with userId: $userId and sessionId: $sessionId")
 
-    // After isClosing=true, SessionHolder.addExecuteHolder() will not allow new executions for
-    // this session. Because both SessionHolder.addExecuteHolder() and
+    // After closedTime is defined, SessionHolder.addExecuteHolder() will not allow new executions
+    // to be added for this session. Because both SessionHolder.addExecuteHolder() and
     // SparkConnectExecutionManager.removeAllExecutionsForSession() are executed under
     // executionsLock, this guarantees that removeAllExecutionsForSession triggered below will
     // remove all executions and no new executions will be added in the meantime.
-    isClosing = true
+    closedTime = Some(System.currentTimeMillis())

Review Comment:
   I'll think about this locking with a fresh head tomorrow. I agree there is something too complicated here if it requires extra explanation like that :-).
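
   The invariant the diff describes (closedTime doubling as the "is closing" flag, with `addExecuteHolder()` and the close-time drain both serialized on `executionsLock`) can be sketched roughly like this. Note this is a simplified illustration with hypothetical names (`SessionSketch`, `executions`), not the actual Spark Connect code:

   ```scala
   import java.util.concurrent.ConcurrentHashMap

   class SessionSketch {
     private val executionsLock = new Object
     private val executions = new ConcurrentHashMap[String, String]()
     // Doubles as the "is closing" flag: once defined, no new executions are accepted.
     @volatile private var closedTime: Option[Long] = None

     // Returns false instead of adding an execution once the session is closing.
     def addExecuteHolder(id: String): Boolean = executionsLock.synchronized {
       if (closedTime.isDefined) {
         false
       } else {
         executions.put(id, id)
         true
       }
     }

     // Because this holds executionsLock while setting closedTime and draining
     // the map, no execution can slip in between the two steps.
     def close(): Unit = executionsLock.synchronized {
       if (closedTime.isDefined) {
         throw new IllegalStateException("Session is already closed.")
       }
       closedTime = Some(System.currentTimeMillis())
       executions.clear() // stands in for removeAllExecutionsForSession()
     }
   }
   ```

   With both methods under the same lock, the "no new executions after close starts" guarantee falls out directly, which may make the long explanatory comment unnecessary.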



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

