kotlovs edited a comment on pull request #32081:
URL: https://github.com/apache/spark/pull/32081#issuecomment-817048293


   @dongjoon-hyun, @mridulm  thanks a lot for helping with this issue!
   
   Do you mean that this build problem affects only the Hive Thrift Server tests?
   Looking at the code of HiveThriftServer2, it seems to work this way: start the SparkContext and the server, exit main(), and keep running as a server that processes incoming requests.
   Closing the SparkContext after main() returns breaks this behavior.
   
   The code below doesn't look like a good solution, but for test purposes I was able to fix this test issue with:
   ```scala
   if (args.mainClass != "org.apache.spark.sql.hive.thriftserver.HiveThriftServer2") {
     SparkContext.getActive.foreach(_.stop())
   }
   ```
   Maybe we can introduce another application arg, for example _--isServer_ (the HiveThriftServer2 scripts would set it), and take it into account when closing the context?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
