[ https://issues.apache.org/jira/browse/SPARK-34373?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17279279#comment-17279279 ]

Apache Spark commented on SPARK-34373:
--------------------------------------

User 'yaooqinn' has created a pull request for this issue:
https://github.com/apache/spark/pull/31479

> HiveThriftServer2 startWithContext may hang with a race issue 
> --------------------------------------------------------------
>
>                 Key: SPARK-34373
>                 URL: https://issues.apache.org/jira/browse/SPARK-34373
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.1, 3.1.0
>            Reporter: Kent Yao
>            Priority: Major
>
> ```
> 21:43:26.809 WARN org.apache.thrift.server.TThreadPoolServer: Transport error 
> occurred during acceptance of message.
> org.apache.thrift.transport.TTransportException: No underlying server socket.
>       at 
> org.apache.thrift.transport.TServerSocket.acceptImpl(TServerSocket.java:126)
>       at 
> org.apache.thrift.transport.TServerSocket.acceptImpl(TServerSocket.java:35)
>       at org.apache.thrift.transport.TServerTransport.acceException in thread 
> "Thread-15" java.io.IOException: Stream closed
>       at 
> java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:170)
>       at java.io.BufferedInputStream.read(BufferedInputStream.java:336)
>       at java.io.FilterInputStream.read(FilterInputStream.java:107)
>       at scala.sys.process.BasicIO$.loop$1(BasicIO.scala:238)
>       at scala.sys.process.BasicIO$.transferFullyImpl(BasicIO.scala:246)
>       at scala.sys.process.BasicIO$.transferFully(BasicIO.scala:227)
>       at scala.sys.process.BasicIO$.$anonfun$toStdOut$1(BasicIO.scala:221)
> ```
> Due to a race between startup and shutdown, the TServer may still try to serve connections even after stop() has been called, which can cause startWithContext to hang.
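For illustration only, a minimal Java sketch of the general pattern behind this kind of race (hypothetical class and method names, not the actual HiveThriftServer2 or Thrift code): if the serving thread never consults a shared stop flag, a stop() issued before or during startup may leave the serve loop spinning against a closed socket; checking the flag on every iteration lets the loop exit promptly.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical sketch of a serve/stop race and its fix: a shared
// `stopped` flag consulted on every loop iteration, plus a latch so
// stop() can wait until the serving thread has actually exited.
class RacySketch {
    private final AtomicBoolean stopped = new AtomicBoolean(false);
    private final CountDownLatch done = new CountDownLatch(1);

    void serve() {
        // Analogue of a TThreadPoolServer.serve() accept loop. Without
        // the `stopped` check, a closed server socket would just keep
        // throwing ("No underlying server socket") instead of ending
        // the loop.
        while (!stopped.get()) {
            Thread.yield(); // accept()/dispatch would happen here
        }
        done.countDown();
    }

    void stop() throws InterruptedException {
        stopped.set(true); // safe whether serve() has started yet or not
        done.await();      // block until the serving thread observed the flag
    }

    // Runs the scenario end to end; true means serve() terminated.
    static boolean demo() {
        try {
            RacySketch s = new RacySketch();
            Thread t = new Thread(s::serve);
            t.start();
            s.stop();
            t.join(1000);
            return !t.isAlive();
        } catch (InterruptedException e) {
            return false;
        }
    }
}
```

Because stop() both sets the flag and waits on the latch, the ordering of stop() relative to the start of serve() no longer matters; either way the loop terminates and stop() returns.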



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
