[GitHub] spark pull request #15381: [SPARK-17707] [WEBUI] Web UI prevents spark-submi...
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/15381

---

If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastruct...@apache.org or file a JIRA ticket with INFRA.

---

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark pull request #15381: [SPARK-17707] [WEBUI] Web UI prevents spark-submi...
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/15381#discussion_r82339491

--- Diff: sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/thrift/ThriftHttpCLIService.java ---

```
@@ -90,8 +95,21 @@ public void run() {
           Arrays.toString(sslContextFactory.getExcludeProtocols()));
       sslContextFactory.setKeyStorePath(keyStorePath);
       sslContextFactory.setKeyStorePassword(keyStorePassword);
-      connector = new ServerConnector(httpServer, sslContextFactory);
+      connectionFactories = AbstractConnectionFactory.getFactories(
```

--- End diff ---

I don't think so, just because I copied it from the Jetty constructor that was already being called in this path:

```java
/**
 * HTTP Server Connection.
 * Construct a ServerConnector with a private instance of {@link HttpConnectionFactory} as the primary protocol.
 * @param server The {@link Server} this connector will accept connection for.
 * @param sslContextFactory If non null, then a {@link SslConnectionFactory} is instantiated and prepended to the
 *                          list of HTTP Connection Factory.
 */
public ServerConnector(
    @Name("server") Server server,
    @Name("sslContextFactory") SslContextFactory sslContextFactory) {
    this(server, null, null, null, -1, -1,
        AbstractConnectionFactory.getFactories(sslContextFactory, new HttpConnectionFactory()));
}
```

However, if this wasn't actually the intended behavior, we can change it, of course.
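For context, the fix in this PR expands that delegating call so a daemon `Scheduler` can be passed explicitly. A rough sketch of the expanded form, assuming the Jetty 9 `ServerConnector` and `ScheduledExecutorScheduler` APIs (the helper name `daemonConnector` and the scheduler thread name are illustrative, not taken from the patch):

```
import org.eclipse.jetty.server.AbstractConnectionFactory;
import org.eclipse.jetty.server.ConnectionFactory;
import org.eclipse.jetty.server.HttpConnectionFactory;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;
import org.eclipse.jetty.util.ssl.SslContextFactory;
import org.eclipse.jetty.util.thread.ScheduledExecutorScheduler;

static ServerConnector daemonConnector(Server httpServer, SslContextFactory sslContextFactory) {
    // Same factories the two-arg constructor builds internally: an
    // SslConnectionFactory (when sslContextFactory is non-null) prepended
    // to a plain HttpConnectionFactory.
    ConnectionFactory[] connectionFactories = AbstractConnectionFactory.getFactories(
        sslContextFactory, new HttpConnectionFactory());
    // Mirror the delegating constructor's defaults for executor, buffer pool,
    // acceptors, and selectors; override only the Scheduler so its threads are
    // daemons and cannot keep the JVM alive after the application finishes.
    return new ServerConnector(
        httpServer,
        null,                                                    // executor: server default
        new ScheduledExecutorScheduler("jetty-scheduler", true), // daemon = true
        null,                                                    // byte buffer pool: default
        -1, -1,                                                  // acceptors / selectors: defaults
        connectionFactories);
}
```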
[GitHub] spark pull request #15381: [SPARK-17707] [WEBUI] Web UI prevents spark-submi...
Github user zsxwing commented on a diff in the pull request:

    https://github.com/apache/spark/pull/15381#discussion_r82326116

--- Diff: sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/thrift/ThriftHttpCLIService.java ---

```
@@ -90,8 +95,21 @@ public void run() {
           Arrays.toString(sslContextFactory.getExcludeProtocols()));
       sslContextFactory.setKeyStorePath(keyStorePath);
       sslContextFactory.setKeyStorePassword(keyStorePassword);
-      connector = new ServerConnector(httpServer, sslContextFactory);
+      connectionFactories = AbstractConnectionFactory.getFactories(
```

--- End diff ---

This will expose both http and https, and it's a behavior change. Right?
[GitHub] spark pull request #15381: [SPARK-17707] [WEBUI] Web UI prevents spark-submi...
GitHub user srowen opened a pull request:

    https://github.com/apache/spark/pull/15381

[SPARK-17707] [WEBUI] Web UI prevents spark-submit application to be finished

## What changes were proposed in this pull request?

This expands calls to Jetty's simple `ServerConnector` constructor to explicitly specify a `ScheduledExecutorScheduler` that makes daemon threads. It should otherwise result in exactly the same configuration, because the other args are copied from the constructor that is currently called. (I'm not sure we should change the Hive Thriftserver impl, but I did anyway.)

This also adds `sc.stop()` to the quick start guide example.

## How was this patch tested?

Existing tests; _pending_ at least manual verification of the fix.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/srowen/spark SPARK-17707

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/15381.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #15381

----

commit 91144aaa2bf499ed96341ff03cbc31978d69d8e6
Author: Sean Owen
Date:   2016-10-06T18:59:39Z

    Force Jetty ServerConnectors to use daemon Scheduler threads

commit ad19603ff91aff2302735591e3919590c1be2ab9
Author: Sean Owen
Date:   2016-10-06T18:59:57Z

    Emphasize in docs that SparkContext should be stopped at exit
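The daemon-thread point above can be illustrated with plain JDK classes (a minimal sketch, not Spark or Jetty code; the class name and thread names are invented for illustration): a scheduler backed by daemon threads lets the JVM exit once `main` returns, which is exactly why non-daemon Jetty scheduler threads could keep a finished spark-submit application hanging.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ThreadFactory;

public class DaemonSchedulerDemo {

    // Thread factory producing named daemon threads, analogous to what
    // Jetty's ScheduledExecutorScheduler(name, true) does internally.
    static ThreadFactory daemonFactory(String name) {
        return runnable -> {
            Thread t = new Thread(runnable, name);
            t.setDaemon(true);
            return t;
        };
    }

    public static void main(String[] args) throws Exception {
        ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor(daemonFactory("web-ui-scheduler"));

        // Capture the scheduler's worker thread to inspect its daemon flag.
        Thread[] worker = new Thread[1];
        scheduler.submit(() -> worker[0] = Thread.currentThread()).get();

        // Because the worker is a daemon, the JVM can exit when main()
        // returns even though the scheduler was never shut down. With a
        // non-daemon worker, this process would hang here instead.
        System.out.println("worker is daemon: " + worker[0].isDaemon());
    }
}
```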