[GitHub] spark pull request #20287: [SPARK-23121][WEB-UI] When the Spark Streaming ap...
Github user guoxiaolongzte closed the pull request at: https://github.com/apache/spark/pull/20287 --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark pull request #20287: [SPARK-23121][WEB-UI] When the Spark Streaming ap...
Github user smurakozi commented on a diff in the pull request: https://github.com/apache/spark/pull/20287#discussion_r162566289

--- Diff: core/src/main/scala/org/apache/spark/ui/jobs/AllJobsPage.scala ---
@@ -427,17 +430,24 @@ private[ui] class JobDataSource(
     val formattedDuration = duration.map(d => UIUtils.formatDuration(d)).getOrElse("Unknown")
     val submissionTime = jobData.submissionTime
     val formattedSubmissionTime = submissionTime.map(UIUtils.formatDate).getOrElse("Unknown")
-    val lastStageAttempt = store.lastStageAttempt(jobData.stageIds.max)
-    val lastStageDescription = lastStageAttempt.description.getOrElse("")
-
+    var lastStageDescription = ""
--- End diff --

Instead of catching the exception, the logic should be modified to be prepared for a missing stageAttempt.
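[Editor's note] A minimal sketch of the Option-based approach the review suggests. The `lastStageAttempt` stand-in below is hypothetical; it only mimics the assumed behavior of `store.lastStageAttempt`, which throws when the stage data is missing from the store:

```scala
import scala.util.Try

// Hypothetical stand-in for store.lastStageAttempt: assume it throws
// NoSuchElementException when the stage has been evicted from the store.
def lastStageAttempt(stageId: Int): String =
  if (stageId == 1) "count at HdfsWordCount.scala:52"
  else throw new NoSuchElementException(s"No stage with id $stageId")

// Being "prepared for a missing stageAttempt": absence becomes a value
// (None) that callers must handle, rather than an exception to swallow.
def lastStageDescription(stageId: Int): Option[String] =
  Try(lastStageAttempt(stageId)).toOption

// Callers render a placeholder explicitly when the stage is gone.
println(lastStageDescription(1).getOrElse("Unknown"))
println(lastStageDescription(99).getOrElse("Unknown"))
```

With this shape, every call site is forced by the type to decide what a missing stage means, instead of a blanket `try`/`catch` deciding silently.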
[GitHub] spark pull request #20287: [SPARK-23121][WEB-UI] When the Spark Streaming ap...
Github user smurakozi commented on a diff in the pull request: https://github.com/apache/spark/pull/20287#discussion_r162566562

--- Diff: core/src/main/scala/org/apache/spark/ui/jobs/JobPage.scala ---
@@ -335,9 +335,12 @@ private[ui] class JobPage(parent: JobsTab, store: AppStatusStore) extends WebUIP
     content ++= makeTimeline(activeStages ++ completedStages ++ failedStages,
       store.executorList(false), appStartTime)
-
-    content ++= UIUtils.showDagVizForJob(
-      jobId, store.operationGraphForJob(jobId))
+    try {
+      content ++= UIUtils.showDagVizForJob(
+        jobId, store.operationGraphForJob(jobId))
+    } catch {
+      case e => None
+    }
--- End diff --

Same here. We should avoid the situation where the exception is thrown. Catching the exception and doing nothing just hides the problem.
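[Editor's note] The pattern under review, `case e => None`, matches every `Throwable` (an untyped `case e =>` in Scala catches even fatal errors) and silently drops the DAG. A sketch of the alternative the reviewer is pointing at, with `operationGraph` as a hypothetical stand-in for `store.operationGraphForJob`:

```scala
// Hypothetical stand-in: the graph lookup itself reports absence as None
// instead of throwing, so no catch-all is needed at the render site.
def operationGraph(jobId: Int): Option[String] =
  if (jobId == 13) Some("<svg>dag for job 13</svg>") else None

// Render only when the graph exists; an empty Seq means nothing is
// appended to the page content, with no exception ever thrown.
def showDagViz(jobId: Int): Seq[String] =
  operationGraph(jobId).toSeq
```

Here a missing graph yields an empty fragment by construction, while genuinely unexpected failures still surface instead of being hidden.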
[GitHub] spark pull request #20287: [SPARK-23121][WEB-UI] When the Spark Streaming ap...
Github user srowen commented on a diff in the pull request: https://github.com/apache/spark/pull/20287#discussion_r161950447

--- Diff: core/src/main/scala/org/apache/spark/ui/jobs/AllJobsPage.scala ---
@@ -65,10 +65,13 @@ private[ui] class AllJobsPage(parent: JobsTab, store: AppStatusStore) extends We
     }.map { job =>
       val jobId = job.jobId
       val status = job.status
-      val jobDescription = store.lastStageAttempt(job.stageIds.max).description
-      val displayJobDescription = jobDescription
-        .map(UIUtils.makeDescription(_, "", plainText = true).text)
-        .getOrElse("")
+      var displayJobDescription = ""
+      try {
+        displayJobDescription = store.lastStageAttempt(job.stageIds.max).description
+          .map(UIUtils.makeDescription(_, "", plainText = true).text).getOrElse("")
+      } catch {
+        case e => displayJobDescription = job.description.getOrElse("")
--- End diff --

No, you're catching all exceptions. This should check whether the last stage attempt exists rather than catching anything that goes wrong. I don't think it's correct to pretend it exists but is empty. The request is for something that doesn't exist and ideally generates a 404.
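[Editor's note] A sketch of the "check existence, answer 404" shape the reviewer describes. All names here (`findJob`, `JobView`, `render`) are hypothetical illustrations, not Spark's actual API:

```scala
// Hypothetical view model and lookup; the store answers with an Option
// rather than throwing when the job is unknown.
case class JobView(id: Int, description: String)

def findJob(id: Int): Option[JobView] =
  if (id == 13) Some(JobView(13, "HdfsWordCount batch")) else None

// Map the lookup result to an HTTP-style (status, body) pair: a present
// job renders normally, a missing one becomes an explicit 404 instead of
// a blank description that pretends the job exists.
def render(id: Int): (Int, String) =
  findJob(id) match {
    case Some(job) => (200, job.description)
    case None      => (404, s"Job $id not found")
  }
```

The key difference from the patch under review: absence is detected up front and surfaced to the client, so a catch-all `try`/`catch` never enters the picture.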
[GitHub] spark pull request #20287: [SPARK-23121][WEB-UI] When the Spark Streaming ap...
GitHub user guoxiaolongzte opened a pull request: https://github.com/apache/spark/pull/20287

[SPARK-23121][WEB-UI] When the Spark Streaming app is running for a period of time, the page is incorrectly reported when accessing '/jobs' or '/jobs/job?id=13'

## What changes were proposed in this pull request?

When the Spark Streaming app has been running for a period of time, an error page is reported when accessing '/jobs' or '/jobs/job?id=13', and the UI can no longer be accessed.

Test command:

./bin/spark-submit --class org.apache.spark.examples.streaming.HdfsWordCount ./examples/jars/spark-examples_2.11-2.4.0-SNAPSHOT.jar /spark

After the app has been running for a period of time, the UI cannot be accessed; please see the attachments.

Before the fix:

![1](https://user-images.githubusercontent.com/26266482/35024280-8c06f79e-fb79-11e7-8e5c-b804e06945d2.png)
![2](https://user-images.githubusercontent.com/26266482/35024281-8c353906-fb79-11e7-8f99-4e1bfbac9776.png)

## How was this patch tested?

Manual tests.

Please review http://spark.apache.org/contributing.html before opening a pull request.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/guoxiaolongzte/spark SPARK-23121

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/20287.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #20287

commit 03a84436ef2b6227f8bcfdd0b803c9457c8bd5cd
Author: guoxiaolong
Date: 2018-01-17T03:26:19Z

    [SPARK-23121][WEB-UI] When the Spark Streaming app is running for a period of time, the page is incorrectly reported when accessing '/jobs' or '/jobs/job?id=13'