[ https://issues.apache.org/jira/browse/SPARK-13064?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Zhuo Liu updated SPARK-13064:
-----------------------------
    Description:

Any application launched with spark-shell will not have an attemptId field in its REST API response. From the REST API point of view, we might want to force an id for it, i.e., "1".

{code}
{
  "id" : "application_1453789230389_377545",
  "name" : "PySparkShell",
  "attempts" : [ {
    "startTime" : "2016-01-28T02:17:11.035GMT",
    "endTime" : "2016-01-28T02:30:01.355GMT",
    "lastUpdated" : "2016-01-28T02:30:01.516GMT",
    "duration" : 770320,
    "sparkUser" : "huyng",
    "completed" : true
  } ]
}
{code}

> api/v1/application/jobs/attempt lacks "attempId" field for spark-shell
> ----------------------------------------------------------------------
>
>                 Key: SPARK-13064
>                 URL: https://issues.apache.org/jira/browse/SPARK-13064
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Zhuo Liu
>            Priority: Minor
>
> Any application launched with spark-shell will not have an attemptId field
> in its REST API response. From the REST API point of view, we might want to
> force an id for it, i.e., "1".
> {code}
> {
>   "id" : "application_1453789230389_377545",
>   "name" : "PySparkShell",
>   "attempts" : [ {
>     "startTime" : "2016-01-28T02:17:11.035GMT",
>     "endTime" : "2016-01-28T02:30:01.355GMT",
>     "lastUpdated" : "2016-01-28T02:30:01.516GMT",
>     "duration" : 770320,
>     "sparkUser" : "huyng",
>     "completed" : true
>   } ]
> }
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
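As a quick illustration of the report (a minimal sketch in Python using only the standard library; the JSON literal is the sample response quoted in the description, and the check is an assumption about how a client might consume the endpoint), parsing the response shows that shell-application attempts carry no "attemptId" key:

```python
import json

# Sample /api/v1/applications response for a spark-shell application,
# copied from the issue description above.
sample = json.loads("""
{
  "id" : "application_1453789230389_377545",
  "name" : "PySparkShell",
  "attempts" : [ {
    "startTime" : "2016-01-28T02:17:11.035GMT",
    "endTime" : "2016-01-28T02:30:01.355GMT",
    "lastUpdated" : "2016-01-28T02:30:01.516GMT",
    "duration" : 770320,
    "sparkUser" : "huyng",
    "completed" : true
  } ]
}
""")

# Attempts from cluster-mode applications include an "attemptId" key;
# for spark-shell it is absent, which is the behavior reported here.
missing = [a for a in sample["attempts"] if "attemptId" not in a]
print(len(missing))  # -> 1: the only attempt lacks attemptId
```

A client that keys attempt URLs on attemptId (e.g. applications/{id}/{attemptId}/jobs) therefore cannot address shell applications uniformly, which is why forcing a default id such as "1" is proposed.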