[jira] [Updated] (SPARK-2636) Expose job ID in JobWaiter API

2014-08-29  Reynold Xin (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-2636?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Reynold Xin updated SPARK-2636:
-------------------------------

Assignee: Chengxiang Li

> Expose job ID in JobWaiter API
> ------------------------------
>
> Key: SPARK-2636
> URL: https://issues.apache.org/jira/browse/SPARK-2636
> Project: Spark
>  Issue Type: New Feature
>  Components: Java API
>Reporter: Chengxiang Li
>Assignee: Chengxiang Li
>  Labels: hive
>
> In Hive on Spark, we want to track Spark job status through the Spark API.
> The basic idea is as follows (a minimal sketch follows below):
> # Create a Hive-specific Spark listener and register it with the Spark
> listener bus.
> # The Hive-specific listener derives job status from Spark listener events.
> # The Hive driver tracks job status through the Hive-specific listener.
> The current problem is that the Hive driver needs a job identifier to track
> a specific job's status through the listener, but there is no Spark API that
> returns the job identifier (such as a job ID) when a Spark job is submitted.
> I think any other project that tries to track job status through the Spark
> API would suffer from this as well.
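
Below is a minimal, self-contained sketch of the listener approach from the three steps above, written against the public Spark listener hooks (SparkListener, SparkListenerJobStart/SparkListenerJobEnd, and SparkContext.addSparkListener, per the Spark 1.x signatures). The HiveJobStatusListener name and its status bookkeeping are illustrative assumptions, not code from Hive or Spark:

{code:scala}
import org.apache.spark.SparkContext
import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd, SparkListenerJobStart}

import scala.collection.concurrent.TrieMap

// Hypothetical Hive-side listener: records each job's status, keyed by the
// Spark job ID carried on the listener events.
class HiveJobStatusListener extends SparkListener {
  val jobStatus = TrieMap.empty[Int, String]

  override def onJobStart(jobStart: SparkListenerJobStart): Unit =
    jobStatus(jobStart.jobId) = "RUNNING"

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
    jobStatus(jobEnd.jobId) = jobEnd.jobResult.toString  // JobSucceeded or JobFailed(...)
}

object HiveJobStatusDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local", "hive-job-status-demo")
    val listener = new HiveJobStatusListener
    sc.addSparkListener(listener)     // step 1: register on the listener bus

    sc.parallelize(1 to 100).count()  // runs under a job ID chosen internally by Spark

    // Steps 2-3: the driver can now read per-job status, but only keyed by
    // job IDs that the submitting code never received. That missing link
    // between submission and job ID is the gap this issue describes.
    listener.jobStatus.foreach { case (id, status) => println(s"job $id: $status") }
    sc.stop()
  }
}
{code}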




[jira] [Updated] (SPARK-2636) Expose job ID in JobWaiter API

2014-08-29  Reynold Xin (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-2636?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Reynold Xin updated SPARK-2636:
-------------------------------

Target Version/s: 1.2.0
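
For illustration, here is the shape such an API could take: submission hands back a handle that reports the job ID(s) it produced, so listener-derived status can be correlated with the submitting call. The jobIds member below is an assumption for this sketch (it is the very thing the issue requests), not an API that existed when this thread was written; countAsync and the FutureAction handle it returns are real Spark APIs:

{code:scala}
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._  // rddToAsyncRDDActions implicit (Spark 1.x)

object JobIdHandleDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local", "job-id-handle-demo")

    // countAsync() submits a job and returns a FutureAction handle.
    val action = sc.parallelize(1 to 100).countAsync()

    // Assumed member, shown for illustration only: the job ID(s) behind this
    // submission, which the caller could look up in its listener bookkeeping.
    val ids: Seq[Int] = action.jobIds
    println(s"submitted job IDs: $ids")

    sc.stop()
  }
}
{code}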





[jira] [Updated] (SPARK-2636) Expose job ID in JobWaiter API

2014-08-28  Patrick Wendell (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-2636?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell updated SPARK-2636:
-----------------------------------

Summary: Expose job ID in JobWaiter API  (was: no where to get job 
identifier while submit spark job through spark API)




--
This message was sent by Atlassian JIRA
(v6.2#6252)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org