[ https://issues.apache.org/jira/browse/PIG-5052?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15630611#comment-15630611 ]

Adam Szita commented on PIG-5052:
---------------------------------

Just one remark: I think sparkContext.getConf().getAppId() will return the same
value for the same SparkContext. Since we're not creating a new SparkContext
every time we run a job, multiple jobs will end up with the same ID. Would that
still be fine for our use cases (e.g. org.apache.pig.builtin.RANDOM#exec)?
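
To illustrate the concern, here is a minimal, hypothetical sketch (not part of
the attached patch; the class and helper names are made up for illustration):
because getAppId() is fixed for the lifetime of the SparkContext, every JobConf
built from it would carry the same id, and one conceivable way to keep ids
distinct per submitted job would be to append a per-job counter.

{code}
import java.util.concurrent.atomic.AtomicInteger;

import org.apache.spark.SparkContext;

// Hypothetical helper: getAppId() stays the same for the whole SparkContext,
// so a per-job counter is appended to keep the ids distinct across jobs
// submitted through the same context.
public class DistinctJobIdSketch {
    private static final AtomicInteger JOB_COUNTER = new AtomicInteger(0);

    public static String nextJobId(SparkContext sparkContext) {
        // e.g. "app-20161102120000-0001_3" for the third job in this context
        return sparkContext.getConf().getAppId() + "_" + JOB_COUNTER.incrementAndGet();
    }
}
{code}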

> Initialize MRConfiguration.JOB_ID in spark mode correctly
> ---------------------------------------------------------
>
>                 Key: PIG-5052
>                 URL: https://issues.apache.org/jira/browse/PIG-5052
>             Project: Pig
>          Issue Type: Sub-task
>          Components: spark
>            Reporter: liyunzhang_intel
>            Assignee: liyunzhang_intel
>             Fix For: spark-branch
>
>         Attachments: PIG-5052.patch
>
>
> Currently, we initialize MRConfiguration.JOB_ID in SparkUtil#newJobConf by
> setting it to a random string:
> {code}
>         jobConf.set(MRConfiguration.JOB_ID, UUID.randomUUID().toString());
> {code}
> We need to find a Spark API to initialize it correctly.
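
For reference, a minimal sketch of what the swap discussed above might look
like, assuming the fix simply replaces the random UUID with the Spark
application id; the actual change is in PIG-5052.patch and may differ, and the
import paths are assumptions based on the Pig and Hadoop codebases.

{code}
import org.apache.hadoop.mapred.JobConf;
import org.apache.spark.SparkContext;
import org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRConfiguration;

// Sketch only: derive the MR job id from the Spark application id instead of
// a random UUID. Note that this id is shared by every job submitted through
// the same SparkContext (see the remark above).
public class NewJobConfSketch {
    public static void setJobId(JobConf jobConf, SparkContext sparkContext) {
        jobConf.set(MRConfiguration.JOB_ID, sparkContext.getConf().getAppId());
    }
}
{code}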



