liyunzhang_intel created PIG-5051:
-------------------------------------

             Summary: Initialize PigConstants.TASK_INDEX in Spark mode correctly
                 Key: PIG-5051
                 URL: https://issues.apache.org/jira/browse/PIG-5051
             Project: Pig
          Issue Type: Sub-task
            Reporter: liyunzhang_intel


In MR, PigConstants.TASK_INDEX is initialized in
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce.Reduce#setup:

{code}
protected void setup(Context context) throws IOException, InterruptedException {
    ...
    context.getConfiguration().set(PigConstants.TASK_INDEX,
            Integer.toString(context.getTaskAttemptID().getTaskID().getId()));
    ...
}
{code}
But Spark does not provide a hook like PigGenericMapReduce.Reduce#setup that runs when a task starts, so there is currently no place to initialize PigConstants.TASK_INDEX. We need to find a way to initialize PigConstants.TASK_INDEX correctly in Spark mode. A rough sketch of one possible direction is given below.
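
One possible direction, as a sketch only and not a verified fix: Spark exposes the currently running task via org.apache.spark.TaskContext, so a setup-like call invoked at the start of each Spark task could copy the partition id into the per-task job configuration, mirroring what Reduce#setup does in MR. The class and method names below (SparkTaskIndexSetup, initTaskIndex) are hypothetical; only PigConstants.TASK_INDEX, Hadoop's Configuration, and Spark's TaskContext are existing APIs, and the assumption is that each task has access to its own Configuration copy.

{code}
import org.apache.hadoop.conf.Configuration;
import org.apache.pig.PigConstants;
import org.apache.spark.TaskContext;

public class SparkTaskIndexSetup {

    // Hypothetical helper, intended to be called at the beginning of each
    // Spark task, playing the role that PigGenericMapReduce.Reduce#setup
    // plays in MR.
    public static void initTaskIndex(Configuration jobConf) {
        TaskContext taskContext = TaskContext.get(); // null on the driver
        if (taskContext != null) {
            // Use the Spark partition id as the task index, roughly
            // analogous to getTaskAttemptID().getTaskID().getId() in MR.
            jobConf.set(PigConstants.TASK_INDEX,
                    Integer.toString(taskContext.partitionId()));
        }
    }
}
{code}

The partition id is a per-task index within the stage, which roughly corresponds to the MR task id; whether it is an acceptable substitute in all cases (e.g. multi-query plans spanning several stages) would still need to be verified.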



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
