[ https://issues.apache.org/jira/browse/PIG-5283?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16113817#comment-16113817 ]

liyunzhang_intel commented on PIG-5283:
---------------------------------------

[~nkollar] and [~szita]: Can we set the correct value of 
CommonConfigurationKeys.IO_SERIALIZATIONS_KEY in Pig on Spark to avoid the 
problem?
If we cannot, +1 for the patch.
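
For illustration, setting that key on the job's Hadoop Configuration would look roughly like the sketch below. This is only a sketch of the suggestion: the serializer list shown is Hadoop's usual default for io.serializations, and where such a call would actually be wired into the Pig on Spark backend (and whether it reaches the split deserialization path at all) is an assumption.

{code:java}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.CommonConfigurationKeys;

public class IoSerializationsSketch {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Hadoop's usual default serializer list; the exact list Pig on Spark
        // would need is an assumption for illustration.
        conf.setStrings(CommonConfigurationKeys.IO_SERIALIZATIONS_KEY,
                "org.apache.hadoop.io.serializer.WritableSerialization",
                "org.apache.hadoop.io.serializer.avro.AvroSpecificSerialization",
                "org.apache.hadoop.io.serializer.avro.AvroReflectSerialization");
        System.out.println(conf.get(CommonConfigurationKeys.IO_SERIALIZATIONS_KEY));
    }
}
{code}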

> Configuration is not passed to SparkPigSplits on the backend
> ------------------------------------------------------------
>
>                 Key: PIG-5283
>                 URL: https://issues.apache.org/jira/browse/PIG-5283
>             Project: Pig
>          Issue Type: Bug
>          Components: spark
>            Reporter: Adam Szita
>            Assignee: Adam Szita
>         Attachments: PIG-5283.0.patch
>
>
> When a Hadoop ObjectWritable is created during a Spark job, the instantiated 
> PigSplit (wrapped into a SparkPigSplit) is given an empty Configuration 
> instance.
> This happens 
> [here|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SerializableWritable.scala#L44]
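
For context, the deserialization path at the linked line amounts to roughly the following when written in Java. This is a sketch using Hadoop's ObjectWritable API, mirroring what the issue describes rather than quoting the Spark source; the wrapped value stands in for the SparkPigSplit.

{code:java}
import java.io.IOException;
import java.io.ObjectInputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.ObjectWritable;

public class SerializableWritableSketch {
    // Roughly what happens on the executor when the wrapped split is read
    // back: the ObjectWritable is handed a fresh, empty Configuration
    // (new Configuration(false) loads no defaults and no job settings), and
    // that is the conf the deserialized split ends up with.
    static Object readWrapped(ObjectInputStream in) throws IOException {
        ObjectWritable ow = new ObjectWritable();
        ow.setConf(new Configuration(false)); // empty conf, not the job conf
        ow.readFields(in);                    // deserializes the wrapped split
        return ow.get();
    }
}
{code}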



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
