Hi Mark, you are right: we had another Pig 0.10.0 jar on the classpath.
Thanks again.
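For anyone hitting the same StreamCorruptedException, a quick way to confirm a duplicate jar is to ask the classloader for every copy of a class it can see. The snippet below is just a sketch (the class `WhichJar` is hypothetical, not part of Pig); it looks up the Pig class from the stack trace by default, and more than one result means two Pig versions are on the classpath:

```java
import java.net.URL;
import java.util.Enumeration;

// Hypothetical diagnostic class, not from the thread: prints every location
// on the classpath that provides a given class resource.
public class WhichJar {
    public static void main(String[] args) throws Exception {
        // Resource name in "pkg/Name.class" form; defaults to the Pig
        // serializer that appears in the stack trace below.
        String resource = args.length > 0
                ? args[0]
                : "org/apache/pig/impl/util/ObjectSerializer.class";
        Enumeration<URL> urls =
                WhichJar.class.getClassLoader().getResources(resource);
        int copies = 0;
        while (urls.hasMoreElements()) {
            System.out.println(urls.nextElement()); // one line per jar/dir
            copies++;
        }
        System.out.println("copies found: " + copies);
    }
}
```

Run it with the job's classpath (e.g. `java -cp "$PIG_CLASSPATH" WhichJar`); anything other than `copies found: 1` points at a duplicate-jar problem like the one above.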


On Wed, Jul 3, 2013 at 9:12 PM, Jian Fang <jian.fang.subscr...@gmail.com> wrote:

> Thanks Mark. But I am pretty sure that I only have one version of pig. I
> wonder if it is a backward compatibility issue of pig 0.11.1.
>
>
> On Wed, Jul 3, 2013 at 5:57 PM, Mark Wagner <wagner.mar...@gmail.com> wrote:
>
>> Hi Jian,
>>
>> I've seen this before when multiple versions of Pig are on your classpath.
>> I suggest looking around to see if something similar might be happening to
>> you.
>>
>> -Mark
>>
>>
>> > On Wed, Jul 3, 2013 at 3:15 PM, Jian Fang <jian.fang.subscr...@gmail.com> wrote:
>>
>> > Resend
>> >
>> >
>> > > On Wed, Jul 3, 2013 at 2:34 PM, Jian Fang <jian.fang.subscr...@gmail.com> wrote:
>> >
>> > > Hi,
>> > >
>> > > Our Pig UDF works fine on Pig 0.9.2.2, but after we upgraded to
>> > > Pig 0.11.1 the Pig job failed with the following error.
>> > >
>> > > at org.apache.pig.impl.util.ObjectSerializer.deserialize(ObjectSerializer.java:55)
>> > > at org.apache.pig.impl.util.UDFContext.deserialize(UDFContext.java:192)
>> > > at org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil.setupUDFContext(MapRedUtil.java:159)
>> > > at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:229)
>> > > at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getOutputCommitter(PigOutputFormat.java:275)
>> > > at org.apache.hadoop.mapred.Task.initialize(Task.java:515)
>> > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:358)
>> > > at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>> > > at java.security.AccessController.doPrivileged(Native Method)
>> > > at javax.security.auth.Subject.doAs(Subject.java:396)
>> > > at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1132)
>> > > at org.apache.hadoop.mapred.Child.main(Child.java:249)
>> > > Caused by: java.io.StreamCorruptedException: invalid stream header: 2DF52715
>> > > at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:782)
>> > > at java.io.ObjectInputStream.<init>(ObjectInputStream.java:279)
>> > > at org.apache.pig.impl.util.ObjectSerializer.deserialize(ObjectSerializer.java:52)
>> > >
>> > > Does anyone know what was wrong and how to resolve it?
>> > >
>> > > Thanks,
>> > >
>> > > John
>> > >
>> >
>>
>
>
