On May 23, 2012, at 5:48 PM, Alejandro Abdelnur wrote:

> Mmmh,
> 
> I can think of 2 possible reasons:
> 
> 1* are you putting all pig dependency JARs except the Hadoop ones in your
> share/lib/pig/ directory?
> 2* the pig JAR you are including has ALL Hadoop JARs & dependencies in it
> and those don't match your cluster version
> 
> The correct thing would be to modify oozie sharelib/pig/pom to use the
> right version of Pig.
Cool - so I updated the version for Pig, which I found in the root pom.xml - I 
guess it's kept there to have configuration settings in one place.  I rebuilt, 
copied those libs into share/lib/pig, and reran - same error.  However, I'll 
try messing around with whether it uses the without-hadoop version of Pig 0.10 
and see if that helps.  Will post when I get things working.  Thanks for the 
insights!
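
For reference, the change was along these lines - the property name here is my 
guess at how it looks, so check what the root pom.xml actually calls it:

```xml
<!-- root pom.xml: property name is an assumption; the real one may differ -->
<properties>
  <pig.version>0.10.0</pig.version>
</properties>
```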


> 
> thx
> 
> 
> On Wed, May 23, 2012 at 12:51 PM, Jeremy Hanna
> <[email protected]>wrote:
> 
>> I'm using Pig 0.10.0 with Oozie 3.2 and am getting an error when trying to
>> initialize a Pig script.  The way I got Pig 0.10.0 in there was to build it
>> from source, clear out the contents of the Oozie share/lib/pig, and copy in
>> the jars from Pig 0.10.0's lib directory plus the pig-withouthadoop jar.
>> 
>> I get this error when it tries to run:
>> 
>> Run pig script using PigRunner.run() for Pig version 0.8+
>> ERROR org.apache.pig.Main - ERROR 2999: Unexpected internal error. unable
>> to read pigs manifest file
>> WARN  org.apache.pig.Main - There is no log file to write to.
>> ERROR org.apache.pig.Main - java.lang.RuntimeException: unable to read
>> pigs manifest file
>>       at org.apache.pig.Main.getVersionString(Main.java:737)
>>       at org.apache.pig.Main.run(Main.java:235)
>>       at org.apache.pig.PigRunner.run(PigRunner.java:49)
>>       at
>> org.apache.oozie.action.hadoop.PigMain.runPigJob(PigMain.java:282)
>>       at org.apache.oozie.action.hadoop.PigMain.run(PigMain.java:218)
>>       at
>> org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:37)
>>       at org.apache.oozie.action.hadoop.PigMain.main(PigMain.java:76)
>>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>       at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>       at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>       at java.lang.reflect.Method.invoke(Method.java:597)
>>       at
>> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:467)
>>       at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
>>       at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
>>       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>>       at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>       at java.security.AccessController.doPrivileged(Native Method)
>>       at javax.security.auth.Subject.doAs(Subject.java:396)
>>       at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>>       at org.apache.hadoop.mapred.Child.main(Child.java:249)
>> Caused by: java.lang.NullPointerException
>>       at org.apache.pig.Main.getVersionString(Main.java:729)
>>       ... 19 more
>> 
>> The odd thing is that when I use the same compiled Pig on the command line
>> - running pig -i, which is what the exception points to - I get this output:
>> /opt/pig$ pig -i
>> Apache Pig version 0.10.0-SNAPSHOT (rexported)
>> compiled May 17 2012, 20:38:57
>> 
>> Is there something I need to do differently when putting the pig libs into
>> the share/lib/pig or maybe make sure the version is non-SNAPSHOT or
>> something?  Seems odd to me that it would get an NPE though.
> 
> 
> 
> 
> -- 
> Alejandro
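
For anyone hitting the same NPE later: judging from the stack trace above, 
Pig's Main.getVersionString reads version info out of the jar's manifest, and 
the NullPointerException is consistent with that lookup coming back null.  The 
sketch below (my own illustration, not Pig's actual code) shows the kind of 
manifest lookup involved and why it fails when the classes aren't loaded from 
a jar that carries a META-INF/MANIFEST.MF:

```java
import java.io.File;
import java.net.URL;
import java.util.jar.JarFile;
import java.util.jar.Manifest;

// Illustration only: mimics the kind of manifest lookup that
// Main.getVersionString appears to do, based on the stack trace above.
public class ManifestCheck {
    public static void main(String[] args) throws Exception {
        // Find where this class was loaded from (a jar or a directory).
        URL loc = ManifestCheck.class.getProtectionDomain()
                                     .getCodeSource().getLocation();
        System.out.println("loaded from: " + loc);

        if (loc.getPath().endsWith(".jar")) {
            try (JarFile jar = new JarFile(new File(loc.toURI()))) {
                // getManifest() returns null when the jar has no
                // META-INF/MANIFEST.MF - dereferencing that null is the
                // same failure mode as the NPE in getVersionString.
                Manifest mf = jar.getManifest();
                System.out.println(mf == null ? "no manifest"
                                              : "manifest present");
            }
        } else {
            // Classes unpacked onto a plain directory classpath have no
            // jar manifest at all, which can happen when jars get
            // repackaged or exploded into a launcher's working directory.
            System.out.println("not running from a jar");
        }
    }
}
```

If a check like this reports no manifest for the jar dropped into 
share/lib/pig, using the jar straight out of the Pig build (rather than an 
unpacked and re-zipped copy) may avoid the NPE.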
