Not my day, I guess.

Trying with Hadoop 1.2.x.

Getting:


Caused by: java.lang.RuntimeException: could not instantiate
'org.apache.pig.piggybank.storage.avro.AvroStorage' with arguments 'null'
        at
org.apache.pig.impl.PigContext.instantiateFuncFromSpec(PigContext.java:618)

Caused by: java.lang.NoClassDefFoundError:
org/json/simple/parser/ParseException
        at java.lang.Class.getDeclaredConstructors0(Native Method)



This happens when I attempt to load the relation using

records = LOAD 'path' USING org.apache.pig.piggybank.storage.avro.AvroStorage();

and I've already registered json-simple-1.1.jar.
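
For completeness, a minimal sketch of the setup I'm aiming for. The jar paths
are placeholders for wherever the jars actually live, and putting the jars on
PIG_CLASSPATH as well as REGISTERing them is an assumption on my part about
making the class visible to both the Pig frontend and the backend:

    # shell, before starting grunt (jar locations are examples)
    export PIG_CLASSPATH=/path/to/json-simple-1.1.jar:/path/to/piggybank.jar:$PIG_CLASSPATH
    pig

    -- then inside grunt
    REGISTER /path/to/json-simple-1.1.jar;
    REGISTER /path/to/piggybank.jar;
    records = LOAD 'path' USING org.apache.pig.piggybank.storage.avro.AvroStorage();
    DESCRIBE records;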


On Thu, Sep 19, 2013 at 3:14 PM, j.barrett Strausser <
j.barrett.straus...@gmail.com> wrote:

> Are the releases from the download page not compatible with 0.23.x or 2.x?
>
> Says they are -
> http://pig.apache.org/releases.html#1+April%2C+2013%3A+release+0.11.1+available
>
> In any case, I tried it with 0.23.9 and received a different error:
>
> 2013-09-19 15:13:56,044 [main] WARN
> org.apache.pig.backend.hadoop20.PigJobControl - falling back to default
> JobControl (not using hadoop 0.20 ?)
> java.lang.NoSuchFieldException: runnerState
>     at java.lang.Class.getDeclaredField(Class.java:1938)
>
>
>
>
>
> On Thu, Sep 19, 2013 at 2:24 PM, Mark Wagner <wagner.mar...@gmail.com> wrote:
>
>> It sounds like you're using a version of Pig that wasn't compiled for
>> Hadoop 2.x/.23. Try recompiling with 'ant clean jar
>> -Dhadoopversion=23'.
>>
>> -Mark
>>
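
For my own reference, a rough sketch of the rebuild Mark suggests, run from
the Pig 0.11.1 source root. Rebuilding piggybank with the same flag is my
assumption, since AvroStorage lives in piggybank.jar:

    # from the Pig source root (command from Mark's mail)
    ant clean jar -Dhadoopversion=23
    # then rebuild piggybank the same way (my assumption)
    cd contrib/piggybank/java
    ant clean
    ant -Dhadoopversion=23
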
>> On Thu, Sep 19, 2013 at 9:23 AM, j.barrett Strausser
>> <j.barrett.straus...@gmail.com> wrote:
>> > Running
>> >
>> > Hadoop-2.1.0-Beta
>> > Pig-0.11.1
>> > Hive-0.11.1
>> >
>> > 1. Created Avro backed table in Hive.
>> > 2. Loaded the table in Pig - records = Load '/path' USING
>> > org.apache.pig.piggybank.storage.avro.AvroStorage();
>> > 3. Can successfully describe the relation.
>> >
>> > I registered the following at Pig startup:
>> > REGISTER piggybank.jar
>> > REGISTER avro-*.jar
>> > REGISTER jackson-core-asl-1.8.8.jar
>> > REGISTER jackson-mapper-asl-1.8.8.jar
>> > REGISTER json-simple-1.1.jar
>> > REGISTER snappy-java-1.0.3.2.jar
>> >
>> > The Avro tools are at 1.7.5.
>> >
>> >
>> >
>> >
>> >
>> > Running DUMP produces the following:
>> >
>> > 2013-09-19 12:08:21,639 [JobControl] ERROR
>> > org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl - Error while
>> trying
>> > to run jobs.
>> > java.lang.IncompatibleClassChangeError: Found interface
>> > org.apache.hadoop.mapreduce.JobContext, but class was expected
>> >     at
>> >
>> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:225)
>> >     at
>> >
>> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:186)
>> >     at
>> >
>> org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:441)
>> >     at
>> >
>> org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:340)
>> >
>> >
>> >
>> > 2013-09-19 12:08:21,651 [main] ERROR
>> > org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to
>> > recreate exception from backend error: Unexpected System Error Occured:
>> > java.lang.IncompatibleClassChangeError: Found interface
>> > org.apache.hadoop.mapreduce.JobContext, but class was expected
>> >     at
>> >
>> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:225)
>> >     at
>> >
>> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:186)
>> >     at
>> >
>> org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:441)
>> >     at
>> >
>> org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:340)
>> >
>> > Running ILLUSTRATE produces:
>> >
>> > Pig Stack Trace
>> > ---------------
>> > ERROR 1070: Could not resolve
>> > org.apache.pig.piggybank.storage.avro.AvroStorage using imports: [,
>> > org.apache.pig.builtin., org.apache.pig.impl.builtin.]
>> >
>> > Pig Stack Trace
>> > ---------------
>> > ERROR 2998: Unhandled internal error.
>> >
>> org.apache.hadoop.mapreduce.Mapper$Context.<init>(Lorg/apache/hadoop/mapreduce/Mapper;Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/mapreduce/TaskAttemptID;L$
>> >
>> > java.lang.NoSuchMethodError:
>> >
>> org.apache.hadoop.mapreduce.Mapper$Context.<init>(Lorg/apache/hadoop/mapreduce/Mapper;Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/mapreduce/TaskAttemptID;Lorg/apach$
>> >
>> >
>> >
>> > Any thoughts?
>> >
>> > --
>> >
>> >
>> > https://github.com/bearrito
>> > @deepbearrito
>>
>
>
>
> --
>
>
> https://github.com/bearrito
> @deepbearrito
>



-- 


https://github.com/bearrito
@deepbearrito
