Yup, that looks like a classpath issue. Also, make sure to compile Pig against
the correct Hadoop version if you are using the fat jar.
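For reference, rebuilding both against Hadoop 2 looks roughly like the
following. Treat this as a sketch rather than exact commands - the profile
flags and build targets vary between Pig and Phoenix releases, so check each
project's build file first:

```shell
# Rebuild Pig's fat jar against the Hadoop 2.x API line.
# (-Dhadoopversion=23 selected the Hadoop 2 profile in Pig 0.10-0.12 era builds;
# verify the property name in your Pig checkout's build.xml.)
ant clean jar -Dhadoopversion=23

# Rebuild Phoenix against the matching Hadoop profile.
# (Profile names differ between Phoenix releases; check the pom.xml.)
mvn clean package -DskipTests -Dhadoop.profile=2
```

Then make sure the resulting jars - and not a stale Hadoop 1.x-compiled copy -
are what ends up on the task classpath.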


On Tue, Feb 11, 2014 at 9:05 PM, Skanda <[email protected]> wrote:

> Hi Russell,
>
> Which version of HBase and Hadoop are you using? The reason for this issue
> is that TaskAttemptContext is an interface in Hadoop 2.x but is a class in
> Hadoop 1.x.
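(A quick way to confirm which lineage is actually on the classpath is to load
the class reflectively and check whether it is an interface. The sketch below
is illustrative: `ApiShapeCheck` is a hypothetical name, and it uses stand-in
JDK types so it compiles anywhere; with the Hadoop jars present you would pass
`org.apache.hadoop.mapreduce.TaskAttemptContext` instead.)

```java
// Code compiled against Hadoop 1.x expects TaskAttemptContext to be a class;
// Hadoop 2.x ships it as an interface, so the JVM throws
// IncompatibleClassChangeError at link time. This check reports which shape
// a named class has on the current classpath.
public class ApiShapeCheck {
    static String shapeOf(String className) throws ClassNotFoundException {
        Class<?> c = Class.forName(className);
        return c.isInterface() ? "interface" : "class";
    }

    public static void main(String[] args) throws Exception {
        // With the Hadoop jars on the classpath, pass
        // "org.apache.hadoop.mapreduce.TaskAttemptContext" here:
        // "interface" means Hadoop 2.x jars, "class" means Hadoop 1.x jars.
        System.out.println(shapeOf("java.util.List"));      // -> interface
        System.out.println(shapeOf("java.util.ArrayList")); // -> class
    }
}
```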
>
> Regards,
> Skanda
>
>
> On Wed, Feb 12, 2014 at 10:06 AM, James Taylor <[email protected]> wrote:
>
>> This is beyond my knowledge of Pig, but Prashant may know as he
>> contributed our Pig integration.
>>
>> Thanks,
>> James
>>
>>
>> On Tue, Feb 11, 2014 at 4:34 PM, Russell Jurney <[email protected]> wrote:
>>
>>> I am trying to store data into this table:
>>>
>>> CREATE TABLE IF NOT EXISTS BEACONING_ACTIVITY (
>>>
>>> EVENT_TIME VARCHAR NOT NULL,
>>> C_IP VARCHAR NOT NULL,
>>> CS_HOST VARCHAR NOT NULL,
>>> SLD VARCHAR NOT NULL,
>>> CONFIDENCE DOUBLE NOT NULL,
>>> RISK DOUBLE NOT NULL,
>>> ANOMOLY DOUBLE NOT NULL,
>>> INTERVAL DOUBLE NOT NULL,
>>> CONSTRAINT PK PRIMARY KEY (EVENT_TIME, C_IP, CS_HOST)
>>> );
>>>
>>>
>>> Using this Pig:
>>>
>>> hosts_and_risks = FOREACH hosts_and_anomaly GENERATE hour, c_ip,
>>> cs_host, sld, confidence, (confidence * anomaly) AS risk:double, anomaly,
>>> interval;
>>> --hosts_and_risks = ORDER hosts_and_risks BY risk DESC;
>>> --STORE hosts_and_risks INTO '/tmp/beacons.txt';
>>> STORE hosts_and_risks into 'hbase://BEACONING_ACTIVITY' using
>>> com.salesforce.phoenix.pig.PhoenixHBaseStorage('hiveapp1','-batchSize
>>> 5000');
>>>
>>> And the most helpful error message I get is this:
>>>
>>> 2014-02-11 16:24:13,831 FATAL org.apache.hadoop.mapred.Child: Error running child : java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>>>     at com.salesforce.phoenix.pig.hadoop.PhoenixOutputFormat.getRecordWriter(PhoenixOutputFormat.java:75)
>>>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getRecordWriter(PigOutputFormat.java:84)
>>>     at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:597)
>>>     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:444)
>>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>>>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
>>>
>>>
>>> What am I to do?
>>>
>>>
>>> --
>>> Russell Jurney twitter.com/rjurney [email protected] datasyndrome.com
>>>
>>
>>
>
