Hi all,


I have been using Hadoop and HBase.



For HBase to run properly it needs the Hadoop-0.20-append jar files, so that is
what my Hadoop install uses, and with those jars both Hadoop and HBase work
fine.

Now I want to use Pig with my existing Hadoop and HBase clusters.

I downloaded Pig 0.8.0 and configured it to run in MapReduce mode by setting
PIG_CLASSPATH to point to the $HADOOP_HOME/conf directory. Running 'pig' then
gives the following error message.
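For reference, my setup amounts to roughly the following (the exact paths here are from my machine, so treat them as assumptions):

```shell
# Environment for running pig in mapreduce mode (paths from my install)
export HADOOP_HOME=/usr/local/hadoop          # assumed install location
export PIG_CLASSPATH=$HADOOP_HOME/conf        # so pig picks up the cluster's config
export PATH=$PATH:/usr/local/pig/bin
```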



hadoop@ub13:/usr/local/pig/bin$ pig

2011-07-01 17:41:52,150 [main] INFO  org.apache.pig.Main - Logging error messages to: /usr/local/pig/bin/pig_1309522312144.log
2011-07-01 17:41:52,454 [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://ub13:54310
2011-07-01 17:41:52,654 [main] ERROR org.apache.pig.Main - ERROR 2999: Unexpected internal error. Failed to create DataStorage

LOG MESSAGE -----
Error before Pig is launched ---------------------------
ERROR 2999: Unexpected internal error. Failed to create DataStorage

java.lang.RuntimeException: Failed to create DataStorage
        at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:75)
        at org.apache.pig.backend.hadoop.datastorage.HDataStorage.<init>(HDataStorage.java:58)
        at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:214)
        at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:134)
        at org.apache.pig.impl.PigContext.connect(PigContext.java:183)
        at org.apache.pig.PigServer.<init>(PigServer.java:226)
        at org.apache.pig.PigServer.<init>(PigServer.java:215)
        at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:55)
        at org.apache.pig.Main.run(Main.java:452)
        at org.apache.pig.Main.main(Main.java:107)
Caused by: org.apache.hadoop.ipc.RPC$VersionMismatch: Protocol org.apache.hadoop.hdfs.protocol.ClientProtocol version mismatch. (client = 41, server = 43)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
        at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
        at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:72)
        ... 9 more
================================================================================
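In case it helps with the diagnosis: the RPC$VersionMismatch (client = 41, server = 43) means the Hadoop client classes that Pig loaded come from a different hadoop-core build than the one the namenode is running. One way to confirm which hadoop classes each side is actually using (the jar path here is an assumption based on my layout):

```shell
# The server side reports its build:
$HADOOP_HOME/bin/hadoop version

# In pig 0.8 the hadoop classes are typically bundled inside pig's own jar;
# check whether that is the case (jar name assumed from the 0.8.0 tarball):
jar tf /usr/local/pig/pig-0.8.0-core.jar | grep 'org/apache/hadoop/ipc/RPC'
```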

I guess the problem is a version mismatch between the hadoop-append core jar
files that my Hadoop/HBase clusters are currently using and the hadoop-core jar
files that Pig is using. Has anyone faced a similar issue?
The documentation website lists hadoop-0.20.2 as the requirement, but the
problem is that I want to use Pig alongside my existing Hadoop and HBase.

Any suggestions on how to resolve this issue?
Can anyone please mention which versions of each of them are compatible with
one another, so I can safely put them into production together?
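For what it's worth, the workaround I have seen suggested elsewhere (I have not verified it myself) is to stop using the hadoop classes bundled inside Pig's jar and put the cluster's own append jar first on Pig's classpath. Roughly like this, where the jar name and the ant target are assumptions; if `jar-withouthadoop` is not in 0.8's build.xml, people describe repacking pig.jar with the cluster's hadoop classes instead:

```shell
# Build a pig jar that does not bundle hadoop (target name assumed)
cd /usr/local/pig
ant jar-withouthadoop

# Launch pig with the cluster's own hadoop-append jar and conf dir first
# (jar filename below is illustrative; use whatever your append build produced)
export PIG_CLASSPATH="$HADOOP_HOME/hadoop-core-append.jar:$HADOOP_HOME/conf"
bin/pig
```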

Thanks,
Praveenesh
