I've built pig-withouthadoop.jar and have copied it to my Linux box.
Now how do I put hadoop-core-0.20.203.0.jar and pig-withouthadoop.jar
on the classpath? Is it by using the CLASSPATH variable?
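
For reference, here is roughly what I was planning to try before launching
pig -- the jar locations below are just where I happened to copy things on
my box, and I'm not sure whether bin/pig picks up PIG_CLASSPATH or the plain
CLASSPATH variable:

    # guesses: adjust the paths to wherever the jars actually live
    export PIG_CLASSPATH=/opt/pig/pig-withouthadoop.jar:/opt/hadoop/hadoop-core-0.20.203.0.jar
    # or, if it's the plain CLASSPATH variable instead:
    export CLASSPATH=$CLASSPATH:/opt/pig/pig-withouthadoop.jar:/opt/hadoop/hadoop-core-0.20.203.0.jar
    pig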

On Thu, May 26, 2011 at 10:18 AM, Mohit Anchlia <mohitanch...@gmail.com> wrote:
> On Thu, May 26, 2011 at 10:06 AM, Jonathan Coveney <jcove...@gmail.com> wrote:
>> I'll repost it here then :)
>>
>> "Here is what I had to do to get pig running with a different version of
>> Hadoop (in my case, the cloudera build but I'd try this as well):
>
>>
>> Build pig-withouthadoop.jar by running "ant jar-withouthadoop". Then, when
>> you run Pig, put the pig-withouthadoop.jar on your classpath as well as your
>> Hadoop jar. In my case, I found that scripts only worked if I additionally
>> registered the antlr jar manually:
>
> Thanks Jonathan! I will give it a shot.
>
>>
>> register /path/to/pig/build/ivy/lib/Pig/antlr-runtime-3.2.jar;"
>
> Is this a Windows command? Sorry, I have not used this before.
>
>>
>> 2011/5/26 Mohit Anchlia <mohitanch...@gmail.com>
>>
>>> For some reason I don't see that reply from Jonathan in my inbox. I'll
>>> try to Google it.
>>>
>>> What should be my next step in that case? I can't use pig then?
>>>
>>> On Thu, May 26, 2011 at 10:00 AM, Harsh J <ha...@cloudera.com> wrote:
>>> > I think Jonathan Coveney's reply on user@pig answered your question.
>>> > It's basically an issue of Hadoop version differences between the one
>>> > the Pig 0.8.1 release was bundled with and the newer Hadoop 0.20.203
>>> > release.
>>> >
>>> > On Thu, May 26, 2011 at 10:26 PM, Mohit Anchlia <mohitanch...@gmail.com> wrote:
>>> >> I sent this to the Apache Pig user mailing list but have gotten no
>>> >> response. Not sure if that list is still active.
>>> >>
>>> >> I thought I would post here in case someone is able to help me.
>>> >>
>>> >> I am in the process of installing and learning Pig. I have a Hadoop
>>> >> cluster, and when I try to run Pig in mapreduce mode it errors out:
>>> >>
>>> >> Hadoop version is hadoop-0.20.203.0 and pig version is pig-0.8.1
>>> >>
>>> >> Error before Pig is launched
>>> >> ----------------------------
>>> >> ERROR 2999: Unexpected internal error. Failed to create DataStorage
>>> >>
>>> >> java.lang.RuntimeException: Failed to create DataStorage
>>> >>       at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:75)
>>> >>       at org.apache.pig.backend.hadoop.datastorage.HDataStorage.<init>(HDataStorage.java:58)
>>> >>       at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:214)
>>> >>       at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:134)
>>> >>       at org.apache.pig.impl.PigContext.connect(PigContext.java:183)
>>> >>       at org.apache.pig.PigServer.<init>(PigServer.java:226)
>>> >>       at org.apache.pig.PigServer.<init>(PigServer.java:215)
>>> >>       at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:55)
>>> >>       at org.apache.pig.Main.run(Main.java:452)
>>> >>       at org.apache.pig.Main.main(Main.java:107)
>>> >> Caused by: java.io.IOException: Call to dsdb1/172.18.60.96:54310 failed on local exception: java.io.EOFException
>>> >>       at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
>>> >>       at org.apache.hadoop.ipc.Client.call(Client.java:743)
>>> >>       at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
>>> >>       at $Proxy0.getProtocolVersion(Unknown Source)
>>> >>       at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
>>> >>       at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
>>> >>       at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
>>> >>       at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
>>> >>       at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
>>> >>       at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
>>> >>       at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>>> >>       at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
>>> >>       at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
>>> >>       at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
>>> >>       at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:72)
>>> >>       ... 9 more
>>> >> Caused by: java.io.EOFException
>>> >>       at java.io.DataInputStream.readInt(DataInputStream.java:375)
>>> >>       at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:501)
>>> >>       at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
>>> >>
>>> >
>>> >
>>> >
>>> > --
>>> > Harsh J
>>> >
>>>
>>
>
