Hi, 

Can someone give me a pointer on the question below? 


Thanks in advance.

Regards,
Salman. 


Salman Toor, PhD
salman.t...@it.uu.se



On Nov 3, 2013, at 11:31 PM, Salman Toor wrote:

> Hi, 
> 
> I am quite new to the Hadoop world. I was previously running the stable 
> hadoop-1.2.0 release on my small cluster and ran into some strange problems, 
> e.g. the local path to the mapper file was not copied to HDFS .... Everything 
> worked fine on a single-node setup, but on a multi-node setup the simple 
> word-count Python example did not work... I read on a blog that the problem 
> might be the version I was using, so I decided to switch and downloaded 
> Hadoop 2.2.0. This version ships YARN together with many new features that I 
> hope to learn about in the future. The simple Python word-count example now 
> works without any problem on the multi-node setup. 
> 
> Now I would like to compile my C++ code. Since the directory structure, 
> among other things, has changed, I have started to get the following 
> error: 
> 
> ----------------
> /usr/bin/ld: skipping incompatible 
> /home/sztoor/hadoop-2.2.0/lib/native/libhadooputils.a when searching 
> for -lhadooputils
> /usr/bin/ld: cannot find -lhadooputils
> 
> /usr/bin/ld: skipping incompatible 
> /home/sztoor/hadoop-2.2.0/lib/native/libhadooppipes.a when searching 
> for -lhadooppipes
> /usr/bin/ld: cannot find -lhadooppipes
> ------------------
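> 
> From what I have read, "skipping incompatible" from ld means the linker 
> found the library but it was built for a different architecture (for 
> example 32-bit libraries on a 64-bit machine), so it skips it and then 
> fails with "cannot find". If I understand correctly, one way to check what 
> the bundled static libraries were built for is something like this 
> (objdump -f prints the object format of each archive member): 
> 
>     objdump -f /home/sztoor/hadoop-2.2.0/lib/native/libhadooppipes.a | grep architecture
> 
> I have also read that the 2.2.0 release tarball may ship 32-bit native 
> libraries, in which case one apparently has to rebuild the native code 
> from source on a 64-bit machine. 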
> 
> I have managed to run the C++ example successfully with version 1.2.0 on a 
> single-node setup.
> 
> I have a 64-bit Ubuntu machine; previously I was using Linux-amd64-64.
> 
> In the new version, the "lib" and "include" directories are directly in the 
> hadoop-2.2.0 directory, and no build.xml is available... 
> 
> Can someone please give me an example of a makefile for version 2.2.0, 
> suggest which version I should go for, or point out any prerequisites I 
> should take care of before compiling my code? 
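> 
> In case it helps, this is roughly the makefile I am trying, adapted from my 
> old 1.2.0 setup by guessing the new layout (headers under 
> hadoop-2.2.0/include, native libraries under hadoop-2.2.0/lib/native). The 
> -lpthread and -lcrypto are guesses on my side, since the pipes library 
> seems to depend on them; please correct me if the paths or flags are wrong: 
> 
>     HADOOP_HOME = /home/sztoor/hadoop-2.2.0
>     CXX = g++
>     # headers from the 2.2.0 tarball layout
>     CXXFLAGS = -O2 -Wall -I$(HADOOP_HOME)/include
>     # native static libraries from the 2.2.0 tarball layout
>     LDFLAGS = -L$(HADOOP_HOME)/lib/native
>     LDLIBS = -lhadooppipes -lhadooputils -lpthread -lcrypto
> 
>     wordcount: wordcount.cpp
>             $(CXX) $(CXXFLAGS) $< -o $@ $(LDFLAGS) $(LDLIBS)
> 
> (The recipe line must be indented with a tab in the actual makefile.) 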
> 
> Thanks in advance. 
> 
> Regards,
> Salman. 
> 
> 
>  
> Salman Toor, PhD
> salman.t...@it.uu.se
> 
> 
> 
