Sonal Goyal wrote:
Hi Adarsh,

I think your mapred.cache.files property has an extra space at the end. Try
removing that and let us know how it goes.
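For example, the value should end immediately before the closing tag, with no
trailing whitespace. Something like the following (the host/port below are taken
from the URISyntaxException further down in this thread, and the #jcuda.jar
symlink fragment from Sanjay's suggestion; your actual value may differ):

<property>
 <name>mapred.cache.files</name>
 <value>hdfs://192.168.0.131:54310/jcuda.jar#jcuda.jar</value>
</property>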
Thanks and Regards,
Sonal
Hadoop ETL and Data Integration <https://github.com/sonalgoyal/hiho>
Nube Technologies <http://www.nubetech.co>
<http://in.linkedin.com/in/sonalgoyal>



Thanks a lot Sonal, but it doesn't succeed.
If possible, please tell me the proper steps that need to be followed after configuring the Hadoop cluster.

What I don't understand is that the simple standalone commands succeed:

[root@cuda1 hadoop-0.20.2]# javac EnumDevices.java
[root@cuda1 hadoop-0.20.2]# java EnumDevices
Total number of devices: 1
Name: Tesla C1060
Version: 1.3
Clock rate: 1296000 MHz
Threads per block: 512


but in the map-reduce job it fails:

11/02/28 18:42:47 INFO mapred.JobClient: Task Id : attempt_201102281834_0001_m_000001_2, Status : FAILED
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
       at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:115)
       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:569)
       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
       at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: java.lang.reflect.InvocationTargetException
       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
       at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
       at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
       at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
       at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:113)
       ... 3 more
Caused by: java.lang.UnsatisfiedLinkError: no jcuda in java.library.path
       at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1734)
       at java.lang.Runtime.loadLibrary0(Runtime.java:823)
       at java.lang.System.loadLibrary(System.java:1028)
       at jcuda.driver.CUDADriver.<clinit>(CUDADriver.java:909)
       at jcuda.CUDA.init(CUDA.java:62)
       at jcuda.CUDA.<init>(CUDA.java:42)
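As a quick sanity check, the java.library.path a JVM actually uses can be
printed. A minimal sketch is below; the same System.getProperty call could also
go inside the map() method of org.myorg.WordCount to see what the child JVMs
receive from mapred.child.java.opts:

public class LibPathCheck {
    public static void main(String[] args) {
        // Directories this JVM searches for native libraries such as libjcuda.
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
    }
}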



Thanks & best Regards,

Adarsh Sharma


On Mon, Feb 28, 2011 at 5:06 PM, Adarsh Sharma <adarsh.sha...@orkash.com> wrote:

Thanks Sanjay, it seems I found the root cause.

But now I get the following error:

[hadoop@ws37-mah-lin hadoop-0.20.2]$ bin/hadoop jar wordcount1.jar org.myorg.WordCount /user/hadoop/gutenberg /user/hadoop/output1
Exception in specified URI's java.net.URISyntaxException: Illegal character in path at index 36: hdfs://192.168.0.131:54310/jcuda.jar
      at java.net.URI$Parser.fail(URI.java:2809)
      at java.net.URI$Parser.checkChars(URI.java:2982)
      at java.net.URI$Parser.parseHierarchical(URI.java:3066)
      at java.net.URI$Parser.parse(URI.java:3014)
      at java.net.URI.<init>(URI.java:578)
      at org.apache.hadoop.util.StringUtils.stringToURI(StringUtils.java:204)
      at org.apache.hadoop.filecache.DistributedCache.getCacheFiles(DistributedCache.java:593)
      at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:638)
      at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:761)
      at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
      at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
      at org.myorg.WordCount.main(WordCount.java:59)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      at java.lang.reflect.Method.invoke(Method.java:597)
      at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

Exception in thread "main" java.lang.NullPointerException
      at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:176)
      at
org.apache.hadoop.filecache.DistributedCache.getTimestamp(DistributedCache.java:506)
      at
org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:640)
      at
org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:761)
      at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
      at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
      at org.myorg.WordCount.main(WordCount.java:59)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      at java.lang.reflect.Method.invoke(Method.java:597)
      at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
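Index 36 in the URISyntaxException above is the position immediately after
"jcuda.jar", which points to stray whitespace at the end of the configured
value. A minimal standalone sketch (not part of the job code) that reproduces
the same parse error:

import java.net.URI;
import java.net.URISyntaxException;

public class UriSpaceCheck {
    public static void main(String[] args) throws URISyntaxException {
        String withSpace = "hdfs://192.168.0.131:54310/jcuda.jar ";  // note the trailing space
        String trimmed = withSpace.trim();

        new URI(trimmed);  // parses fine
        try {
            new URI(withSpace);
        } catch (URISyntaxException e) {
            // "Illegal character in path at index 36: ..." -- the same message as above
            System.out.println(e.getMessage());
        }
    }
}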

Please check my attached mapred-site.xml


Thanks & best regards,

Adarsh Sharma



Kaluskar, Sanjay wrote:

You will probably have to use the distributed cache (distcache) to distribute
your jar to all the nodes too. Read the distcache documentation; then on each
node you can add the new jar to java.library.path through
mapred.child.java.opts.

You need to do something like the following in mapred-site.xml, where
fs-uri is the URI of the file system (something like
host.mycompany.com:54310).

<property>
 <name>mapred.cache.files</name>
 <value>hdfs://fs-uri/jcuda/jcuda.jar#jcuda.jar </value>
</property>
<property>
 <name>mapred.create.symlink</name>
 <value>yes</value>
</property>
<property>
 <name>mapred.child.java.opts</name>
 <value>-Djava.library.path=jcuda.jar</value>
</property>
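Equivalently, the same settings can be made programmatically in the job driver
instead of mapred-site.xml. A minimal sketch (the hdfs://fs-uri/... path is a
placeholder, as above):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.mapreduce.Job;

public class CacheSetup {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Ship jcuda.jar to every task node and symlink it as "jcuda.jar" in the
        // task's working directory (same effect as mapred.cache.files plus
        // mapred.create.symlink above). Make sure the URI has no trailing spaces.
        DistributedCache.addCacheFile(new URI("hdfs://fs-uri/jcuda/jcuda.jar#jcuda.jar"), conf);
        DistributedCache.createSymlink(conf);

        // Same child JVM option as the XML above.
        conf.set("mapred.child.java.opts", "-Djava.library.path=jcuda.jar");

        Job job = new Job(conf, "wordcount");
        // ... set mapper/reducer and input/output paths, then job.waitForCompletion(true)
    }
}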


-----Original Message-----
From: Adarsh Sharma [mailto:adarsh.sha...@orkash.com]
Sent: 28 February 2011 16:03
To: common-user@hadoop.apache.org
Subject: Setting java.library.path for map-reduce job

Dear all,

I want to set some extra jars in java.library.path, used while running a
map-reduce program in the Hadoop cluster.

I get the exception "no jcuda in java.library.path" in each map task.

I run my map-reduce code with the commands below:

javac -classpath /home/hadoop/project/hadoop-0.20.2/hadoop-0.20.2-core.jar:/home/hadoop/project/hadoop-0.20.2/jcuda_1.1_linux64/jcuda.jar:/home/hadoop/project/hadoop-0.20.2/lib/commons-cli-1.2.jar \
    -d wordcount_classes1/ WordCount.java

jar -cvf wordcount1.jar -C wordcount_classes1/ .

bin/hadoop jar wordcount1.jar org.myorg.WordCount /user/hadoop/gutenberg /user/hadoop/output1


Please guide me on how to achieve this.



Thanks & best Regards,

Adarsh Sharma



