Can you try this? Pick a class like WordCount from your package and
execute this command:

javap -classpath <path to your jar> -verbose org.myorg.WordCount | grep
version
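
(This prints the class file version that was stamped in at compile time,
i.e. which compiler produced the class - it is independent of the JVM you
run Hadoop with.)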

For example, here's what I get for my class:

$ javap -verbose WCMapper | grep version
  minor version: 0
  major version: 50
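
For reference, major version 50 corresponds to Java 6 and 51 to Java 7, so
a 51 there means the class was produced by a Java 7 compiler. If you want
to check every class inside the jar in one go, here is a rough sketch
(assuming a POSIX shell and that the jar is called my.jar, as in your
command below):

for c in $(jar tf my.jar | grep '\.class$' \
           | sed -e 's|/|.|g' -e 's|\.class$||'); do
  javap -classpath my.jar -verbose "$c" | grep 'major version'
done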

Please paste the output of this - we can verify what the problem is.
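
If the classes do show major version 51, rebuilding them for a Java 6
target should fix it. A minimal sketch, assuming your sources live under
src/org/myorg/, $HADOOP_HOME points at the Hadoop 1.0.4 install (so
hadoop-core-1.0.4.jar sits there), and you are compiling with a JDK 7
javac:

mkdir -p classes
javac -source 1.6 -target 1.6 \
      -classpath $HADOOP_HOME/hadoop-core-1.0.4.jar \
      -d classes \
      src/org/myorg/*.java
jar cvf my.jar -C classes .

(javac will warn that the bootclasspath is not set for -source 1.6; for
this purpose that warning can be ignored.) Compiling directly with a JDK 6
javac works just as well, of course.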

Thanks
Hemanth


On Sat, Feb 23, 2013 at 4:45 PM, Fatih Haltas <fatih.hal...@nyu.edu> wrote:

> Hi again,
>
> Thanks for your help, but now I am struggling with the same problem on
> another machine. As with the previous problem, I just downgraded the Java
> version to Java 6, but this time that did not solve the problem.
>
> These are the outputs that may explain the situation:
>
> ---------------------------------------------------------------------------------------------------------------------------------------------
> 1. I could not run my own code, so to check the system I just tried to run
> the basic WordCount example without any modification except the package info.
> **************************************************
> COMMAND EXECUTED: hadoop jar my.jar org.myorg.WordCount NetFlow NetFlow.out
> Warning: $HADOOP_HOME is deprecated.
>
> Exception in thread "main" java.lang.UnsupportedClassVersionError:
> org/myorg/WordCount : Unsupported major.minor version 51.0
>         at java.lang.ClassLoader.defineClass1(Native Method)
>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:266)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>
> **************************************************************************************
> 2. Java version:
> ********************************
> COMMAND EXECUTED: java -version
> java version "1.6.0_24"
> OpenJDK Runtime Environment (IcedTea6 1.11.6)
> (rhel-1.33.1.11.6.el5_9-x86_64)
> OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
> **********************************
> 3. JAVA_HOME variable:
> **********************************
> COMMAND EXECUTED: echo $JAVA_HOME
> /usr/lib/jvm/jre-1.6.0-openjdk.x86_64
> ********************************************
> 4. HADOOP version:
> *******************************************
> COMMAND EXECUTED: hadoop version
> Warning: $HADOOP_HOME is deprecated.
>
> Hadoop 1.0.4
> Subversion
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1393290
> Compiled by hortonfo on Wed Oct  3 05:13:58 UTC 2012
> From source with checksum fe2baea87c4c81a2c505767f3f9b71f4
> ********************************************************
>
> Are these (the Hadoop version and the Java version) still incompatible
> with each other?
>
>
> Thank you very much.
>
>
> On Tue, Feb 19, 2013 at 10:26 PM, Fatih Haltas <fatih.hal...@nyu.edu> wrote:
>
>> Thank you all very much
>>
>> On Tuesday, 19 February 2013, Harsh J wrote:
>>
>>> Oops. I just noticed Hemanth has been answering on a dupe thread as
>>> well. Let's drop this thread and carry on there :)
>>>
>>> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
>>> > Hi,
>>> >
>>> > The new error usually happens if you compile using Java 7 and try to
>>> > run with Java 6 (for example). That is, the binary artifact was
>>> > produced for a newer runtime than the one it is being run on.
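>>> >
>>> > A quick way to confirm that (assuming the jar was built on this same
>>> > box) is to compare the compiler with the runtime:
>>> >
>>> > $ javac -version
>>> > $ java -version
>>> >
>>> > If javac reports 1.7.x while the job runs on a 1.6 JRE, you get exactly
>>> > this error: class file major version 51 needs a Java 7 runtime.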
>>> >
>>> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fatih.hal...@nyu.edu> wrote:
>>> >> Thank you very much Harsh,
>>> >>
>>> >> Now, as I promised earlier I am much obliged to you.
>>> >>
>>> >> But now I solved that problem by just changing the directories and
>>> >> then again creating a jar file of org, but I am getting this error:
>>> >>
>>> >> 1.) What I got
>>> >>
>>> ------------------------------------------------------------------------------
>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
>>> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
>>> >> Warning: $HADOOP_HOME is deprecated.
>>> >>
>>> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
>>> >>         at java.lang.ClassLoader.defineClass1(Native Method)
>>> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>> >>         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>> >>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>> >>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>> >>         at java.security.AccessController.doPrivileged(Native Method)
>>> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>> >>         at java.lang.Class.forName0(Native Method)
>>> >>         at java.lang.Class.forName(Class.java:266)
>>> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>> >>
>>> >> 2.) How I create my jar
>>> >>
>>> -------------------------------------------------------------------------------------
>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar org
>>> >> added manifest
>>> >> adding: org/(in = 0) (out= 0)(stored 0%)
>>> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>>> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>>> >> 690)(deflated 58%)
>>> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
>>> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>>> >> 823)(deflated 56%)
>>> >>
>>> >> 3.) Content of my jar file
>>> >>
>>> ---------------------------------------------------------------------------------------
>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>>> >> META-INF/
>>> >> META-INF/MANIFEST.MF
>>> >> org/
>>> >> org/myorg/
>>> >> org/myorg/MapReduce$FlowPortReducer.class
>>> >> org/myorg/MapReduce.class
>>> >> org/myorg/MapReduce$FlowPortMapper.class
>>> >>
>>> -----------------------------------------------------------------------------------------
>>> >>
>>> >>
>>> >> Thank you very much.
>>> >>
>>> >>
>>> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>>> >>>
>>> >>> Your point (4) explains the problem. The jar packed structure should
>>> >>> look like the below, and not how it is presently (one extra top level
>>> >>> dir is present):
>>> >>>
>>> >>> META-INF/
>>> >>> META-INF/MANIFEST.MF
>>> >>> org/
>>> >>> org/myorg/
>>> >>> org/myorg/WordCount.class
>>> >>> org/myorg/WordCount$TokenizerMapper.class
>>> >>> org/myorg/WordCount$IntSumReducer.class
>>>
>>> --
>>> Harsh J
>>>
>>
>
