Thanks Chris!

The issue was that even though I set jdk-7u21 as my default,
hadoop-config.sh checked for /usr/java/jdk-1.6* first, even though Hadoop
was compiled with 1.7.

Is there any way to generate a hadoop-config.sh that reflects the minor
version Hadoop was built with, so that in my case it would check for
/usr/java/jdk-1.7* instead?  I appreciate the help!
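
To make the question concrete, here is roughly the probe order I'd want.
This is only a sketch: the globs and the idea that the probe lives in
hadoop-config.sh are my assumptions, not the stock script.

    # Hypothetical JAVA_HOME probe for hadoop-config.sh: try the JDK line
    # Hadoop was built against (1.7 here) before any 1.6 install, then fall
    # back to the distribution's default symlink.
    if [ -z "$JAVA_HOME" ]; then
      for candidate in /usr/java/jdk1.7* /usr/java/default /usr/java/jdk-1.6*; do
        if [ -x "$candidate/bin/java" ]; then
          export JAVA_HOME="$candidate"
          break
        fi
      done
    fi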


On Thu, Jul 17, 2014 at 11:11 PM, Chris Mawata <chris.maw...@gmail.com>
wrote:

> Yet another place to check -- in the hadoop-env.sh file there is also a
> JAVA_HOME setting.
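> For example, near the top of hadoop-env.sh (the path below is only an
> illustration; point it at whichever JDK Hadoop should use):
>
>   export JAVA_HOME=/usr/java/jdk1.7.0_21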
> Chris
> On Jul 17, 2014 9:46 PM, "andrew touchet" <adt...@latech.edu> wrote:
>
>> Hi Fireflyhoo,
>>
>> Below I follow the symbolic links for jdk-7u21. These links are changed
>> accordingly as I switch between versions. Also, I have 8 datanodes and 2
>> other servers that are all able to mount /hdfs, so it is only this server
>> that has the issue.
>>
>> $ java -version
>> java version "1.7.0_21"
>> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
>> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>>
>> java
>> $ ls -l `which java`
>> lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java -> /usr/java/default/bin/java
>> $ ls -l /usr/java/default
>> lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default -> /usr/java/latest
>> $ ls -l /usr/java/latest
>> lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest -> /usr/java/jdk1.7.0_21
>>
>> jar
>> $ ls -l `which jar`
>> lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar -> /etc/alternatives/jar
>> $ ls -l /etc/alternatives/jar
>> lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar -> /usr/java/jdk1.7.0_21/bin/jar
>>
>> javac
>> $ ls -l `which javac`
>> lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac -> /etc/alternatives/javac
>> $ ls -l /etc/alternatives/javac
>> lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac -> /usr/java/jdk1.7.0_21/bin/javac
>>
>> Now that I've tried versions from both Java 6 and 7, I'm really not sure
>> what is causing this issue.
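>>
>> For reference, this is how I've been checking which java binary the hadoop
>> wrapper actually execs (plain strace, nothing Hadoop-specific):
>>
>>   strace -f -e trace=execve hadoop version 2>&1 | grep 'bin/java'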
>>
>> On Thu, Jul 17, 2014 at 8:21 PM, firefly...@gmail.com <
>> firefly...@gmail.com> wrote:
>>
>>>  I think you should first confirm your local Java version.
>>> Some Linux distributions come with a pre-installed Java, and that version
>>> can be very old.
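>>>
>>> For example, you can check what "java" really resolves to on the box:
>>>
>>>   readlink -f "$(which java)"
>>>   java -version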
>>>
>>> ------------------------------
>>> firefly...@gmail.com
>>>
>>>
>>> *From:* andrew touchet <adt...@latech.edu>
>>> *Date:* 2014-07-18 09:06
>>> *To:* user <user@hadoop.apache.org>
>>> *Subject:* Re: HDFS input/output error - fuse mount
>>> Hi Chris,
>>>
>>> I tried to mount /hdfs with each of the Java versions below, but there was
>>> no change in the output:
>>> jre-7u21
>>> jdk-7u21
>>> jdk-7u55
>>> jdk1.6.0_31
>>> jdk1.6.0_45
>>>
>>> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <chris.maw...@gmail.com>
>>> wrote:
>>>
>>>> Version 51 is Java 7.
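>>>> If you want to confirm that from a class file itself, bytes 7-8 of the
>>>> file hold the major version (the jar path below is a guess for your
>>>> install; adjust it):
>>>>
>>>>   unzip -p /usr/lib/hadoop/hadoop-core-*.jar org/apache/hadoop/fs/FsShell.class | head -c 8 | od -An -tu1
>>>>   # the last two bytes read 0 50 for Java 6, 0 51 for Java 7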
>>>> Chris
>>>> On Jul 17, 2014 7:50 PM, "andrew touchet" <adt...@latech.edu> wrote:
>>>>
>>>>> Hello,
>>>>>
>>>>> Hadoop package installed:
>>>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>>>
>>>>> Operating System:
>>>>> CentOS release 5.8 (Final)
>>>>>
>>>>> I am mounting HDFS from my namenode onto another node with fuse.  After
>>>>> mounting at /hdfs, any attempt to 'ls', 'cd', or use 'hadoop fs' leads to
>>>>> the output below.
>>>>>
>>>>>
>>>>> $ls /hdfs
>>>>> ls: /hdfs: Input/output error
>>>>> $hadoop fs -ls
>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.
>>>>>
>>>>>
>>>>> I also mounted /hdfs manually in debug mode and then tried to access it
>>>>> from a different terminal. This is the output. The namenode is *glados*.
>>>>> The server where /hdfs is being mounted is *glados2*.
>>>>>
>>>>>
>>>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>>>
>>>>> fuse-dfs ignoring option allow_other
>>>>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>>>>> fuse-dfs ignoring option -d
>>>>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>>>>> INIT: 7.10
>>>>> flags=0x0000000b
>>>>> max_readahead=0x00020000
>>>>> INFO fuse_init.c:115 Mounting glados:9000
>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>>>> INIT: 7.8
>>>>> flags=0x00000001
>>>>> max_readahead=0x00020000
>>>>> max_write=0x00020000
>>>>> unique: 1, error: 0 (Success), outsize: 40
>>>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>> Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
>>>>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>>>> unique: 2, error: -5 (Input/output error), outsize: 16
>>>>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>>
>>>>> I adopted this system after it was already set up, so I do not know
>>>>> which Java version was used during the install. Currently I'm using:
>>>>>
>>>>> $java -version
>>>>> java version "1.6.0_45"
>>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>>
>>>>> Is my Java version really the cause of this issue?  What is the correct
>>>>> Java version to use with this version of Hadoop?  I have also tried
>>>>> 1.6.0_31, but saw no change.
>>>>>
>>>>> If java isn't my issue, then what is?
>>>>>
>>>>> Best regards,
>>>>>
>>>>> Andrew
>>>
>>
