Environment variables hang off of the session context and are specific to
both the user profile and the user's shell preferences. If your driver loads
in kernel mode, it cannot depend on environment variables.

This will also be a problem for the other environment variables, like
HADOOP_HOME.
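For illustration, this is roughly the check libhdfs performs when you call
hdfsConnect() -- a paraphrased sketch, not the exact CDH3 source, and the
helper name is made up:

#include <stdio.h>
#include <stdlib.h>

/* Sketch: libhdfs has to find CLASSPATH in the process environment
 * before it can create the JVM. */
static const char *get_hadoop_classpath(void)
{
    /* getenv() reads the calling process's environment block --
     * something only a user-mode process with a session has.
     * There is no kernel-mode equivalent. */
    const char *cp = getenv("CLASSPATH");
    if (cp == NULL) {
        fprintf(stderr, "Environment variable CLASSPATH not set!\n");
        return NULL;  /* hdfsConnect then fails and returns a NULL fs */
    }
    return cp;
}

That is exactly the failure in the log below: no session environment at the
point of the call, so no CLASSPATH, so no JVM.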

Instead of using Java directly in kernel mode, I suggest splitting the problem:
1. An FS abstraction for the kernel
   a. Like the NFS filesystem kernel driver implementation, for example -- a
remote-mount FS.
   b. Use a C implementation of the protocol.
      I. To avoid issues, use Hadoop 2.0 for protobufs, since they yield a
versioned protocol that avoids hangs and dumps when the protocol changes.
      II. OR push most of your implementation into a proxy service:
          a. Surface NFS directly, and just use the NFS kernel driver.
          b. Surface your own protocol to be consumed in the kernel-mode
driver (see the sketch after this list).
2. Start HDFS elsewhere, as an independent service in user mode like cups,
httpd, or xinetd.
    a. It will have a session and the ability to configure env vars.
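To make 1.b.II and 2 concrete, the proxy could be a small libhdfs-based
daemon: it runs as a normal user-mode service, so CLASSPATH and friends
resolve, and the kernel-mode driver only has to speak a trivial socket
protocol to it. A minimal sketch -- the socket path and the
path-in/bytes-out wire format are invented for illustration:

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>
#include <sys/un.h>
#include "hdfs.h"  /* libhdfs works here because this is a user-mode
                      process whose session defines CLASSPATH etc. */

int main(void)
{
    hdfsFS fs = hdfsConnect("default", 0);  /* default fs from the config */
    if (fs == NULL) {
        fprintf(stderr, "hdfsConnect failed -- check CLASSPATH\n");
        return 1;
    }

    /* Listen on a Unix socket; the kernel driver (or a FUSE client)
     * sends a file path and gets the file contents back. */
    int srv = socket(AF_UNIX, SOCK_STREAM, 0);
    struct sockaddr_un addr = { .sun_family = AF_UNIX };
    strncpy(addr.sun_path, "/var/run/hdfs-proxy.sock",  /* hypothetical */
            sizeof(addr.sun_path) - 1);
    unlink(addr.sun_path);
    bind(srv, (struct sockaddr *)&addr, sizeof(addr));
    listen(srv, 8);

    for (;;) {
        int c = accept(srv, NULL, NULL);
        char path[4096];
        ssize_t n = read(c, path, sizeof(path) - 1);  /* request = a path */
        if (n > 0) {
            path[n] = '\0';
            hdfsFile f = hdfsOpenFile(fs, path, O_RDONLY, 0, 0, 0);
            if (f != NULL) {
                char buf[65536];
                tSize r;
                while ((r = hdfsRead(fs, f, buf, sizeof(buf))) > 0)
                    write(c, buf, (size_t)r);  /* response = file bytes */
                hdfsCloseFile(fs, f);
            }
        }
        close(c);
    }
    /* not reached */
    hdfsDisconnect(fs);
    return 0;
}

The same daemon could instead surface NFS (option a above), so the stock NFS
kernel client does all the kernel-side work for you.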


Not sure if that exactly answers the question, but I hope it was helpful.

John

Sent from my Windows Phone
________________________________
From: harryxiyou <harryxi...@gmail.com>
Sent: 2/9/2013 5:35 AM
To: hdfs-dev@hadoop.apache.org
Cc: Kang Hua <kanghua...@gmail.com>; clou...@googlegroups.com
Subject: [Hadoop] Environment variable CLASSPATH not set!

Hi all,

We are developing an HDFS-based file system, HLFS
(http://code.google.com/p/cloudxy/wiki/WHAT_IS_CLOUDXY). We have now
developed an HLFS driver for Libvirt (http://libvirt.org/). But when I
boot a VM from a base Linux OS, which had been installed onto our HLFS
block device beforehand, it (HDFS or the JVM) says I have not set
CLASSPATH, like the following:

[...]
uri:hdfs:///tmp/testenv/testfs,head:hdfs,dir:/tmp/testenv,fsname:testfs,hostname:default,port:0,user:kanghua
Environment variable CLASSPATH not set!
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
fs is null, hdfsConnect error!
[...]

Actually, I have set CLASSPATH in ~/.bashrc, as shown below. I have
installed CDH3u2 for development, and I can run other HDFS jobs
successfully.

$ cat /home/jiawei/.bashrc
[...]
export HLFS_HOME=/home/jiawei/workshop3/hlfs
export LOG_HOME=$HLFS_HOME/3part/log
export SNAPPY_HOME=$HLFS_HOME/3part/snappy
export HADOOP_HOME=$HLFS_HOME/3part/hadoop
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export PATH=/usr/bin/:/usr/local/bin/:/bin/:/usr/sbin/:/sbin/:$JAVA_HOME/bin/
#export LD_LIBRARY_PATH=$JAVAHOME/lib
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/i386/server/:$HADOOP_HOME/lib32/:$LOG_HOME/lib32/:$SNAPPY_HOME/lib32/:$HLFS_HOME/output/lib32/:/usr/lib/
export PKG_CONFIG_PATH=/usr/lib/pkgconfig/:/usr/share/pkgconfig/
export CFLAGS="-L/usr/lib -L/lib -L/usr/lib64"
export CXXFLAGS="-L/usr/lib -L/lib"
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/htmlconverter.jar:$JAVA_HOME/lib/jconsole.jar:$JAVA_HOME/lib/jconsole.jar:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/jre/lib/charsets.jar:$JAVA_HOME/jre/lib/deploy.jar:$JAVA_HOME/jre/lib/javaws.jar:$JAVA_HOME/jre/lib/jce.jar:$JAVA_HOME/jre/lib/jsse.jar:$JAVA_HOME/jre/lib/management-agent.jar:$JAVA_HOME/jre/lib/plugin.jar:$JAVA_HOME/jre/lib/resources.jar:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/jre/lib/:$JAVA_HOME/lib/:/usr/lib/hadoop-0.20/conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/usr/lib/hadoop-0.20:/usr/lib/hadoop-0.20/hadoop-core-0.20.2-cdh3u2.jar:/usr/lib/hadoop-0.20/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop-0.20/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop-0.20/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop-0.20/lib/commons-cli-1.2.jar:/usr/lib/hadoop-0.20/lib/commons-codec-1.4.jar:/usr/lib/hadoop-0.20/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-el-1.0.jar:/usr/lib/hadoop-0.20/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop-0.20/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-net-1.4.1.jar:/usr/lib/hadoop-0.20/lib/core-3.1.1.jar:/usr/lib/hadoop-0.20/lib/hadoop-fairscheduler-0.20.2-cdh3u2.jar:/usr/lib/hadoop-0.20/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop-0.20/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop-0.20/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop-0.20/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop-0.20/lib/jets3t-0.6.1.jar:/usr/lib/hadoop-0.20/lib/jetty-6.1.26.cloudera.1.jar:/usr/lib/hadoop-0.20/lib/jetty-servlet-tester-6.1.26.cloudera.1.jar:/usr/lib/hadoop-0.20/lib/jetty-util-6.1.26.cloudera.1.jar:/usr/lib/hadoop-0.20/lib/jsch-0.1.42.jar:/usr/lib/hadoop-0.20/lib/junit-4.5.jar:/usr/lib/hadoop-0.20/lib/kfs-0.2.2.jar:/usr/lib/hadoop-0.20/lib/log4j-1.2.15.jar:/usr/lib/hadoop-0.20/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop-0.20/lib/oro-2.0.8.jar:/usr/lib/hadoop-0.20/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop-0.20/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop-0.20/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop-0.20/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop-0.20/lib/xmlenc-0.52.jar:/usr/lib/hadoop-0.20/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop-0.20/lib/jsp-2.1/jsp-api-2.1.jar

I have reported this matter as an issue; see
http://code.google.com/p/cloudxy/issues/detail?id=37 for details.

Could anyone give me some suggestions?
Thanks a lot in advance ;-)



--
Thanks
Harry Wei

