You can edit:
${HADOOP_HOME}/etc/hadoop/hadoop-env.sh

and comment out the capacity-scheduler loop like this:
#for f in $HADOOP_HOME/contrib/capacity-scheduler/*.jar; do
#  if [ "$HADOOP_CLASSPATH" ]; then
#    export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$f
#  else
#    export HADOOP_CLASSPATH=$f
#  fi
#done
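If you would rather keep the loop than comment it out, a guarded variant should also work (a sketch, not tested on your cluster; the `[ -e ]` check is the only change from the stock loop):

```shell
# Sketch: same loop as above, but skip the iteration when the glob
# matched no files. Without the check, a shell glob that matches
# nothing is passed through literally, so HADOOP_CLASSPATH ends up
# containing the pattern ".../capacity-scheduler/*.jar" itself.
for f in "$HADOOP_HOME"/contrib/capacity-scheduler/*.jar; do
  [ -e "$f" ] || continue   # glob matched nothing; do not add the literal pattern
  if [ -n "$HADOOP_CLASSPATH" ]; then
    export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$f"
  else
    export HADOOP_CLASSPATH="$f"
  fi
done
```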


On 2015-05-04 15:42, hongbin ma wrote:
https://github.com/KylinOLAP/Kylin/blob/0.7.1/bin/kylin.sh#L53

On Mon, May 4, 2015 at 3:41 PM, hongbin ma <[email protected]> wrote:

When Kylin starts, it will include everything in HBASE_CLASSPATH and
Hive's classpath. Does your HBASE_CLASSPATH include
/data1/app/hadoop-2.6.0/contrib?
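One way to check is to split the colon-separated classpath into lines and grep for contrib entries. In practice you would pipe in the output of `hbase classpath` (or `hadoop classpath`); the hard-coded string below is only a stand-in sample, since I can't see your environment:

```shell
# Split a colon-separated classpath into one entry per line and look
# for contrib entries. The sample string stands in for the output of
# `hbase classpath` on a real cluster.
cp_sample="/data1/app/hbase/lib/hbase-client.jar:/data1/app/hadoop-2.6.0/contrib/capacity-scheduler/*.jar"
printf '%s\n' "$cp_sample" | tr ':' '\n' | grep contrib
```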

On Mon, May 4, 2015 at 3:41 PM, hongbin ma <[email protected]> wrote:

forward to mail list for discussion


On Mon, May 4, 2015 at 3:39 PM, hongbin ma <[email protected]> wrote:

WARNING: Failed to process JAR
[jar:file:/data1/app/hadoop-2.6.0/contrib/capacity-scheduler/*.jar!/]
for TLD files
java.io.FileNotFoundException:
/data1/app/hadoop-2.6.0/contrib/capacity-scheduler/*.jar (No such file or directory)
    at java.util.zip.ZipFile.open(Native Method)
    at java.util.zip.ZipFile.<init>(ZipFile.java:215)
    at java.util.zip.ZipFile.<init>(ZipFile.java:145)
    at java.util.jar.JarFile.<init>(JarFile.java:153)
    at java.util.jar.JarFile.<init>(JarFile.java:90)
    at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
    at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
    at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:99)
    at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
    at sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:89)
    at org.apache.tomcat.util.scan.FileUrlJar.<init>(FileUrlJar.java:41)
    at org.apache.tomcat.util.scan.JarFactory.newInstance(JarFactory.java:34)
    at org.apache.catalina.startup.TldConfig.tldScanJar(TldConfig.java:485)
    at org.apache.catalina.startup.TldConfig.access$100(TldConfig.java:61)
    at org.apache.catalina.startup.TldConfig$TldJarScannerCallback.scan(TldConfig.java:296)
    at org.apache.tomcat.util.scan.StandardJarScanner.process(StandardJarScanner.java:258)
    at org.apache.tomcat.util.scan.StandardJarScanner.scan(StandardJarScanner.java:220)
    at org.apache.catalina.startup.TldConfig.execute(TldConfig.java:269)
    at org.apache.catalina.startup.TldConfig.lifecycleEvent(TldConfig.java:565)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
    at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
    at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5412)
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:649)
    at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1081)
    at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1877)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)
Hadoop 2.x should not ship a contrib folder, so why is this folder still being referenced?
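(A quick demonstration of why the literal pattern appears even though the directory does not exist: with default shell globbing, a pattern that matches nothing is passed through unexpanded, and that literal string then reaches Tomcat's TLD scanner as a "jar" path.)

```shell
# Demonstrate glob fall-through: in a directory containing no .jar
# files, the pattern expands to itself rather than to an empty list.
d=$(mktemp -d)     # empty temp directory, so *.jar matches nothing
echo "$d"/*.jar    # prints the pattern literally
```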



--
Regards,

*Bin Mahone | 马洪宾*
Apache Kylin: http://kylin.io
Github: https://github.com/binmahone





