Thanks Billy for your help.

Indeed, it was a configuration issue with Hadoop. We were able to resolve it,
and Apache Kylin is now working fine.


Regards,

Tarun

________________________________
From: Billy Liu <billy...@apache.org>
Sent: Thursday, December 8, 2016 7:48:40 PM
To: Samirul Haque
Cc: Tarun Vashisth; d...@kylin.incubator.apache.org; Abhishek Shishodia
Subject: Re: Issues in building cubes of Apache Kylin

This does not look like a Kylin issue; the stack trace is in Hadoop code. In my
limited experience, most Hadoop issues come from misconfiguration or
incompatible components. Could you try a packaged Hadoop distribution? Your
Hadoop administrator may also be able to help.

2016-12-08 20:44 GMT+08:00 Samirul Haque 
<saha...@adobe.com>>:
Somehow we got past the issue from the last mail, but now we are stuck at the point below:

#3 Step Name: Extract Fact Table Distinct Columns
Log error: no counters for job job_1481188794276_0003

The Hadoop logs show the following for this job:

log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /usr/local/hadoop/logs/userlogs/application_1481188794276_0003/container_1481188794276_0003_01_000001 (Is a directory)
        at java.io.FileOutputStream.open0(Native Method)
        at java.io.FileOutputStream.open(FileOutputStream.java:270)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
        at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
        at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
        at org.apache.hadoop.yarn.ContainerLogAppender.activateOptions(ContainerLogAppender.java:55)
        at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
        at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
        at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
        at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
        at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
        at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
        at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
        at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
        at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:64)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:285)
        at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:155)
        at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
        at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
        at org.apache.hadoop.service.AbstractService.<clinit>(AbstractService.java:43)
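As an aside, the "no counters for job" message usually just means the MapReduce job failed on the cluster side, so the real cause is in the YARN container logs. A small sketch for locating them (the job id is taken from the error above; fetching the logs assumes YARN log aggregation is enabled on your cluster):

```shell
# Derive the YARN application id from the MapReduce job id reported by Kylin
job_id="job_1481188794276_0003"
app_id="application_${job_id#job_}"
echo "$app_id"   # prints application_1481188794276_0003

# With log aggregation enabled, the container logs can then be fetched with:
#   yarn logs -applicationId "$app_id"
```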



From: Tarun Vashisth
Sent: Thursday, December 08, 2016 5:54 PM
To: Billy Liu <billy...@apache.org>; Samirul Haque <saha...@adobe.com>
Cc: d...@kylin.incubator.apache.org; Abhishek Shishodia <shish...@adobe.com>
Subject: Re: Issues in building cubes of Apache Kylin


Hi Billy,



echo $HIVE_CONF gives output: /usr/local/hive/conf

We are still getting the same error:



java.io.FileNotFoundException: File does not exist: hdfs://localhost:54310/app/hadoop/tmp/mapred/staging/hduser1254415952/.staging/job_local1254415952_0001/libjars/hive-metastore-1.2.1.jar
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1072)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1064)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1064)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:99)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
        at org.apache.kylin.engine.mr.common.AbstractHadoopJob.waitForCompletion(AbstractHadoopJob.java:149)
        at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:108)
        at org.apache.kylin.engine.mr.MRUtil.runMRJob(MRUtil.java:92)
        at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:120)
        at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:113)
        at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:57)
        at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:113)
        at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:136)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)



result code:2


Any pointers?



Regards,

Tarun

________________________________
From: Billy Liu <billy...@apache.org>
Sent: Thursday, December 8, 2016 12:14:57 PM
To: Samirul Haque
Cc: dev; d...@kylin.incubator.apache.org; Abhishek Shishodia; Tarun Vashisth
Subject: Re: Issues in building cubes of Apache Kylin

It seems you were not using a packaged Hadoop distribution such as HDP or CDH,
but stock Apache Hadoop, Apache Hive, and Apache HBase.

Could you try exporting HIVE_CONF to your Hive config path first?
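For example (a sketch only; the path /usr/local/hive/conf is the one that appears elsewhere in this thread and may differ on your machine):

```shell
# Point HIVE_CONF at the directory containing hive-site.xml,
# then restart Kylin so the build job picks it up
export HIVE_CONF=/usr/local/hive/conf
echo "$HIVE_CONF"   # prints /usr/local/hive/conf
```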


2016-12-08 13:53 GMT+08:00 Samirul Haque 
<saha...@adobe.com>>:
Hi Billy,

Thanks for your response.

The output of bin/find-hive-dependency.sh is:

KYLIN_HOME is set to /usr/local/kylin
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in 
[jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in 
jar:file:/usr/local/hive/lib/hive-common-2.1.0.jar!/hive-log4j2.properties 
Async: true
HIVE_CONF is set to: 
/home/hduser/bin:/home/hduser/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/usr/local/hadoop/bin:/usr/lib/jvm/java-8-openjdk-amd64/bin:/usr/local/hive/bin:/usr/local/kylin/bin:/usr/local/Hbase/bin:/usr/local/hadoop/bin:/usr/lib/jvm/java-8-openjdk-amd64/bin:/usr/local/hive/bin:/usr/local/kylin/bin:/usr/local/Hbase/bin:/usr/local/hadoop/bin:/usr/lib/jvm/java-8-openjdk-amd64/bin:/usr/local/hive/bin:/usr/local/kylin/bin:/usr/local/Hbase/bin:/usr/local/hadoop/bin:/usr/lib/jvm/java-8-openjdk-amd64/bin:/usr/local/hive/bin:/usr/local/kylin/bin:/usr/local/Hbase/bin:/usr/local/hadoop/bin:/usr/lib/jvm/java-8-openjdk-amd64/bin:/usr/local/hive/bin:/usr/local/kylin/bin:/usr/local/Hbase/bin:/usr/local/hadoop/bin:/usr/lib/jvm/java-8-openjdk-amd64/bin:/usr/local/hive/bin:/usr/local/kylin/bin:/usr/local/Hbase/bin:/usr/local/hive/conf,
 use it to locate hive configurations.
HCAT_HOME is set to: /usr/local/hive/hcatalog, use it to find hcatalog path:
hive dependency: 
/home/hduser/bin:/home/hduser/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/usr/local/hadoop/bin:/usr/lib/jvm/java-8-openjdk-amd64/bin:/usr/local/hive/bin:/usr/local/kylin/bin:/usr/local/Hbase/bin:/usr/local/hadoop/bin:/usr/lib/jvm/java-8-openjdk-amd64/bin:/usr/local/hive/bin:/usr/local/kylin/bin:/usr/local/Hbase/bin:/usr/local/hadoop/bin:/usr/lib/jvm/java-8-openjdk-amd64/bin:/usr/local/hive/bin:/usr/local/kylin/bin:/usr/local/Hbase/bin:/usr/local/hadoop/bin:/usr/lib/jvm/java-8-openjdk-amd64/bin:/usr/local/hive/bin:/usr/local/kylin/bin:/usr/local/Hbase/bin:/usr/local/hadoop/bin:/usr/lib/jvm/java-8-openjdk-amd64/bin:/usr/local/hive/bin:/usr/local/kylin/bin:/usr/local/Hbase/bin:/usr/local/hadoop/bin:/usr/lib/jvm/java-8-openjdk-amd64/bin:/usr/local/hive/bin:/usr/local/kylin/bin:/usr/local/Hbase/bin:/usr/local/hive/conf:/usr/local/hive/lib/antlr-runtime-3.4.jar:/usr/local/hive/lib/servlet-api-2.4.jar:/usr/local/hive/lib/hive-llap-client-2.1.0.jar:/usr/local/hive/lib/twill-zookeeper-0.6.0-incubating.jar:/usr/local/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/local/hive/lib/hbase-annotations-1.1.1.jar:/usr/local/hive/lib/zookeeper-3.4.6.jar:/usr/local/hive/lib/guice-assistedinject-3.0.jar:/usr/local/hive/lib/ivy-2.4.0.jar:/usr/local/hive/lib/datanucleus-rdbms-4.1.7.jar:/usr/local/hive/lib/parquet-hadoop-bundle-1.8.1.jar:/usr/local/hive/lib/hive-exec-2.1.0.jar:/usr/local/hive/lib/hbase-common-1.1.1.jar:/usr/local/hive/lib/commons-lang3-3.1.jar:/usr/local/hive/lib/commons-pool-1.5.4.jar:/usr/local/hive/lib/accumulo-start-1.6.0.jar:/usr/local/hive/lib/twill-common-0.6.0-incubating.jar:/usr/local/hive/lib/jetty-6.1.26.jar:/usr/local/hive/lib/jackson-core-2.4.2.jar:/usr/local/hive/lib/commons-httpclient-3.0.1.jar:/usr/local/hive/lib/hive-jdbc-2.1.0.jar:/usr/local/hive/lib/hive-serde-2.1.0.jar:/usr/local/hive/lib/hbase-prefix-tree-1.1.1.jar:/usr/local/hive/lib/commons-lang-2.6.jar:/usr/local/hive/lib/opencsv-2.3.jar:/usr/local/hi
ve/lib/jsp-api-2.1.jar:/usr/local/hive/lib/antlr-2.7.7.jar:/usr/local/hive/lib/libfb303-0.9.3.jar:/usr/local/hive/lib/slider-core-0.90.2-incubating.jar:/usr/local/hive/lib/jsp-2.1-6.1.14.jar:/usr/local/hive/lib/metrics-json-3.1.0.jar:/usr/local/hive/lib/log4j-slf4j-impl-2.4.1.jar:/usr/local/hive/lib/jamon-runtime-2.3.1.jar:/usr/local/hive/lib/avro-1.7.7.jar:/usr/local/hive/lib/jetty-all-server-7.6.0.v20120127.jar:/usr/local/hive/lib/activation-1.1.jar:/usr/local/hive/lib/derby.jar:/usr/local/hive/lib/stax-api-1.0.1.jar:/usr/local/hive/lib/tephra-api-0.6.0.jar:/usr/local/hive/lib/netty-all-4.0.23.Final.jar:/usr/local/hive/lib/jersey-client-1.9.jar:/usr/local/hive/lib/metrics-core-3.1.0.jar:/usr/local/hive/lib/paranamer-2.3.jar:/usr/local/hive/lib/snappy-java-1.0.5.jar:/usr/local/hive/lib/metrics-core-2.2.0.jar:/usr/local/hive/lib/commons-codec-1.4.jar:/usr/local/hive/lib/commons-vfs2-2.0.jar:/usr/local/hive/lib/javolution-5.5.1.jar:/usr/local/hive/lib/hive-cli-2.1.0.jar:/usr/local/hive/lib/hive-metastore-2.1.0.jar:/usr/local/hive/lib/hive-beeline-2.1.0.jar:/usr/local/hive/lib/ant-1.9.1.jar:/usr/local/hive/lib/ant-1.6.5.jar:/usr/local/hive/lib/commons-cli-1.2.jar:/usr/local/hive/lib/mysql-connector-java-5.1.21.jar:/usr/local/hive/lib/jsr305-3.0.0.jar:/usr/local/hive/lib/commons-compress-1.9.jar:/usr/local/hive/lib/hive-llap-ext-client-2.1.0.jar:/usr/local/hive/lib/javax.jdo-3.2.0-m3.jar:/usr/local/hive/lib/disruptor-3.3.0.jar:/usr/local/hive/lib/jackson-xc-1.9.2.jar:/usr/local/hive/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hive/lib/ant-launcher-1.9.1.jar:/usr/local/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar:/usr/local/hive/lib/tempus-fugit-1.1.jar:/usr/local/hive/lib/hive-ant-2.1.0.jar:/usr/local/hive/lib/datanucleus-core-4.1.6.jar:/usr/local/hive/lib/hive-llap-common-2.1.0.jar:/usr/local/hive/lib/hive-shims-0.23-2.1.0.jar:/usr/local/hive/lib/datanucleus-api-jdo-4.2.1.jar:/usr/local/hive/lib/guice-3.0.jar:/usr/local/hive/lib/findbugs-annotations-1.3.9-1.jar:/usr/local/hive/
lib/jackson-jaxrs-1.9.2.jar:/usr/local/hive/lib/hive-accumulo-handler-2.1.0.jar:/usr/local/hive/lib/httpclient-4.4.jar:/usr/local/hive/lib/hbase-hadoop2-compat-1.1.1.jar:/usr/local/hive/lib/curator-framework-2.6.0.jar:/usr/local/hive/lib/aopalliance-1.0.jar:/usr/local/hive/lib/hive-testutils-2.1.0.jar:/usr/local/hive/lib/twill-api-0.6.0-incubating.jar:/usr/local/hive/lib/dropwizard-metrics-hadoop-metrics2-reporter-0.1.0.jar:/usr/local/hive/lib/groovy-all-2.4.4.jar:/usr/local/hive/lib/metrics-jvm-3.1.0.jar:/usr/local/hive/lib/log4j-api-2.4.1.jar:/usr/local/hive/lib/log4j-web-2.4.1.jar:/usr/local/hive/lib/jersey-server-1.14.jar:/usr/local/hive/lib/hbase-server-1.1.1.jar:/usr/local/hive/lib/twill-core-0.6.0-incubating.jar:/usr/local/hive/lib/hive-llap-tez-2.1.0.jar:/usr/local/hive/lib/joda-time-2.5.jar:/usr/local/hive/lib/accumulo-core-1.6.0.jar:/usr/local/hive/lib/hive-hplsql-2.1.0.jar:/usr/local/hive/lib/jetty-sslengine-6.1.26.jar:/usr/local/hive/lib/hive-storage-api-2.1.0.jar:/usr/local/hive/lib/hive-llap-server-2.1.0.jar:/usr/local/hive/lib/json-20090211.jar:/usr/local/hive/lib/antlr4-runtime-4.5.jar:/usr/local/hive/lib/jackson-annotations-2.4.0.jar:/usr/local/hive/lib/stringtemplate-3.2.1.jar:/usr/local/hive/lib/log4j-1.2-api-2.4.1.jar:/usr/local/hive/lib/regexp-1.3.jar:/usr/local/hive/lib/hive-service-2.1.0.jar:/usr/local/hive/lib/jline-2.12.jar:/usr/local/hive/lib/jasper-compiler-5.5.23.jar:/usr/local/hive/lib/asm-commons-3.1.jar:/usr/local/hive/lib/jetty-all-7.6.0.v20120127.jar:/usr/local/hive/lib/guice-servlet-3.0.jar:/usr/local/hive/lib/jetty-util-6.1.26.jar:/usr/local/hive/lib/twill-discovery-core-0.6.0-incubating.jar:/usr/local/hive/lib/hbase-client-1.1.1.jar:/usr/local/hive/lib/plexus-utils-1.5.6.jar:/usr/local/hive/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hive/lib/maven-scm-api-1.4.jar:/usr/local/hive/lib/snappy-0.2.jar:/usr/local/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/local/hive/lib/jdo-api-3.0.1.jar:/usr/local/hive/lib/jta-1.1.jar:/usr/lo
cal/hive/lib/junit-4.11.jar:/usr/local/hive/lib/joni-2.1.2.jar:/usr/local/hive/lib/commons-math-2.2.jar:/usr/local/hive/lib/htrace-core-3.1.0-incubating.jar:/usr/local/hive/lib/javax.inject-1.jar:/usr/local/hive/lib/hive-service-rpc-2.1.0.jar:/usr/local/hive/lib/jackson-databind-2.4.2.jar:/usr/local/hive/lib/jsp-api-2.0.jar:/usr/local/hive/lib/hive-common-2.1.0.jar:/usr/local/hive/lib/transaction-api-1.1.jar:/usr/local/hive/lib/hive-hbase-handler-2.1.0.jar:/usr/local/hive/lib/eigenbase-properties-1.1.5.jar:/usr/local/hive/lib/hbase-hadoop2-compat-1.1.1-tests.jar:/usr/local/hive/lib/fastutil-6.5.6.jar:/usr/local/hive/lib/jcommander-1.32.jar:/usr/local/hive/lib/httpcore-4.4.jar:/usr/local/hive/lib/accumulo-fate-1.6.0.jar:/usr/local/hive/lib/hive-contrib-2.1.0.jar:/usr/local/hive/lib/gson-2.2.4.jar:/usr/local/hive/lib/commons-logging-1.2.jar:/usr/local/hive/lib/guava-14.0.1.jar:/usr/local/hive/lib/hbase-hadoop-compat-1.1.1.jar:/usr/local/hive/lib/javax.servlet-3.0.0.v201112011016.jar:/usr/local/hive/lib/hbase-common-1.1.1-tests.jar:/usr/local/hive/lib/hamcrest-core-1.3.jar:/usr/local/hive/lib/asm-tree-3.1.jar:/usr/local/hive/lib/ST4-4.0.4.jar:/usr/local/hive/lib/hive-shims-2.1.0.jar:/usr/local/hive/lib/jcodings-1.0.8.jar:/usr/local/hive/lib/commons-collections-3.2.2.jar:/usr/local/hive/lib/tephra-hbase-compat-1.0-0.6.0.jar:/usr/local/hive/lib/curator-client-2.6.0.jar:/usr/local/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/usr/local/hive/lib/hbase-procedure-1.1.1.jar:/usr/local/hive/lib/netty-3.7.0.Final.jar:/usr/local/hive/lib/commons-dbcp-1.4.jar:/usr/local/hive/lib/hive-orc-2.1.0.jar:/usr/local/hive/lib/tephra-core-0.6.0.jar:/usr/local/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/local/hive/lib/accumulo-trace-1.6.0.jar:/usr/local/hive/lib/org.abego.treelayout.core-1.0.1.jar:/usr/local/hive/lib/velocity-1.5.jar:/usr/local/hive/lib/log4j-core-2.4.1.jar:/usr/local/hive/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:/usr/local/hive/lib/jasper-runtime-5.5.23.jar
:/usr/local/hive/lib/protobuf-java-2.5.0.jar:/usr/local/hive/lib/commons-io-2.4.jar:/usr/local/hive/lib/curator-recipes-2.6.0.jar:/usr/local/hive/lib/super-csv-2.2.0.jar:/usr/local/hive/lib/hive-shims-common-2.1.0.jar:/usr/local/hive/lib/janino-2.7.6.jar:/usr/local/hive/lib/libthrift-0.9.3.jar:/usr/local/hive/lib/twill-discovery-api-0.6.0-incubating.jar:/usr/local/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar:/usr/local/hive/lib/commons-el-1.0.jar:/usr/local/hive/lib/hbase-protocol-1.1.1.jar:/usr/local/hive/lib/asm-3.1.jar:/usr/local/hive/lib/hive-hwi-2.1.0.jar:/usr/local/hive/lib/hive-shims-scheduler-2.1.0.jar:/usr/local/hive/lib/jpam-1.1.jar:/usr/local/hive/lib/commons-compiler-2.7.6.jar:/usr/local/hive/lib/mail-1.4.1.jar:/usr/local/hive/hcatalog/share/hcatalog/hive-hcatalog-core-2.1.0.jar
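Note that in the output above, HIVE_CONF appears to contain the contents of $PATH with /usr/local/hive/conf appended, rather than a single directory. This is only a guess from the output, but it suggests HIVE_CONF was accidentally built from $PATH in a profile script. A quick sanity check (a hypothetical helper, not part of Kylin):

```shell
# HIVE_CONF should be one directory, not a PATH-style ':'-separated list
HIVE_CONF="/usr/local/hive/conf"
case "$HIVE_CONF" in
  *:*) echo "HIVE_CONF looks like a PATH-style list: $HIVE_CONF" ;;
  *)   echo "HIVE_CONF is a single directory: $HIVE_CONF" ;;
esac
```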


The Hadoop distribution used is:
http://redrockdigimark.com/apachemirror/hadoop/core/stable/hadoop-2.7.3.tar.gz

Please let me know if you need any other information.

From: Billy Liu [mailto:billy...@apache.org]
Sent: Thursday, December 08, 2016 6:19 AM
To: dev <dev@kylin.apache.org>
Cc: d...@kylin.incubator.apache.org; Samirul Haque <saha...@adobe.com>; Abhishek Shishodia <shish...@adobe.com>
Subject: Re: Issues in building cubes of Apache Kylin

Hi Tarun,

What's the output of bin/find-hive-dependency.sh? It seems something is wrong
with the environment variables.
Which Hadoop distribution are you using?

2016-12-07 23:04 GMT+08:00 Tarun Vashisth 
<vashi...@adobe.com>>:
Hi,



We are trying to build cubes for our data, and while doing so we are getting the
following error during step 3 of cube creation:

File does not exist: 
hdfs://localhost:54310/app/hadoop/tmp/mapred/staging/hduser341814501/.staging/job_local341814501_0007/libjars/hive-exec-2.1.0.jar


Earlier, it was trying to find the jars at
hdfs://localhost:54310/usr/local/hive/lib. We manually put all the Hive jars
there, and after that it started failing at this step.

  *   Is Kylin trying to create this directory hduser../ and so on, and is it
unable to do so? Is it failing at that step?

Hadoop Version: - 2.7.3
Hive Version: - 2.1.0
HBase Version: - 1.2.4
Kylin Version: - 1.6

Any help would be much appreciated in this regard.


Regards,

Tarun


