Hi Kylin-devs,

We are currently trying to build a cube / refresh a table but are
unable to do so. Kylin produces the following error:

[2016-01-28 16:14:41,472][INFO][org.apache.kylin.job.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:58)] - -table KYLIN_DK.DIM_DTM -output /tmp/kylin/cardinality/KYLIN_DK.DIM_DTM
Starting: Kylin Hive Column Cardinality Update Job
table=KYLIN_DK.DIM_DTM output=/tmp/kylin/cardinality/KYLIN_DK.DIM_DTM
The hadoop cardinality value is not valid
usage: HiveColumnCardinalityUpdateJob
 -output <path>        Output path
 -table <table name>   The hive table name
[2016-01-28 16:14:41,502][ERROR][org.apache.kylin.job.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:64)] - error execute HadoopShellExecutable{id=a22a895d-d10f-4ab8-9bb0-defe1fdf1756-01, name=null, state=RUNNING}
java.lang.StringIndexOutOfBoundsException: String index out of range: -1
    at java.lang.String.substring(String.java:1911)
    at org.apache.kylin.job.hadoop.cardinality.HiveColumnCardinalityUpdateJob.updateKylinTableExd(HiveColumnCardinalityUpdateJob.java:113)
    at org.apache.kylin.job.hadoop.cardinality.HiveColumnCardinalityUpdateJob.run(HiveColumnCardinalityUpdateJob.java:80)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.kylin.job.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:62)
    at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
    at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:51)
    at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
    at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:130)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
[2016-01-28 16:14:41,504][DEBUG][org.apache.kylin.common.persistence.ResourceStore.putResource(ResourceStore.java:200)] - Saving resource /execute_output/a22a895d-d10f-4ab8-9bb0-defe1fdf1756-01 (Store kylin_metadata@hbase)
[2016-01-28 16:14:41,510][DEBUG][org.apache.kylin.common.persistence.ResourceStore.putResource(ResourceStore.java:200)] - Saving resource /execute_output/a22a895d-d10f-4ab8-9bb0-defe1fdf1756-01 (Store kylin_metadata@hbase)
[2016-01-28 16:14:41,513][INFO][org.apache.kylin.job.manager.ExecutableManager.updateJobOutput(ExecutableManager.java:241)] - job id:a22a895d-d10f-4ab8-9bb0-defe1fdf1756-01 from RUNNING to ERROR

Fun fact: the cube-building process does not catch this error and
reports SUCCESS for this stage.
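
For what it's worth, a "String index out of range: -1" from
String.substring() usually means an indexOf() miss (-1) was fed straight
into substring(), e.g. when an expected separator is missing from the
input. A minimal sketch of that pattern (the method and field names are
hypothetical illustrations, not the actual code in
HiveColumnCardinalityUpdateJob.updateKylinTableExd):

```java
public class CardinalityCrash {
    // Hypothetical parsing helper: extracts the text before a tab
    // separator. If the separator is absent, indexOf('\t') returns -1
    // and substring(0, -1) throws StringIndexOutOfBoundsException: -1.
    static String firstField(String line) {
        int sep = line.indexOf('\t');   // -1 when no tab is present
        return line.substring(0, sep);  // throws when sep == -1
    }

    // Returns true when firstField() blows up on the given input.
    static boolean crashes(String line) {
        try {
            firstField(line);
            return false;
        } catch (StringIndexOutOfBoundsException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(crashes("line-without-tab")); // separator missing
        System.out.println(crashes("col\t42"));          // well-formed input
    }
}
```

So my guess is that the cardinality job received unexpectedly formatted
input (perhaps an empty or malformed intermediate file), but that is
speculation on my part.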

Any hints on what is going on and/or how to fix this issue?

Thanks
-Seb
