Kylin metadata and storage cleanup.

2016-10-17 Thread Joel Victor
I want to clean up Kylin metadata, HBase tables, and HDFS storage. Can I
do this while Kylin cube build jobs are running in the background?
Is there a chance that the metadata might get corrupted?
In this document
http://kylin.apache.org/docs15/howto/howto_cleanup_storage.html it says
you can do an offline cleanup. What does "offline" mean in this context?
Does it mean I have to stop the build jobs?
Similarly with
http://kylin.apache.org/docs15/howto/howto_backup_metadata.html: is it
safe to run this while build jobs are running?
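
For reference, both pages linked above drive command-line tools that can also
be invoked programmatically. Below is a minimal sketch of a conservative order
of operations, assuming Kylin 1.5.x jars on the classpath and kylin.properties
reachable via KYLIN_HOME (as when launched through the hbase/RunJar wrapper
those docs describe); the argument formats are taken from those pages and
should be double-checked against your version:

// A minimal sketch, not a definitive procedure: snapshot metadata first,
// then dry-run the storage cleanup, and delete only after reviewing output.
import org.apache.kylin.common.persistence.ResourceTool;
import org.apache.kylin.storage.hbase.util.StorageCleanupJob;

public class KylinMaintenanceSketch {
    public static void main(String[] args) throws Exception {
        // 1. Dump all metadata to a local directory (same intent as
        //    "metastore.sh backup" in the backup how-to).
        ResourceTool.main(new String[] { "download", "/tmp/kylin_meta_backup" });

        // 2. Dry run: list unused intermediate Hive tables, HBase tables and
        //    HDFS directories without deleting anything.
        StorageCleanupJob.main(new String[] { "--delete", "false" });

        // 3. Only after reviewing the dry-run output, and with no build jobs
        //    running, repeat with deletion enabled:
        // StorageCleanupJob.main(new String[] { "--delete", "true" });
    }
}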


Error of migrating cubes

2016-10-17 Thread 仇同心
Hi,
  An exception is thrown when using org.apache.kylin.storage.hbase.util.CubeMigrationCLI
to migrate a cube. The Kylin version is kylin-1.5.4.1-HBase1.x.

2016-10-17 14:57:25,595 INFO  [main CubeMigrationCLI:325]: Executing operation: ADD_INTO_PROJECT:CUBE[name=testc], testc, test,
2016-10-17 14:57:25,595 ERROR [main CubeMigrationCLI:307]: error met
java.lang.ClassCastException: org.apache.kylin.cube.CubeInstance cannot be cast to java.lang.String
    at org.apache.kylin.storage.hbase.util.CubeMigrationCLI.doOpt(CubeMigrationCLI.java:425)
    at org.apache.kylin.storage.hbase.util.CubeMigrationCLI.doOpts(CubeMigrationCLI.java:304)
    at org.apache.kylin.storage.hbase.util.CubeMigrationCLI.moveCube(CubeMigrationCLI.java:153)
    at org.apache.kylin.storage.hbase.util.CubeMigrationCLI.moveCube(CubeMigrationCLI.java:167)
    at org.apache.kylin.storage.hbase.util.CubeMigrationCLI.main(CubeMigrationCLI.java:102)
2016-10-17 14:57:25,597 INFO  [main CubeMigrationCLI:308]: Try undoing previous changes

On GitHub, the source code of the kylin-1.5.4.1-HBase1.x branch:
case ADD_INTO_PROJECT: {
    String cubeName = (String) opt.params[0];
    String projectName = (String) opt.params[1];
    String projectResPath = ProjectInstance.concatResourcePath(projectName);
    Serializer<ProjectInstance> projectSerializer = new JsonSerializer<ProjectInstance>(ProjectInstance.class);
    ProjectInstance project = dstStore.getResource(projectResPath, ProjectInstance.class, projectSerializer);
    project.removeRealization(RealizationType.CUBE, cubeName);
    project.addRealizationEntry(RealizationType.CUBE, cubeName);
    dstStore.putResource(projectResPath, project, projectSerializer);
    logger.info("Project instance for " + projectName + " is corrected");
    break;
}

The kylin-1.5.4.1-cdh5.7 branch has the same code.


The source code of the master branch:
case ADD_INTO_PROJECT: {
    CubeInstance srcCube = (CubeInstance) opt.params[0];
    String cubeName = (String) opt.params[1];
    String projectName = (String) opt.params[2];
    String modelName = srcCube.getDescriptor().getModelName();

    String projectResPath = ProjectInstance.concatResourcePath(projectName);
    Serializer<ProjectInstance> projectSerializer = new JsonSerializer<ProjectInstance>(ProjectInstance.class);
    ProjectInstance project = dstStore.getResource(projectResPath, ProjectInstance.class, projectSerializer);

    project.addModel(modelName);
    project.removeRealization(RealizationType.CUBE, cubeName);
    project.addRealizationEntry(RealizationType.CUBE, cubeName);

    dstStore.putResource(projectResPath, project, projectSerializer);
    logger.info("Project instance for " + projectName + " is corrected");
    break;
}

Comparing the three branches (kylin-1.5.4.1-HBase1.x, kylin-1.5.4.1-cdh5.7,
and master), I think the CubeMigrationCLI code in the kylin-1.5.4.1-HBase1.x
and kylin-1.5.4.1-cdh5.7 branches is wrong. Is that correct?
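
The log line "ADD_INTO_PROJECT:CUBE[name=testc], testc, test" shows the
operation was queued with three parameters, CubeInstance first, which matches
master's doOpt() but not the branch version that casts params[0] to String.
A minimal standalone illustration of that mismatch (the class below is a
hypothetical stand-in, not Kylin's real CubeInstance):

// Reproduces the failure mode in isolation: params[0] holds a cube object,
// but the code reads it as the cube *name*.
public class CastMismatchDemo {
    static class CubeInstance {
        @Override
        public String toString() { return "CUBE[name=testc]"; }
    }

    public static void main(String[] args) {
        Object[] params = { new CubeInstance(), "testc", "test" };
        // Equivalent of the first line in the kylin-1.5.4.1-HBase1.x case branch:
        String cubeName = (String) params[0];
        // -> java.lang.ClassCastException: CubeInstance cannot be cast to String
        System.out.println(cubeName);
    }
}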

Thank you!


Kylin Error on Building Cube

2016-10-17 Thread Serhat Can

Hi,

I'm new to the Hadoop environment, and we would like to use Kylin for BI
analytics. I downloaded and installed the Kylin prerequisites (versions
listed below), then installed Kylin itself. I can create a cube, but when
I try to build it, step #3 (Step Name: Extract Fact Table Distinct
Columns) crashes with the error shown below.

Has anyone faced this situation before? Could you please give some advice
on how to clear the error?



Hadoop 2.7.3
Hbase 1.2.3
Hive 2.1.0
Kylin 1.5.4.1-hbase1.x



java.lang.NoSuchMethodError: org.apache.hadoop.yarn.proto.YarnProtos$LocalResourceProto.hashLong(J)I
    at org.apache.hadoop.yarn.proto.YarnProtos$LocalResourceProto.hashCode(YarnProtos.java:11555)
    at org.apache.hadoop.yarn.api.records.impl.pb.LocalResourcePBImpl.hashCode(LocalResourcePBImpl.java:62)
    at java.util.HashMap.hash(HashMap.java:362)
    at java.util.HashMap.put(HashMap.java:492)
    at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:139)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:163)
    at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:731)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:432)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.kylin.engine.mr.common.AbstractHadoopJob.waitForCompletion(AbstractHadoopJob.java:150)
    at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:108)
    at org.apache.kylin.engine.mr.MRUtil.runMRJob(MRUtil.java:88)
    at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:120)
    at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:113)
    at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:57)
    at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:113)
    at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:136)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
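
A NoSuchMethodError on a protoc-generated class like YarnProtos usually
indicates the classpath mixes incompatible jar versions, for example two
different hadoop-yarn-api jars, or a protobuf-java version other than the
2.5.0 that Hadoop 2.7.x was generated against. Kylin's start script appends
Hive and HBase dependencies to its own classpath, so a conflicting jar can
come from any of them; note also that Hive 2.1.0 may not be supported by
Kylin 1.5.x. A minimal diagnostic sketch you could run with the exact
classpath Kylin uses, to see which jar each suspect class is loaded from
(the class names are taken from the trace, plus protobuf's base class):

// Minimal classpath diagnostic (a sketch, not a fix): print which jar each
// suspect class is loaded from.
import java.security.CodeSource;

public class WhichJar {
    public static void main(String[] args) throws ClassNotFoundException {
        String[] suspects = {
            "org.apache.hadoop.yarn.proto.YarnProtos",
            "com.google.protobuf.Message",
        };
        for (String name : suspects) {
            Class<?> clazz = Class.forName(name);
            CodeSource src = clazz.getProtectionDomain().getCodeSource();
            System.out.println(name + " -> "
                    + (src == null ? "bootstrap/unknown" : src.getLocation()));
        }
    }
}

If the printed locations point at jars from different Hadoop versions, or at
a protobuf-java other than 2.5.0, removing the duplicate from the classpath
is the usual remedy.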

