[jira] [Commented] (KYLIN-3607) can't build cube with spark in v2.5.0
[ https://issues.apache.org/jira/browse/KYLIN-3607?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16722407#comment-16722407 ]

ASF subversion and git services commented on KYLIN-3607:

Commit dd0943a74bdba838a21342a21f32e8be0012fa13 in kylin's branch refs/heads/master from [~caolijun1166]
[ https://gitbox.apache.org/repos/asf?p=kylin.git;h=dd0943a ]

KYLIN-3607 add hbase-hadoop*-compat*.jar

> can't build cube with spark in v2.5.0
> -------------------------------------
>
>                 Key: KYLIN-3607
>                 URL: https://issues.apache.org/jira/browse/KYLIN-3607
>             Project: Kylin
>          Issue Type: Bug
>          Components: Storage - HBase
>    Affects Versions: v2.5.0
>            Reporter: ANIL KUMAR
>            Assignee: Lijun Cao
>            Priority: Major
>             Fix For: v2.6.0
>
> In Kylin v2.5.0, the cube cannot be built: step 8, "Convert Cuboid Data to HFile", fails with the following exception:
>
> ERROR yarn.ApplicationMaster: User class threw exception:
> java.lang.RuntimeException: error execute org.apache.kylin.storage.hbase.steps.SparkCubeHFile.
> Root cause: Job aborted due to stage failure: Task 0 in stage 1.0 failed 4 times,
> java.lang.ExceptionInInitializerError
>   at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.getNewWriter(HFileOutputFormat2.java:247)
>   at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:194)
>   at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:152)
>   at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply$mcV$sp(PairRDDFunctions.scala:1125)
>   at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1123)
>   at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1123)
>   at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1353)
>   at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1131)
>   at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1102)
>   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
>   at org.apache.spark.scheduler.Task.run(Task.scala:99)
>   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
>   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>   at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.RuntimeException: Could not create interface org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?
>   at org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:73)
>   at org.apache.hadoop.hbase.io.MetricsIO.<init>(MetricsIO.java:31)
>   at org.apache.hadoop.hbase.io.hfile.HFile.<clinit>(HFile.java:192)
>   ... 15 more
> Caused by: java.util.NoSuchElementException
>   at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:365)
>   at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
>   at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
>   at org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:59)
>   ... 17 more

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
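The final `Caused by` is the real failure: HBase's `CompatibilitySingletonFactory` resolves `MetricsRegionServerSourceFactory` through `java.util.ServiceLoader`, and when the hbase-hadoop*-compat jars (which carry the provider registration under `META-INF/services`) are absent from the Spark executor classpath, the loader's iterator is empty and `next()` throws. A minimal sketch of that failure mode, using a made-up interface that has no registered provider (not HBase's actual factory):

```java
import java.util.NoSuchElementException;
import java.util.ServiceLoader;

public final class MissingProviderDemo {

    // Stand-in for MetricsRegionServerSourceFactory: an interface with no
    // META-INF/services provider file anywhere on the classpath.
    public interface CompatFactory {
    }

    public static String lookup() {
        ServiceLoader<CompatFactory> loader = ServiceLoader.load(CompatFactory.class);
        try {
            // CompatibilitySingletonFactory.getInstance effectively does this.
            return loader.iterator().next().getClass().getName();
        } catch (NoSuchElementException e) {
            // The exception at the bottom of the Kylin stack trace.
            return "NoSuchElementException: no provider jar on the classpath";
        }
    }

    public static void main(String[] args) {
        System.out.println(lookup());
    }
}
```

Shipping the compat jars with the Spark job makes the provider file visible again, so the iterator is non-empty and the lookup succeeds.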
[jira] [Commented] (KYLIN-3607) can't build cube with spark in v2.5.0
[ https://issues.apache.org/jira/browse/KYLIN-3607?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16722406#comment-16722406 ]

ASF GitHub Bot commented on KYLIN-3607:
---

shaofengshi closed pull request #395: KYLIN-3607 add hbase-hadoop*-compat*.jar
URL: https://github.com/apache/kylin/pull/395

This is a PR merged from a forked repository. As GitHub hides the original diff on merge, it is reproduced below for provenance:

diff --git a/storage-hbase/src/main/java/org/apache/kylin/storage/hbase/steps/HBaseSparkSteps.java b/storage-hbase/src/main/java/org/apache/kylin/storage/hbase/steps/HBaseSparkSteps.java
index ccab22f878..86ad0fbe5d 100644
--- a/storage-hbase/src/main/java/org/apache/kylin/storage/hbase/steps/HBaseSparkSteps.java
+++ b/storage-hbase/src/main/java/org/apache/kylin/storage/hbase/steps/HBaseSparkSteps.java
@@ -66,6 +66,11 @@ public AbstractExecutable createConvertCuboidToHfileStep(String jobId) {
         StringUtil.appendWithSeparator(jars, ClassUtil.findContainingJar("org.apache.htrace.Trace", null)); // htrace-core.jar
         StringUtil.appendWithSeparator(jars, ClassUtil.findContainingJar("com.yammer.metrics.core.MetricsRegistry", null)); // metrics-core.jar
+        // KYLIN-3607
+        StringUtil.appendWithSeparator(jars,
+                ClassUtil.findContainingJar("org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory", null)); // hbase-hadoop-compat-1.1.1.jar
+        StringUtil.appendWithSeparator(jars,
+                ClassUtil.findContainingJar("org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactoryImpl", null)); // hbase-hadoop2-compat-1.1.1.jar
         StringUtil.appendWithSeparator(jars, seg.getConfig().getSparkAdditionalJars());
         sparkExecutable.setJars(jars.toString());

This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] shaofengshi closed pull request #395: KYLIN-3607 add hbase-hadoop*-compat*.jar
shaofengshi closed pull request #395: KYLIN-3607 add hbase-hadoop*-compat*.jar
URL: https://github.com/apache/kylin/pull/395

This is a PR merged from a forked repository; the merged diff is quoted in full in the KYLIN-3607 JIRA comment above.

With regards,
Apache Git Services
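The fix works by asking the JVM which jar a known marker class was loaded from and appending that jar to the Spark job's classpath. A simplified sketch of the lookup idea (a hypothetical helper, not Kylin's actual `ClassUtil.findContainingJar`, which also scans the classloader's resources):

```java
import java.security.CodeSource;

public final class JarLocator {

    // Return the filesystem location (jar or classes directory) a class was
    // loaded from, or null when the class or its code source is unavailable.
    public static String findContainingJar(String className) {
        try {
            Class<?> clazz = Class.forName(className);
            CodeSource source = clazz.getProtectionDomain().getCodeSource();
            if (source == null || source.getLocation() == null) {
                return null; // e.g. JDK bootstrap classes report no code source
            }
            return source.getLocation().getPath();
        } catch (ClassNotFoundException e) {
            return null; // the compat jar is not on the classpath at all
        }
    }

    public static void main(String[] args) {
        // A class that is missing entirely yields null -- the situation
        // KYLIN-3607 guards against by shipping the compat jars explicitly.
        System.out.println(findContainingJar("org.example.NotOnClasspath"));
    }
}
```

Resolving jars from a loaded class keeps the job submission robust to differing HBase install paths, since no directory layout is hard-coded.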
[GitHub] shaofengshi closed pull request #396: KYLIN-3721 modify sourceCacheKey for RDBMS
shaofengshi closed pull request #396: KYLIN-3721 modify sourceCacheKey for RDBMS
URL: https://github.com/apache/kylin/pull/396

This is a PR merged from a forked repository. As GitHub hides the original diff on merge, it is reproduced below for provenance:

diff --git a/core-metadata/src/main/java/org/apache/kylin/source/SourceManager.java b/core-metadata/src/main/java/org/apache/kylin/source/SourceManager.java
index 03559bc370..c0bd3228ec 100644
--- a/core-metadata/src/main/java/org/apache/kylin/source/SourceManager.java
+++ b/core-metadata/src/main/java/org/apache/kylin/source/SourceManager.java
@@ -103,9 +103,10 @@ private String createSourceCacheKey(ISourceAware aware) {
         builder.append(config.getJdbcSourceConnectionUrl()).append('|');
         builder.append(config.getJdbcSourceDriver()).append('|');
         builder.append(config.getJdbcSourceUser()).append('|');
+        builder.append(config.getJdbcSourcePass()).append('|'); // In case password is wrong at the first time
         builder.append(config.getJdbcSourceFieldDelimiter()).append('|');
         builder.append(config.getJdbcSourceDialect()).append('|');
-        return builder.toString(); // jdbc password not needed, because url+user should be identical.
+        return builder.toString();
     }

     private ISource createSource(ISourceAware aware) {
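The one-line change matters because `SourceManager` caches source instances by this key: with the password excluded, a corrected password still mapped to the stale entry created with the wrong one, until restart. A toy illustration of that caching behavior (hypothetical class and key fields, not Kylin's actual SourceManager):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public final class SourceCacheDemo {

    private final Map<String, String> sources = new ConcurrentHashMap<>();

    // Build the cache key from every field that affects connectivity.
    // KYLIN-3721: omitting the password meant a fixed password still
    // resolved to the entry created with the wrong one.
    static String cacheKey(String url, String user, String password) {
        return url + '|' + user + '|' + password + '|';
    }

    String getSource(String url, String user, String password) {
        return sources.computeIfAbsent(cacheKey(url, user, password),
                key -> "source-for-" + user + "@" + url);
    }
}
```

With the password in the key, retrying with corrected credentials creates a fresh cache entry instead of reusing the broken one.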
[jira] [Commented] (KYLIN-3721) Failed to get source table when write the wrong password at the first time
[ https://issues.apache.org/jira/browse/KYLIN-3721?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16722404#comment-16722404 ]

ASF GitHub Bot commented on KYLIN-3721:
---

shaofengshi closed pull request #396: KYLIN-3721 modify sourceCacheKey for RDBMS
URL: https://github.com/apache/kylin/pull/396
(The merged diff is quoted in full in the pull-request notification above.)

> Failed to get source table when write the wrong password at the first time
> --------------------------------------------------------------------------
>
>                 Key: KYLIN-3721
>                 URL: https://issues.apache.org/jira/browse/KYLIN-3721
>             Project: Kylin
>          Issue Type: Bug
>          Components: RDBMS Source
>    Affects Versions: v2.6.0
>         Environment: MacOSX, JDK1.8
>            Reporter: rongchuan.jin
>            Assignee: rongchuan.jin
>            Priority: Minor
>             Fix For: v2.6.0
>
> When I wrote the RDBMS configuration, I entered the wrong password the first time and Kylin could not load the tables.
> After I corrected the password in the configuration, it still could not load the tables until I restarted Kylin.
[jira] [Commented] (KYLIN-3721) Failed to get source table when write the wrong password at the first time
[ https://issues.apache.org/jira/browse/KYLIN-3721?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16722405#comment-16722405 ]

ASF subversion and git services commented on KYLIN-3721:

Commit 9fb5b4657edcb58d370bd64e5b011172188c4c37 in kylin's branch refs/heads/master from woyumen4597
[ https://gitbox.apache.org/repos/asf?p=kylin.git;h=9fb5b46 ]

KYLIN-3721 modify sourceCacheKey for RDBMS
[GitHub] coveralls commented on issue #396: KYLIN-3721 modify sourceCacheKey for RDBMS
coveralls commented on issue #396: KYLIN-3721 modify sourceCacheKey for RDBMS
URL: https://github.com/apache/kylin/pull/396#issuecomment-447615867

## Pull Request Test Coverage Report for [Build 3969](https://coveralls.io/builds/20657691)

* **0** of **2** **(0.0%)** changed or added relevant lines in **1** file are covered.
* No unchanged relevant lines lost coverage.
* Overall coverage decreased (**-0.0004%**) to **25.81%**

| Changes Missing Coverage | Covered Lines | Changed/Added Lines | % |
| :--- | ---: | ---: | ---: |
| [core-metadata/src/main/java/org/apache/kylin/source/SourceManager.java](https://coveralls.io/builds/20657691/source?filename=core-metadata%2Fsrc%2Fmain%2Fjava%2Forg%2Fapache%2Fkylin%2Fsource%2FSourceManager.java#L106) | 0 | 2 | 0.0% |

| Totals | [![Coverage Status](https://coveralls.io/builds/20657691/badge)](https://coveralls.io/builds/20657691) |
| :-- | --: |
| Change from base [Build 3967](https://coveralls.io/builds/20657444): | -0.0004% |
| Covered Lines: | 17823 |
| Relevant Lines: | 69055 |

[Coveralls](https://coveralls.io)
[jira] [Created] (KYLIN-3721) Failed to get source table when write the wrong password at the first time
rongchuan.jin created KYLIN-3721:

            Summary: Failed to get source table when write the wrong password at the first time
                Key: KYLIN-3721
                URL: https://issues.apache.org/jira/browse/KYLIN-3721
            Project: Kylin
         Issue Type: Bug
         Components: RDBMS Source
   Affects Versions: v2.6.0
        Environment: MacOSX, JDK1.8
           Reporter: rongchuan.jin
            Fix For: v2.6.0

When I wrote the RDBMS configuration, I entered the wrong password the first time and Kylin could not load the tables.
After I corrected the password in the configuration, it still could not load the tables until I restarted Kylin.
[jira] [Commented] (KYLIN-3721) Failed to get source table when write the wrong password at the first time
[ https://issues.apache.org/jira/browse/KYLIN-3721?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16722373#comment-16722373 ]

ASF GitHub Bot commented on KYLIN-3721:
---

woyumen4597 opened a new pull request #396: KYLIN-3721 modify sourceCacheKey for RDBMS
URL: https://github.com/apache/kylin/pull/396
[GitHub] woyumen4597 opened a new pull request #396: KYLIN-3721 modify sourceCacheKey for RDBMS
woyumen4597 opened a new pull request #396: KYLIN-3721 modify sourceCacheKey for RDBMS
URL: https://github.com/apache/kylin/pull/396
[GitHub] asfgit commented on issue #396: KYLIN-3721 modify sourceCacheKey for RDBMS
asfgit commented on issue #396: KYLIN-3721 modify sourceCacheKey for RDBMS
URL: https://github.com/apache/kylin/pull/396#issuecomment-447615098

Can one of the admins verify this patch?
[jira] [Assigned] (KYLIN-3721) Failed to get source table when write the wrong password at the first time
[ https://issues.apache.org/jira/browse/KYLIN-3721?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

rongchuan.jin reassigned KYLIN-3721:

Assignee: rongchuan.jin
[GitHub] coveralls commented on issue #395: KYLIN-3607 add hbase-hadoop*-compat*.jar
coveralls commented on issue #395: KYLIN-3607 add hbase-hadoop*-compat*.jar
URL: https://github.com/apache/kylin/pull/395#issuecomment-447613893

## Pull Request Test Coverage Report for [Build 3968](https://coveralls.io/builds/20657527)

* **0** of **4** **(0.0%)** changed or added relevant lines in **1** file are covered.
* **6** unchanged lines in **2** files lost coverage.
* Overall coverage decreased (**-0.006%**) to **25.804%**

| Changes Missing Coverage | Covered Lines | Changed/Added Lines | % |
| :--- | ---: | ---: | ---: |
| [storage-hbase/src/main/java/org/apache/kylin/storage/hbase/steps/HBaseSparkSteps.java](https://coveralls.io/builds/20657527/source?filename=storage-hbase%2Fsrc%2Fmain%2Fjava%2Forg%2Fapache%2Fkylin%2Fstorage%2Fhbase%2Fsteps%2FHBaseSparkSteps.java#L70) | 0 | 4 | 0.0% |

| Files with Coverage Reduction | New Missed Lines | % |
| :--- | ---: | ---: |
| [core-metadata/src/main/java/org/apache/kylin/source/datagen/ColumnGenerator.java](https://coveralls.io/builds/20657527/source?filename=core-metadata%2Fsrc%2Fmain%2Fjava%2Forg%2Fapache%2Fkylin%2Fsource%2Fdatagen%2FColumnGenerator.java#L319) | 1 | 81.08% |
| [core-cube/src/main/java/org/apache/kylin/cube/inmemcubing/MemDiskStore.java](https://coveralls.io/builds/20657527/source?filename=core-cube%2Fsrc%2Fmain%2Fjava%2Forg%2Fapache%2Fkylin%2Fcube%2Finmemcubing%2FMemDiskStore.java#L439) | 5 | 77.81% |

| Totals | [![Coverage Status](https://coveralls.io/builds/20657527/badge)](https://coveralls.io/builds/20657527) |
| :-- | --: |
| Change from base [Build 3967](https://coveralls.io/builds/20657444): | -0.006% |
| Covered Lines: | 17820 |
| Relevant Lines: | 69058 |

[Coveralls](https://coveralls.io)
[jira] [Commented] (KYLIN-3607) can't build cube with spark in v2.5.0
[ https://issues.apache.org/jira/browse/KYLIN-3607?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16722355#comment-16722355 ]

ASF GitHub Bot commented on KYLIN-3607:
---

caolijun1166 opened a new pull request #395: KYLIN-3607 add hbase-hadoop*-compat*.jar
URL: https://github.com/apache/kylin/pull/395
[GitHub] asfgit commented on issue #395: KYLIN-3607 add hbase-hadoop*-compat*.jar
asfgit commented on issue #395: KYLIN-3607 add hbase-hadoop*-compat*.jar
URL: https://github.com/apache/kylin/pull/395#issuecomment-447613046

Can one of the admins verify this patch?
[GitHub] caolijun1166 opened a new pull request #395: KYLIN-3607 add hbase-hadoop*-compat*.jar
caolijun1166 opened a new pull request #395: KYLIN-3607 add hbase-hadoop*-compat*.jar
URL: https://github.com/apache/kylin/pull/395
[jira] [Commented] (KYLIN-3597) Fix sonar reported static code issues
[ https://issues.apache.org/jira/browse/KYLIN-3597?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16722149#comment-16722149 ]

ASF subversion and git services commented on KYLIN-3597:

Commit 62830469424c3f2dcb79f35d1f66e1698768428f in kylin's branch refs/heads/master from [~caolijun1166]
[ https://gitbox.apache.org/repos/asf?p=kylin.git;h=6283046 ]

KYLIN-3597 improve code smell

> Fix sonar reported static code issues
> -------------------------------------
>
>                 Key: KYLIN-3597
>                 URL: https://issues.apache.org/jira/browse/KYLIN-3597
>             Project: Kylin
>          Issue Type: Improvement
>          Components: Others
>            Reporter: Shaofeng SHI
>            Priority: Major
>             Fix For: v2.6.0
[jira] [Commented] (KYLIN-3707) Add configuration for setting isolation-level for sqoop
[ https://issues.apache.org/jira/browse/KYLIN-3707?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16722154#comment-16722154 ]

ASF subversion and git services commented on KYLIN-3707:

Commit fb54f38cdad26b776b6f5116cacfbc8470e19142 in kylin's branch refs/heads/master from woyumen4597
[ https://gitbox.apache.org/repos/asf?p=kylin.git;h=fb54f38 ]

KYLIN-3707 add configuration for setting isolation-level for sqoop

> Add configuration for setting isolation-level for sqoop
> -------------------------------------------------------
>
>                 Key: KYLIN-3707
>                 URL: https://issues.apache.org/jira/browse/KYLIN-3707
>             Project: Kylin
>          Issue Type: Improvement
>          Components: RDBMS Source
>         Environment: MacOSX, JDK1.8
>            Reporter: rongchuan.jin
>            Priority: Minor
>
> When an RDBMS is used as the data source, Apache Sqoop imports its data into HDFS.
> Sqoop uses the read_committed isolation level by default, but some RDBMSs do not support that level,
> so a configuration option is needed to override it.
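Downstream, the configured level name ultimately has to match one of the JDBC isolation constants. A hedged sketch of such a mapping (the property name `transaction.isolation-level` appears in the merged PR #366 diff; this helper class itself is hypothetical, shown only to illustrate how the setting would be validated before being handed to Sqoop/JDBC):

```java
import java.sql.Connection;

public final class IsolationLevels {

    // Map a configured level name (e.g. the value of the
    // "transaction.isolation-level" property) onto the matching
    // java.sql.Connection constant; reject anything unrecognized.
    public static int toJdbcLevel(String name) {
        switch (name) {
            case "TRANSACTION_READ_UNCOMMITTED":
                return Connection.TRANSACTION_READ_UNCOMMITTED;
            case "TRANSACTION_READ_COMMITTED":
                return Connection.TRANSACTION_READ_COMMITTED;
            case "TRANSACTION_REPEATABLE_READ":
                return Connection.TRANSACTION_REPEATABLE_READ;
            case "TRANSACTION_SERIALIZABLE":
                return Connection.TRANSACTION_SERIALIZABLE;
            default:
                throw new IllegalArgumentException("Unsupported isolation level: " + name);
        }
    }
}
```

Validating the name early turns a misconfigured level into a clear error at job-submission time instead of a driver-specific failure during the Sqoop import.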
[jira] [Commented] (KYLIN-3707) Add configuration for setting isolation-level for sqoop
[ https://issues.apache.org/jira/browse/KYLIN-3707?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16722153#comment-16722153 ]

ASF GitHub Bot commented on KYLIN-3707:
---

shaofengshi closed pull request #366: KYLIN-3707 add configuration for setting isolation-level for sqoop
URL: https://github.com/apache/kylin/pull/366

This is a PR merged from a forked repository. As GitHub hides the original diff on merge, it is reproduced below for provenance:

diff --git a/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/JdbcConnector.java b/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/JdbcConnector.java
index d849e6c010..47ba6b3bc0 100644
--- a/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/JdbcConnector.java
+++ b/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/JdbcConnector.java
@@ -37,7 +37,6 @@
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;

-import com.google.common.annotations.VisibleForTesting;
 import com.google.common.collect.Lists;

 public class JdbcConnector implements Closeable {
@@ -175,8 +174,7 @@ public String getPropertyValue(String key) {
         return jdbcDs.getPropertyValue(key);
     }

-    @VisibleForTesting
-    SqlConverter getSqlConverter() {
+    public SqlConverter getSqlConverter() {
         return sqlConverter;
     }

diff --git a/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/DefaultConfiguer.java b/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/DefaultConfiguer.java
index 6d7fb6da37..6c01a70072 100644
--- a/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/DefaultConfiguer.java
+++ b/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/DefaultConfiguer.java
@@ -78,10 +78,6 @@ public String fixAfterDefaultConvert(String orig) {
         if (this.adaptor == null) {
             return orig;
         }
-        // fix problem of case sensitive when generate sql.
-        //if (isCaseSensitive()) {
-        //    orig = adaptor.fixCaseSensitiveSql(orig);
-        //}
         return adaptor.fixSql(orig);
     }

@@ -134,4 +130,9 @@ public String fixIdentifierCaseSensitve(String orig) {
         }
         return adaptor.fixIdentifierCaseSensitve(orig);
     }
+
+    @Override
+    public String getTransactionIsolationLevel() {
+        return dsDef.getPropertyValue("transaction.isolation-level");
+    }
 }

diff --git a/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverter.java b/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverter.java
index d25c04fd61..e8302e8a23 100644
--- a/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverter.java
+++ b/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverter.java
@@ -110,5 +110,11 @@ public IConfigurer getConfigurer() {
         boolean enableQuote();

         String fixIdentifierCaseSensitve(String orig);
+
+        /**
+         * Only support following 3 types
+         * TRANSACTION_READ_COMMITTED,TRANSACTION_READ_UNCOMMITTED,TRANSACTION_READ_COMMITTED
+         */
+        String getTransactionIsolationLevel();
     }
 }

diff --git a/datasource-sdk/src/test/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverterTest.java b/datasource-sdk/src/test/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverterTest.java
index 94cc651223..451be6061c 100644
--- a/datasource-sdk/src/test/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverterTest.java
+++ b/datasource-sdk/src/test/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverterTest.java
@@ -111,12 +111,20 @@ public boolean enableQuote() {
             public String fixIdentifierCaseSensitve(String orig) {
                 return orig;
             }
+
+            @Override
+            public String getTransactionIsolationLevel() {
+                return null;
+            }
         }, master);

         // escape default keywords
-        Assert.assertEquals("SELECT *\nFROM \"DEFAULT\".\"FACT\"", converter.convertSql("select * from \"DEFAULT\".FACT"));
-        Assert.assertEquals("SELECT *\nFROM \"Default\".\"FACT\"", converter.convertSql("select * from \"Default\".FACT"));
-        Assert.assertEquals("SELECT *\nFROM \"default\".\"FACT\"", converter.convertSql("select * from \"default\".FACT"));
+        Assert.assertEquals("SELECT *\nFROM \"DEFAULT\".\"FACT\"",
+                converter.convertSql("select * from \"DEFAULT\".FACT"));
+        Assert.assertEquals("SELECT *\nFROM \"Default\".\"FACT\"",
+                converter.convertSql("select * from
[GitHub] shaofengshi closed pull request #366: KYLIN-3707 add configuration for setting isolation-level for sqoop
shaofengshi closed pull request #366: KYLIN-3707 add configuration for setting isolation-level for sqoop URL: https://github.com/apache/kylin/pull/366 This is a PR merged from a forked repository. As GitHub hides the original diff on merge, it is displayed below for the sake of provenance: As this is a foreign pull request (from a fork), the diff is supplied below (as it won't show otherwise due to GitHub magic): diff --git a/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/JdbcConnector.java b/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/JdbcConnector.java index d849e6c010..47ba6b3bc0 100644 --- a/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/JdbcConnector.java +++ b/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/JdbcConnector.java @@ -37,7 +37,6 @@ import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import com.google.common.annotations.VisibleForTesting; import com.google.common.collect.Lists; public class JdbcConnector implements Closeable { @@ -175,8 +174,7 @@ public String getPropertyValue(String key) { return jdbcDs.getPropertyValue(key); } -@VisibleForTesting -SqlConverter getSqlConverter() { +public SqlConverter getSqlConverter() { return sqlConverter; } diff --git a/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/DefaultConfiguer.java b/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/DefaultConfiguer.java index 6d7fb6da37..6c01a70072 100644 --- a/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/DefaultConfiguer.java +++ b/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/DefaultConfiguer.java @@ -78,10 +78,6 @@ public String fixAfterDefaultConvert(String orig) { if (this.adaptor == null) { return orig; } -// fix problem of case sensitive when generate sql. 
-//if (isCaseSensitive()) { -//orig = adaptor.fixCaseSensitiveSql(orig); -//} return adaptor.fixSql(orig); } @@ -134,4 +130,9 @@ public String fixIdentifierCaseSensitve(String orig) { } return adaptor.fixIdentifierCaseSensitve(orig); } + +@Override +public String getTransactionIsolationLevel() { +return dsDef.getPropertyValue("transaction.isolation-level"); +} } diff --git a/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverter.java b/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverter.java index d25c04fd61..e8302e8a23 100644 --- a/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverter.java +++ b/datasource-sdk/src/main/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverter.java @@ -110,5 +110,11 @@ public IConfigurer getConfigurer() { boolean enableQuote(); String fixIdentifierCaseSensitve(String orig); + +/** + * Only support following 3 types + * TRANSACTION_READ_COMMITTED,TRANSACTION_READ_UNCOMMITTED,TRANSACTION_READ_COMMITTED + */ +String getTransactionIsolationLevel(); } } diff --git a/datasource-sdk/src/test/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverterTest.java b/datasource-sdk/src/test/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverterTest.java index 94cc651223..451be6061c 100644 --- a/datasource-sdk/src/test/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverterTest.java +++ b/datasource-sdk/src/test/java/org/apache/kylin/sdk/datasource/framework/conv/SqlConverterTest.java @@ -111,12 +111,20 @@ public boolean enableQuote() { public String fixIdentifierCaseSensitve(String orig) { return orig; } + +@Override +public String getTransactionIsolationLevel() { +return null; +} }, master); // escape default keywords -Assert.assertEquals("SELECT *\nFROM \"DEFAULT\".\"FACT\"", converter.convertSql("select * from \"DEFAULT\".FACT")); -Assert.assertEquals("SELECT *\nFROM \"Default\".\"FACT\"", 
converter.convertSql("select * from \"Default\".FACT")); -Assert.assertEquals("SELECT *\nFROM \"default\".\"FACT\"", converter.convertSql("select * from \"default\".FACT")); +Assert.assertEquals("SELECT *\nFROM \"DEFAULT\".\"FACT\"", +converter.convertSql("select * from \"DEFAULT\".FACT")); +Assert.assertEquals("SELECT *\nFROM \"Default\".\"FACT\"", +converter.convertSql("select * from \"Default\".FACT")); +Assert.assertEquals("SELECT *\nFROM \"default\".\"FACT\"", +converter.convertSql("select * from \"default\".FACT")); } @Test @@ -189,6 +197,11 @@ public boolean enableQuote() {
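The new `getTransactionIsolationLevel()` hook in this PR returns the raw `transaction.isolation-level` property value. As a hedged sketch of how a caller might map that string onto the standard `java.sql.Connection` constants (the quoted Javadoc lists `TRANSACTION_READ_COMMITTED` twice, so the third supported value is unclear; `TRANSACTION_SERIALIZABLE` below is an assumption, as is the `resolve` helper itself — neither appears in the PR):

```java
import java.sql.Connection;
import java.util.Locale;

class IsolationLevels {

    // Hypothetical helper: maps the configured level name onto the JDBC constant.
    // Returns -1 for null or unrecognized values so the caller can keep the
    // driver's default isolation level instead of failing.
    static int resolve(String name) {
        if (name == null) {
            return -1;
        }
        switch (name.trim().toUpperCase(Locale.ROOT)) {
        case "TRANSACTION_READ_UNCOMMITTED":
            return Connection.TRANSACTION_READ_UNCOMMITTED;
        case "TRANSACTION_READ_COMMITTED":
            return Connection.TRANSACTION_READ_COMMITTED;
        case "TRANSACTION_SERIALIZABLE": // assumption: not confirmed by the quoted Javadoc
            return Connection.TRANSACTION_SERIALIZABLE;
        default:
            return -1;
        }
    }
}
```

Note the test stub in `SqlConverterTest` returns `null` from `getTransactionIsolationLevel()`, so any consumer of the hook has to tolerate an absent setting, as the `-1` fallback here does.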
[jira] [Commented] (KYLIN-3597) Fix sonar reported static code issues
[ https://issues.apache.org/jira/browse/KYLIN-3597?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16722152#comment-16722152 ] ASF subversion and git services commented on KYLIN-3597: Commit 32a31506ce609eeabe3c88544a2bcf742a5c5599 in kylin's branch refs/heads/master from whuwb [ https://gitbox.apache.org/repos/asf?p=kylin.git;h=32a3150 ] KYLIN-3597 fix sonar issues > Fix sonar reported static code issues > - > > Key: KYLIN-3597 > URL: https://issues.apache.org/jira/browse/KYLIN-3597 > Project: Kylin > Issue Type: Improvement > Components: Others >Reporter: Shaofeng SHI >Priority: Major > Fix For: v2.6.0 > > -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Commented] (KYLIN-3597) Fix sonar reported static code issues
[ https://issues.apache.org/jira/browse/KYLIN-3597?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16722151#comment-16722151 ] ASF GitHub Bot commented on KYLIN-3597: --- shaofengshi closed pull request #392: KYLIN-3597 fix sonar issues URL: https://github.com/apache/kylin/pull/392 This is a PR merged from a forked repository. As GitHub hides the original diff on merge, it is displayed below for the sake of provenance: As this is a foreign pull request (from a fork), the diff is supplied below (as it won't show otherwise due to GitHub magic): diff --git a/core-common/src/main/java/org/apache/kylin/common/util/SSHClient.java b/core-common/src/main/java/org/apache/kylin/common/util/SSHClient.java index 26729207bb..b5d645952c 100644 --- a/core-common/src/main/java/org/apache/kylin/common/util/SSHClient.java +++ b/core-common/src/main/java/org/apache/kylin/common/util/SSHClient.java @@ -42,6 +42,7 @@ public class SSHClient { protected static final org.slf4j.Logger logger = LoggerFactory.getLogger(SSHClient.class); +private static final String ERROR_IN_CHECK_ACK = "Error in checkAck()"; private String hostname; private int port; @@ -97,7 +98,7 @@ public void scpFileToRemote(String localFile, String remoteTargetDirectory) thro out.write(command.getBytes(StandardCharsets.UTF_8)); out.flush(); if (checkAck(in) != 0) { -throw new Exception("Error in checkAck()"); +throw new Exception(ERROR_IN_CHECK_ACK); } } @@ -115,7 +116,7 @@ public void scpFileToRemote(String localFile, String remoteTargetDirectory) thro out.write(command.getBytes(StandardCharsets.UTF_8)); out.flush(); if (checkAck(in) != 0) { -throw new Exception("Error in checkAck()"); +throw new Exception(ERROR_IN_CHECK_ACK); } // send a content of lfile @@ -134,7 +135,7 @@ public void scpFileToRemote(String localFile, String remoteTargetDirectory) thro out.write(buf, 0, 1); out.flush(); if (checkAck(in) != 0) { -throw new Exception("Error in checkAck()"); +throw new 
Exception(ERROR_IN_CHECK_ACK); } out.close(); diff --git a/core-cube/src/main/java/org/apache/kylin/cube/cuboid/algorithm/CuboidRecommender.java b/core-cube/src/main/java/org/apache/kylin/cube/cuboid/algorithm/CuboidRecommender.java index 057f7e84e2..54c6764023 100644 --- a/core-cube/src/main/java/org/apache/kylin/cube/cuboid/algorithm/CuboidRecommender.java +++ b/core-cube/src/main/java/org/apache/kylin/cube/cuboid/algorithm/CuboidRecommender.java @@ -20,6 +20,7 @@ import java.io.IOException; import java.util.List; +import java.util.Locale; import java.util.Map; import java.util.concurrent.Callable; import java.util.concurrent.ExecutionException; @@ -97,20 +98,20 @@ public static CuboidRecommender getInstance() { true); if (recommendCuboid != null) { -logger.info("Add recommend cuboids for " + key + " to cache"); +logger.info(String.format(Locale.ROOT, "Add recommend cuboids for %s to cache", key)); cuboidRecommendCache.put(key, recommendCuboid); } return recommendCuboid; } catch (Exception e) { cuboidRecommendCache.invalidate(key); -logger.error("Failed to get recommend cuboids for " + key + " in cache", e); +logger.error(String.format(Locale.ROOT, "Failed to get recommend cuboids for %s in cache", key), e); throw e; } } }); } catch (ExecutionException e) { -logger.error("Failed to get recommend cuboids for " + key); +logger.error(String.format(Locale.ROOT, "Failed to get recommend cuboids for %s", key)); } } return results; @@ -121,9 +122,9 @@ public static CuboidRecommender getInstance() { */ public Map getRecommendCuboidList(CuboidStats cuboidStats, KylinConfig kylinConf, boolean ifForceRecommend) { -long Threshold1 = 1L << kylinConf.getCubePlannerAgreedyAlgorithmAutoThreshold(); -long Threshold2 = 1L << kylinConf.getCubePlannerGeneticAlgorithmAutoThreshold(); -if (Threshold1 >= Threshold2) { +long threshold1 = 1L << kylinConf.getCubePlannerAgreedyAlgorithmAutoThreshold(); +long threshold2 = 1L << kylinConf.getCubePlannerGeneticAlgorithmAutoThreshold(); +if 
(threshold1 >=
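The `CuboidRecommender` hunks above replace string-concatenated log messages with `String.format(Locale.ROOT, ...)`. A minimal sketch of the pattern (class and method names are illustrative, not from the PR):

```java
import java.util.Locale;

class LocaleSafeLogging {

    // Sonar flags locale-sensitive formatting: String.format without an explicit
    // Locale uses the JVM default locale, so output can vary between machines.
    // Pinning Locale.ROOT keeps messages (and e.g. %d digit rendering) stable.
    static String recommendMessage(String cubeKey) {
        return String.format(Locale.ROOT, "Add recommend cuboids for %s to cache", cubeKey);
    }
}
```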
[jira] [Commented] (KYLIN-3597) Fix sonar reported static code issues
[ https://issues.apache.org/jira/browse/KYLIN-3597?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16722148#comment-16722148 ] ASF GitHub Bot commented on KYLIN-3597: --- shaofengshi closed pull request #393: KYLIN-3597 improve code smell URL: https://github.com/apache/kylin/pull/393 This is a PR merged from a forked repository. As GitHub hides the original diff on merge, it is displayed below for the sake of provenance: As this is a foreign pull request (from a fork), the diff is supplied below (as it won't show otherwise due to GitHub magic): diff --git a/core-common/src/main/java/org/apache/kylin/common/KylinConfigBase.java b/core-common/src/main/java/org/apache/kylin/common/KylinConfigBase.java index 018552caf1..f67f6b3479 100644 --- a/core-common/src/main/java/org/apache/kylin/common/KylinConfigBase.java +++ b/core-common/src/main/java/org/apache/kylin/common/KylinConfigBase.java @@ -62,6 +62,8 @@ private static final String DEFAULT = "default"; private static final String KYLIN_ENGINE_MR_JOB_JAR = "kylin.engine.mr.job-jar"; private static final String KYLIN_STORAGE_HBASE_COPROCESSOR_LOCAL_JAR = "kylin.storage.hbase.coprocessor-local-jar"; +private static final String FILE_SCHEME = "file:"; +private static final String MAPRFS_SCHEME = "maprfs:"; /* * DON'T DEFINE CONSTANTS FOR PROPERTY KEYS! 
@@ -264,10 +266,10 @@ public String getHdfsWorkingDirectory() { root += "/"; cachedHdfsWorkingDirectory = root; -if (cachedHdfsWorkingDirectory.startsWith("file:")) { -cachedHdfsWorkingDirectory = cachedHdfsWorkingDirectory.replace("file:", "file://"); -} else if (cachedHdfsWorkingDirectory.startsWith("maprfs:")) { -cachedHdfsWorkingDirectory = cachedHdfsWorkingDirectory.replace("maprfs:", "maprfs://"); +if (cachedHdfsWorkingDirectory.startsWith(FILE_SCHEME)) { +cachedHdfsWorkingDirectory = cachedHdfsWorkingDirectory.replace(FILE_SCHEME, "file://"); +} else if (cachedHdfsWorkingDirectory.startsWith(MAPRFS_SCHEME)) { +cachedHdfsWorkingDirectory = cachedHdfsWorkingDirectory.replace(MAPRFS_SCHEME, "maprfs://"); } return cachedHdfsWorkingDirectory; } @@ -302,10 +304,10 @@ public String getMetastoreBigCellHdfsDirectory() { root += "/"; cachedBigCellDirectory = root; -if (cachedBigCellDirectory.startsWith("file:")) { -cachedBigCellDirectory = cachedBigCellDirectory.replace("file:", "file://"); -} else if (cachedBigCellDirectory.startsWith("maprfs:")) { -cachedBigCellDirectory = cachedBigCellDirectory.replace("maprfs:", "maprfs://"); +if (cachedBigCellDirectory.startsWith(FILE_SCHEME)) { +cachedBigCellDirectory = cachedBigCellDirectory.replace(FILE_SCHEME, "file://"); +} else if (cachedBigCellDirectory.startsWith(MAPRFS_SCHEME)) { +cachedBigCellDirectory = cachedBigCellDirectory.replace(MAPRFS_SCHEME, "maprfs://"); } return cachedBigCellDirectory; @@ -411,7 +413,7 @@ public String getMetadataUrlPrefix() { } public boolean isResourceStoreReconnectEnabled() { -return Boolean.parseBoolean(getOptional("kylin.resourcestore.reconnect-enabled", "false")); +return Boolean.parseBoolean(getOptional("kylin.resourcestore.reconnect-enabled", FALSE)); } public int getResourceStoreReconnectBaseMs() { @@ -1445,7 +1447,7 @@ public int getScanThreshold() { } public boolean isLazyQueryEnabled() { -return Boolean.parseBoolean(getOptional("kylin.query.lazy-query-enabled", "false")); +return 
Boolean.parseBoolean(getOptional("kylin.query.lazy-query-enabled", FALSE)); } public long getLazyQueryWaitingTimeoutMilliSeconds() { @@ -1543,7 +1545,7 @@ public String getMemCachedHosts() { } public boolean isQuerySegmentCacheEnabled() { -return Boolean.parseBoolean(getOptional("kylin.query.segment-cache-enabled", "false")); +return Boolean.parseBoolean(getOptional("kylin.query.segment-cache-enabled", FALSE)); } public int getQuerySegmentCacheTimeout() { @@ -1665,7 +1667,7 @@ public String getSQLResponseSignatureClass() { } public boolean isQueryCacheSignatureEnabled() { -return Boolean.parseBoolean(this.getOptional("kylin.query.cache-signature-enabled", "false")); +return Boolean.parseBoolean(this.getOptional("kylin.query.cache-signature-enabled", FALSE)); } // diff --git a/core-common/src/main/java/org/apache/kylin/common/persistence/ContentWriter.java b/core-common/src/main/java/org/apache/kylin/common/persistence/ContentWriter.java index 25420a4d7e..c7d963d40c 100644 ---
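The `FILE_SCHEME`/`MAPRFS_SCHEME` hunks extract the scheme prefixes into constants; the rewrite logic itself is unchanged. A standalone sketch of that normalization (mirroring only the quoted snippet, not the full `getHdfsWorkingDirectory()` method):

```java
class SchemeNormalizer {

    private static final String FILE_SCHEME = "file:";
    private static final String MAPRFS_SCHEME = "maprfs:";

    // Expands a bare "file:" or "maprfs:" prefix to the "scheme://" URI form,
    // as in the quoted KylinConfigBase hunks. Other schemes pass through.
    static String normalize(String dir) {
        if (dir.startsWith(FILE_SCHEME)) {
            return dir.replace(FILE_SCHEME, "file://");
        } else if (dir.startsWith(MAPRFS_SCHEME)) {
            return dir.replace(MAPRFS_SCHEME, "maprfs://");
        }
        return dir;
    }
}
```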
[GitHub] shaofengshi commented on a change in pull request #394: KYLIN-3720 add column family check when save/update cube desc
shaofengshi commented on a change in pull request #394: KYLIN-3720 add column family check when save/update cube desc URL: https://github.com/apache/kylin/pull/394#discussion_r241949448 ## File path: server-base/src/main/java/org/apache/kylin/rest/controller/CubeController.java ## @@ -608,6 +613,26 @@ public CubeRequest saveCubeDesc(@RequestBody CubeRequest cubeRequest) { return cubeRequest; } +//column family metrics may not match the real metrics when editing cube by json +private void validateColumnFamily(CubeDesc cubeDesc){ +Set columnFamilyMetricsSet = Sets.newHashSet(); +for(HBaseColumnFamilyDesc hBaseColumnFamilyDesc : cubeDesc.getHbaseMapping().getColumnFamily()) { +for(HBaseColumnDesc hBaseColumnDesc : hBaseColumnFamilyDesc.getColumns()){ +for(String columnName : hBaseColumnDesc.getMeasureRefs()){ +columnFamilyMetricsSet.add(columnName); +} +} +} +for(MeasureDesc measureDesc : cubeDesc.getMeasures()){ Review comment: Hi Bo, the function looks good, but it seems the code is not formatted. Could you reformat it by following the "IDE code formatter" in https://kylin.apache.org/development/dev_env.html ? Besides, the "merge" commit should be avoided. Thank you for making Kylin better! This is an automated message from the Apache Git Service. To respond to the message, please log on GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] shaofengshi closed pull request #393: KYLIN-3597 improve code smell
shaofengshi closed pull request #393: KYLIN-3597 improve code smell URL: https://github.com/apache/kylin/pull/393 This is a PR merged from a forked repository. As GitHub hides the original diff on merge, it is displayed below for the sake of provenance: As this is a foreign pull request (from a fork), the diff is supplied below (as it won't show otherwise due to GitHub magic): diff --git a/core-common/src/main/java/org/apache/kylin/common/KylinConfigBase.java b/core-common/src/main/java/org/apache/kylin/common/KylinConfigBase.java index 018552caf1..f67f6b3479 100644 --- a/core-common/src/main/java/org/apache/kylin/common/KylinConfigBase.java +++ b/core-common/src/main/java/org/apache/kylin/common/KylinConfigBase.java @@ -62,6 +62,8 @@ private static final String DEFAULT = "default"; private static final String KYLIN_ENGINE_MR_JOB_JAR = "kylin.engine.mr.job-jar"; private static final String KYLIN_STORAGE_HBASE_COPROCESSOR_LOCAL_JAR = "kylin.storage.hbase.coprocessor-local-jar"; +private static final String FILE_SCHEME = "file:"; +private static final String MAPRFS_SCHEME = "maprfs:"; /* * DON'T DEFINE CONSTANTS FOR PROPERTY KEYS! 
@@ -264,10 +266,10 @@ public String getHdfsWorkingDirectory() { root += "/"; cachedHdfsWorkingDirectory = root; -if (cachedHdfsWorkingDirectory.startsWith("file:")) { -cachedHdfsWorkingDirectory = cachedHdfsWorkingDirectory.replace("file:", "file://"); -} else if (cachedHdfsWorkingDirectory.startsWith("maprfs:")) { -cachedHdfsWorkingDirectory = cachedHdfsWorkingDirectory.replace("maprfs:", "maprfs://"); +if (cachedHdfsWorkingDirectory.startsWith(FILE_SCHEME)) { +cachedHdfsWorkingDirectory = cachedHdfsWorkingDirectory.replace(FILE_SCHEME, "file://"); +} else if (cachedHdfsWorkingDirectory.startsWith(MAPRFS_SCHEME)) { +cachedHdfsWorkingDirectory = cachedHdfsWorkingDirectory.replace(MAPRFS_SCHEME, "maprfs://"); } return cachedHdfsWorkingDirectory; } @@ -302,10 +304,10 @@ public String getMetastoreBigCellHdfsDirectory() { root += "/"; cachedBigCellDirectory = root; -if (cachedBigCellDirectory.startsWith("file:")) { -cachedBigCellDirectory = cachedBigCellDirectory.replace("file:", "file://"); -} else if (cachedBigCellDirectory.startsWith("maprfs:")) { -cachedBigCellDirectory = cachedBigCellDirectory.replace("maprfs:", "maprfs://"); +if (cachedBigCellDirectory.startsWith(FILE_SCHEME)) { +cachedBigCellDirectory = cachedBigCellDirectory.replace(FILE_SCHEME, "file://"); +} else if (cachedBigCellDirectory.startsWith(MAPRFS_SCHEME)) { +cachedBigCellDirectory = cachedBigCellDirectory.replace(MAPRFS_SCHEME, "maprfs://"); } return cachedBigCellDirectory; @@ -411,7 +413,7 @@ public String getMetadataUrlPrefix() { } public boolean isResourceStoreReconnectEnabled() { -return Boolean.parseBoolean(getOptional("kylin.resourcestore.reconnect-enabled", "false")); +return Boolean.parseBoolean(getOptional("kylin.resourcestore.reconnect-enabled", FALSE)); } public int getResourceStoreReconnectBaseMs() { @@ -1445,7 +1447,7 @@ public int getScanThreshold() { } public boolean isLazyQueryEnabled() { -return Boolean.parseBoolean(getOptional("kylin.query.lazy-query-enabled", "false")); +return 
Boolean.parseBoolean(getOptional("kylin.query.lazy-query-enabled", FALSE)); } public long getLazyQueryWaitingTimeoutMilliSeconds() { @@ -1543,7 +1545,7 @@ public String getMemCachedHosts() { } public boolean isQuerySegmentCacheEnabled() { -return Boolean.parseBoolean(getOptional("kylin.query.segment-cache-enabled", "false")); +return Boolean.parseBoolean(getOptional("kylin.query.segment-cache-enabled", FALSE)); } public int getQuerySegmentCacheTimeout() { @@ -1665,7 +1667,7 @@ public String getSQLResponseSignatureClass() { } public boolean isQueryCacheSignatureEnabled() { -return Boolean.parseBoolean(this.getOptional("kylin.query.cache-signature-enabled", "false")); +return Boolean.parseBoolean(this.getOptional("kylin.query.cache-signature-enabled", FALSE)); } // diff --git a/core-common/src/main/java/org/apache/kylin/common/persistence/ContentWriter.java b/core-common/src/main/java/org/apache/kylin/common/persistence/ContentWriter.java index 25420a4d7e..c7d963d40c 100644 --- a/core-common/src/main/java/org/apache/kylin/common/persistence/ContentWriter.java +++ b/core-common/src/main/java/org/apache/kylin/common/persistence/ContentWriter.java @@ -71,17 +71,11 @@ public long bytesWritten() { } public byte[] extractAllBytes() throws IOException { -
[GitHub] coveralls commented on issue #394: KYLIN-3720 add column family check when save/update cube desc
coveralls commented on issue #394: KYLIN-3720 add column family check when save/update cube desc URL: https://github.com/apache/kylin/pull/394#issuecomment-447548175

## Pull Request Test Coverage Report for [Build 3966](https://coveralls.io/builds/20650531)

* **0** of **14** **(0.0%)** changed or added relevant lines in **1** file are covered.
* **1** unchanged line in **1** file lost coverage.
* Overall coverage decreased (**-0.004%**) to **25.805%**

| Changes Missing Coverage | Covered Lines | Changed/Added Lines | % |
| :- | --: | --: | --: |
| [server-base/src/main/java/org/apache/kylin/rest/controller/CubeController.java](https://coveralls.io/builds/20650531/source?filename=server-base%2Fsrc%2Fmain%2Fjava%2Forg%2Fapache%2Fkylin%2Frest%2Fcontroller%2FCubeController.java#L595) | 0 | 14 | 0.0% |

| Files with Coverage Reduction | New Missed Lines | % |
| :- | --: | --: |
| [server-base/src/main/java/org/apache/kylin/rest/util/QueryRequestLimits.java](https://coveralls.io/builds/20650531/source?filename=server-base%2Fsrc%2Fmain%2Fjava%2Forg%2Fapache%2Fkylin%2Frest%2Futil%2FQueryRequestLimits.java#L72) | 1 | 47.62% |

| Totals | [![Coverage Status](https://coveralls.io/builds/20650531/badge)](https://coveralls.io/builds/20650531) |
| :-- | --: |
| Change from base [Build 3965](https://coveralls.io/builds/20649148): | -0.004% |
| Covered Lines: | 17823 |
| Relevant Lines: | 69069 |

[Coveralls](https://coveralls.io)
[jira] [Updated] (KYLIN-3720) add column family check when save/update cube desc
[ https://issues.apache.org/jira/browse/KYLIN-3720?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] WangBo updated KYLIN-3720: -- Attachment: KYLIN-3720.patch > add column family check when save/update cube desc > -- > > Key: KYLIN-3720 > URL: https://issues.apache.org/jira/browse/KYLIN-3720 > Project: Kylin > Issue Type: Improvement >Reporter: WangBo >Assignee: WangBo >Priority: Major > Attachments: KYLIN-3720.patch > > > When updating or creating a cube desc by editing its JSON, queries may fail if an input column of a column family is invalid. Checking the user-supplied columns against the columns defined in the cube desc avoids this. > {code:java} > // query failed error > Caused by: java.lang.IllegalStateException > at > org.apache.kylin.gridtable.GTInfo.validateColumnBlocks(GTInfo.java:198) > at org.apache.kylin.gridtable.GTInfo.validate(GTInfo.java:167) > at org.apache.kylin.gridtable.GTInfo$Builder.build(GTInfo.java:269) > at > org.apache.kylin.cube.gridtable.CubeGridTable.newGTInfo(CubeGridTable.java:35) > at > org.apache.kylin.storage.gtrecord.CubeScanRangePlanner.(CubeScanRangePlanner.java:89) > at > org.apache.kylin.storage.gtrecord.CubeSegmentScanner.(CubeSegmentScanner.java:73) > at > org.apache.kylin.storage.gtrecord.GTCubeStorageQueryBase.search(GTCubeStorageQueryBase.java:89) > at > org.apache.kylin.query.enumerator.OLAPEnumerator.queryStorage(OLAPEnumerator.java:120) > at > org.apache.kylin.query.enumerator.OLAPEnumerator.moveNext(OLAPEnumerator.java:64){code}
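The check proposed in the patch compares the measure refs collected from the HBase column families against the measures declared in the cube desc. A simplified, self-contained sketch of that comparison (plain string lists stand in for the real `CubeDesc`/`HBaseColumnFamilyDesc` model; names are illustrative):

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

class ColumnFamilyChecker {

    // Simplified stand-in for the PR's validateColumnFamily: the real version
    // walks CubeDesc -> HBaseColumnFamilyDesc -> measure refs; here both sides
    // are already flattened to name lists. A mismatch is rejected at save time,
    // instead of surfacing later as the IllegalStateException quoted above.
    static void validate(List<String> columnFamilyMeasureRefs, List<String> cubeMeasures) {
        Set<String> refs = new HashSet<>(columnFamilyMeasureRefs);
        Set<String> defined = new HashSet<>(cubeMeasures);
        if (!refs.equals(defined)) {
            throw new IllegalArgumentException(
                    "column family measure refs " + refs + " do not match cube measures " + defined);
        }
    }
}
```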