[ https://issues.apache.org/jira/browse/KYLIN-3607?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16722406#comment-16722406 ]

ASF GitHub Bot commented on KYLIN-3607:
---------------------------------------

shaofengshi closed pull request #395: KYLIN-3607 add hbase-hadoop*-compat*.jar
URL: https://github.com/apache/kylin/pull/395

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

diff --git a/storage-hbase/src/main/java/org/apache/kylin/storage/hbase/steps/HBaseSparkSteps.java b/storage-hbase/src/main/java/org/apache/kylin/storage/hbase/steps/HBaseSparkSteps.java
index ccab22f878..86ad0fbe5d 100644
--- a/storage-hbase/src/main/java/org/apache/kylin/storage/hbase/steps/HBaseSparkSteps.java
+++ b/storage-hbase/src/main/java/org/apache/kylin/storage/hbase/steps/HBaseSparkSteps.java
@@ -66,6 +66,11 @@ public AbstractExecutable createConvertCuboidToHfileStep(String jobId) {
         StringUtil.appendWithSeparator(jars, ClassUtil.findContainingJar("org.apache.htrace.Trace", null)); // htrace-core.jar
         StringUtil.appendWithSeparator(jars,
                 ClassUtil.findContainingJar("com.yammer.metrics.core.MetricsRegistry", null)); // metrics-core.jar
+        //KYLIN-3607
+        StringUtil.appendWithSeparator(jars,
+                ClassUtil.findContainingJar("org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory", null));//hbase-hadoop-compat-1.1.1.jar
+        StringUtil.appendWithSeparator(jars,
+                ClassUtil.findContainingJar("org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactoryImpl", null));//hbase-hadoop2-compat-1.1.1.jar
 
         StringUtil.appendWithSeparator(jars, seg.getConfig().getSparkAdditionalJars());
         sparkExecutable.setJars(jars.toString());
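
For readers unfamiliar with the mechanism: the fix works by resolving a
class name to the jar that provides it, then shipping that jar with the
Spark job. Below is a minimal standalone sketch of the underlying
classloader technique; it is an illustration only (the class name
FindJarSketch is an assumption), not Kylin's actual ClassUtil code.

    // Resolve a class name to its containing jar via the classloader,
    // the same idea behind ClassUtil.findContainingJar in the diff above.
    import java.net.URL;

    public class FindJarSketch {
        static String findContainingJar(String className) {
            String resource = className.replace('.', '/') + ".class";
            URL url = Thread.currentThread().getContextClassLoader().getResource(resource);
            if (url != null && "jar".equals(url.getProtocol())) {
                // A jar URL looks like: jar:file:/path/to/x.jar!/com/example/X.class
                String path = url.getPath();
                return path.substring("file:".length(), path.indexOf('!'));
            }
            return null; // not found, or loaded from a plain class directory
        }

        public static void main(String[] args) {
            // Prints the jar path when hbase-hadoop-compat is on the
            // classpath, else null.
            System.out.println(findContainingJar(
                    "org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory"));
        }
    }

Until a release containing this fix is deployed, the same jars can
presumably be appended by hand through the
kylin.engine.spark.additional-jars property, which is what
seg.getConfig().getSparkAdditionalJars() reads in the diff above
(property name taken from Kylin's configuration; verify against your
version).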


 

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> can't build cube with spark in v2.5.0
> -------------------------------------
>
>                 Key: KYLIN-3607
>                 URL: https://issues.apache.org/jira/browse/KYLIN-3607
>             Project: Kylin
>          Issue Type: Bug
>          Components: Storage - HBase
>    Affects Versions: v2.5.0
>            Reporter: ANIL KUMAR
>            Assignee: Lijun Cao
>            Priority: Major
>             Fix For: v2.6.0
>
>
> In Kylin v2.5.0, the cube build fails at step 8, Convert Cuboid Data to
> HFile; the following is the related exception:
>  
> ERROR yarn.ApplicationMaster: User class threw exception: java.lang.RuntimeException: error execute org.apache.kylin.storage.hbase.steps.SparkCubeHFile. Root cause: Job aborted due to stage failure: Task 0 in stage 1.0 failed 4 times, java.lang.ExceptionInInitializerError
>  at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.getNewWriter(HFileOutputFormat2.java:247)
>  at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:194)
>  at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:152)
>  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply$mcV$sp(PairRDDFunctions.scala:1125)
>  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1123)
>  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1123)
>  at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1353)
>  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1131)
>  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1102)
>  at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
>  at org.apache.spark.scheduler.Task.run(Task.scala:99)
>  at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
>  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>  at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.RuntimeException: Could not create interface org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?
>  at org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:73)
>  at org.apache.hadoop.hbase.io.MetricsIO.<init>(MetricsIO.java:31)
>  at org.apache.hadoop.hbase.io.hfile.HFile.<clinit>(HFile.java:192)
>  ... 15 more
> Caused by: java.util.NoSuchElementException
>  at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:365)
>  at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
>  at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
>  at org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:59)
>  ... 17 more
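
The innermost NoSuchElementException pins down the root cause: HBase's
CompatibilitySingletonFactory looks up MetricsRegionServerSourceFactory
through java.util.ServiceLoader, and when no provider jar
(hbase-hadoop2-compat) is on the executor classpath the provider iterator
is empty. The minimal sketch below reproduces that failure mode with a
made-up service interface (DemoFactory is an assumption for illustration):

    // An empty java.util.ServiceLoader iterator throws NoSuchElementException
    // on next(), which is exactly what surfaces when the hbase-hadoop2-compat
    // provider jar is missing from the classpath.
    import java.util.Iterator;
    import java.util.ServiceLoader;

    public class ServiceLoaderSketch {
        // Hypothetical service interface; no META-INF/services entry exists for it.
        public interface DemoFactory {}

        public static void main(String[] args) {
            Iterator<DemoFactory> it = ServiceLoader.load(DemoFactory.class).iterator();
            // No provider is registered, so hasNext() is false and next()
            // throws java.util.NoSuchElementException, mirroring the bottom
            // of the stack trace above.
            it.next();
        }
    }

This also explains why adding hbase-hadoop-compat and hbase-hadoop2-compat
to the Spark job's jar list, as the merged pull request does, resolves the
error.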



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
