[jira] [Updated] (KYLIN-3667) ArrayIndexOutOfBoundsException in NDCuboidBuilder

2018-11-06 Thread Hubert STEFANI (JIRA)


 [ 
https://issues.apache.org/jira/browse/KYLIN-3667?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hubert STEFANI updated KYLIN-3667:
--
Attachment: cube.json

> ArrayIndexOutOfBoundsException in NDCuboidBuilder
> -------------------------------------------------
>
> Key: KYLIN-3667
> URL: https://issues.apache.org/jira/browse/KYLIN-3667
> Project: Kylin
>  Issue Type: Bug
>  Components: Spark Engine
>Affects Versions: v2.5.0
> Environment: AWS EMR 
>Reporter: Hubert STEFANI
>Priority: Major
> Attachments: cube.json
>
>
> The errors previously reported in
> https://issues.apache.org/jira/browse/KYLIN-3115
> and
> https://issues.apache.org/jira/browse/KYLIN-1768
> still occur in the SparkCubingByLayer step.
> We encounter the following error:
> java.lang.ArrayIndexOutOfBoundsException
>     at java.lang.System.arraycopy(Native Method)
>     at org.apache.kylin.engine.mr.common.NDCuboidBuilder.buildKeyInternal(NDCuboidBuilder.java:106)
>     at org.apache.kylin.engine.mr.common.NDCuboidBuilder.buildKey2(NDCuboidBuilder.java:87)
>     at org.apache.kylin.engine.spark.SparkCubingByLayer$CuboidFlatMap.call(SparkCubingByLayer.java:425)
>     at org.apache.kylin.engine.spark.SparkCubingByLayer$CuboidFlatMap.call(SparkCubingByLayer.java:370)
>     at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$3$1.apply(JavaRDDLike.scala:143)
>     at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$3$1.apply(JavaRDDLike.scala:143)
>     at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
>     at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
>     at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:191)
>     at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:63)
>     at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
>     at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
>     at org.apache.spark.scheduler.Task.run(Task.scala:99)
>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>  
> Do we have to (painfully) change the dimension sizes, or should this be fixed 
> through a patch?
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (KYLIN-3667) ArrayIndexOutOfBoundsException in NDCuboidBuilder

2018-11-05 Thread Hubert STEFANI (JIRA)


 [ 
https://issues.apache.org/jira/browse/KYLIN-3667?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hubert STEFANI updated KYLIN-3667:
--
Description: 
The errors previously reported in

https://issues.apache.org/jira/browse/KYLIN-3115

and

https://issues.apache.org/jira/browse/KYLIN-1768

still occur in the SparkCubingByLayer step.

We encounter the following error:

java.lang.ArrayIndexOutOfBoundsException
    at java.lang.System.arraycopy(Native Method)
    at org.apache.kylin.engine.mr.common.NDCuboidBuilder.buildKeyInternal(NDCuboidBuilder.java:106)
    at org.apache.kylin.engine.mr.common.NDCuboidBuilder.buildKey2(NDCuboidBuilder.java:87)
    at org.apache.kylin.engine.spark.SparkCubingByLayer$CuboidFlatMap.call(SparkCubingByLayer.java:425)
    at org.apache.kylin.engine.spark.SparkCubingByLayer$CuboidFlatMap.call(SparkCubingByLayer.java:370)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$3$1.apply(JavaRDDLike.scala:143)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$3$1.apply(JavaRDDLike.scala:143)
    at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
    at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
    at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:191)
    at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:63)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
    at org.apache.spark.scheduler.Task.run(Task.scala:99)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)

 

Do we have to (painfully) change the dimension sizes, or should this be fixed 
through a patch?
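For readers triaging this: the failure mode in the trace is System.arraycopy being asked to copy past the end of a fixed-size destination buffer, which is what happens in NDCuboidBuilder.buildKeyInternal when the rowkey buffer is too small for the combined encoded dimensions. Below is a minimal, self-contained sketch of that mechanism — it is NOT Kylin code, and the name copyIntoRowkey is invented for illustration:

```java
public class ArrayCopyDemo {
    // Copies an encoded dimension value into a rowkey buffer at the given
    // offset. System.arraycopy throws ArrayIndexOutOfBoundsException when
    // offset + value.length exceeds the destination buffer's capacity --
    // the same exception seen at the top of the reported stack trace.
    static void copyIntoRowkey(byte[] value, byte[] rowkeyBuf, int offset) {
        System.arraycopy(value, 0, rowkeyBuf, offset, value.length);
    }

    public static void main(String[] args) {
        byte[] value = new byte[16]; // e.g. one widely-encoded dimension
        byte[] small = new byte[8];  // undersized rowkey buffer
        try {
            copyIntoRowkey(value, small, 4); // 4 + 16 > 8 -> exception
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("ArrayIndexOutOfBoundsException");
        }
    }
}
```

This suggests the workaround of shrinking dimension encodings only masks the problem; whether the buffer should instead be sized from the cube's actual rowkey width is the question for a patch.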

 

