[ https://issues.apache.org/jira/browse/CARBONDATA-1421?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16144925#comment-16144925 ]

Pallavi Singh commented on CARBONDATA-1421:
-------------------------------------------

Hi Zhichao Zhang,

I tried to resolve the issue by rebasing on the latest master with the above PR 
applied, but compaction still fails with the following stack trace:

17/08/29 14:00:26 AUDIT CarbonDataRDDFactory$: [pallavi][pallavi][Thread-1]Data load is successful for default.carbon_table
17/08/29 14:00:26 AUDIT CarbonDataRDDFactory$: [pallavi][pallavi][Thread-1]Compaction request received for table default.carbon_table
17/08/29 14:00:26 ERROR CarbonMergerRDD: [Executor task launch worker-1][partitionID:default_carbon_table_253fb994-7e4a-4ce0-9fb4-35fb687d44f9]
java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
        at java.util.ArrayList.rangeCheck(ArrayList.java:653)
        at java.util.ArrayList.get(ArrayList.java:429)
        at org.apache.carbondata.core.datastore.block.SegmentProperties.assignComplexOrdinal(SegmentProperties.java:472)
        at org.apache.carbondata.core.datastore.block.SegmentProperties.fillDimensionAndMeasureDetails(SegmentProperties.java:397)
        at org.apache.carbondata.core.datastore.block.SegmentProperties.<init>(SegmentProperties.java:173)
        at org.apache.carbondata.spark.rdd.CarbonMergerRDD$$anon$1.<init>(CarbonMergerRDD.scala:161)
        at org.apache.carbondata.spark.rdd.CarbonMergerRDD.internalCompute(CarbonMergerRDD.scala:79)
        at org.apache.carbondata.spark.rdd.CarbonRDD.compute(CarbonRDD.scala:62)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
        at org.apache.spark.scheduler.Task.run(Task.scala:99)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:748)
17/08/29 14:00:26 ERROR Executor: Exception in task 0.0 in stage 13.0 (TID 17)
java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
        ... (same stack trace as above)
17/08/29 14:00:26 WARN TaskSetManager: Lost task 0.0 in stage 13.0 (TID 17, localhost, executor driver): java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
        ... (same stack trace as above)

17/08/29 14:00:26 ERROR TaskSetManager: Task 0 in stage 13.0 failed 1 times; aborting job
17/08/29 14:00:26 ERROR DataManagementFunc$: main Exception in compaction thread org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 13.0 failed 1 times, most recent failure: Lost task 0.0 in stage 13.0 (TID 17, localhost, executor driver): java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
        ... (same stack trace as above)

Driver stacktrace:
java.util.concurrent.ExecutionException: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 13.0 failed 1 times, most recent failure: Lost task 0.0 in stage 13.0 (TID 17, localhost, executor driver): java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
        ... (same stack trace as above)

Driver stacktrace:
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.util.concurrent.FutureTask.get(FutureTask.java:192)
        at org.apache.carbondata.spark.rdd.DataManagementFunc$$anonfun$executeCompaction$1.apply(DataManagementFunc.scala:193)
        at org.apache.carbondata.spark.rdd.DataManagementFunc$$anonfun$executeCompaction$1.apply(DataManagementFunc.scala:192)
        at scala.collection.Iterator$class.foreach(Iterator.scala:893)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
        at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
        at org.apache.carbondata.spark.rdd.DataManagementFunc$.executeCompaction(DataManagementFunc.scala:192)
        at org.apache.carbondata.spark.rdd.CarbonDataRDDFactory$$anon$2.run(CarbonDataRDDFactory.scala:273)
        at org.apache.carbondata.spark.rdd.CarbonDataRDDFactory$.startCompactionThreads(CarbonDataRDDFactory.scala:364)
        at org.apache.carbondata.spark.rdd.CarbonDataRDDFactory$.handleSegmentMerging$1(CarbonDataRDDFactory.scala:493)
        at org.apache.carbondata.spark.rdd.CarbonDataRDDFactory$.loadCarbonData(CarbonDataRDDFactory.scala:1045)
        at org.apache.spark.sql.execution.command.LoadTable.processData(carbonTableSchema.scala:891)
        at org.apache.spark.sql.execution.command.LoadTable.run(carbonTableSchema.scala:613)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
        at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:87)
        at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:87)
        at org.apache.spark.sql.Dataset.<init>(Dataset.scala:185)
        at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
        at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:592)
        at org.apache.carbondata.examples.CarbonSessionExample$.main(CarbonSessionExample.scala:108)
        at org.apache.carbondata.examples.CarbonSessionExample.main(CarbonSessionExample.scala)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 13.0 failed 1 times, most recent failure: Lost task 0.0 in stage 13.0 (TID 17, localhost, executor driver): java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
        ... (same stack trace as above)

Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1435)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1423)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1422)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1422)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1650)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1605)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1594)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1918)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1931)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1944)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1958)
        at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:935)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
        at org.apache.spark.rdd.RDD.collect(RDD.scala:934)
        at org.apache.carbondata.spark.rdd.Compactor$.triggerCompaction(Compactor.scala:100)
        at org.apache.carbondata.spark.rdd.Compactor.triggerCompaction(Compactor.scala)
        at org.apache.carbondata.spark.compaction.CompactionCallable.call(CompactionCallable.java:40)
        at org.apache.carbondata.spark.compaction.CompactionCallable.call(CompactionCallable.java:29)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
        ... (same stack trace as above)
        ... 3 more
17/08/29 14:00:26 ERROR CarbonDataRDDFactory$: main Exception in compaction thread org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 13.0 failed 1 times, most recent failure: Lost task 0.0 in stage 13.0 (TID 17, localhost, executor driver): java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
        ... (same stack trace as above)

Driver stacktrace:
17/08/29 14:00:26 ERROR CarbonDataRDDFactory$: main Exception in start compaction thread. Exception in compaction org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 13.0 failed 1 times, most recent failure: Lost task 0.0 in stage 13.0 (TID 17, localhost, executor driver): java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
        ... (same stack trace as above)

Driver stacktrace:
17/08/29 14:00:26 ERROR LoadTable: main
java.lang.Exception: Dataload is success. Auto-Compaction has failed. Please check logs.
        at org.apache.carbondata.spark.rdd.CarbonDataRDDFactory$.loadCarbonData(CarbonDataRDDFactory.scala:1048)
        at org.apache.spark.sql.execution.command.LoadTable.processData(carbonTableSchema.scala:891)
        at org.apache.spark.sql.execution.command.LoadTable.run(carbonTableSchema.scala:613)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
        at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:87)
        at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:87)
        at org.apache.spark.sql.Dataset.<init>(Dataset.scala:185)
        at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
        at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:592)
        at org.apache.carbondata.examples.CarbonSessionExample$.main(CarbonSessionExample.scala:108)
        at org.apache.carbondata.examples.CarbonSessionExample.main(CarbonSessionExample.scala)
17/08/29 14:00:26 AUDIT LoadTable: [pallavi][pallavi][Thread-1]Dataload failure for default.carbon_table. Please check the logs
Exception in thread "main" java.lang.Exception: Dataload is success. Auto-Compaction has failed. Please check logs.
        ... (same stack trace as the LoadTable error above)
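
For what it's worth, the top frames show SegmentProperties.assignComplexOrdinal calling ArrayList.get(0) on a list whose size is 0, which is exactly what yields "Index: 0, Size: 0". A minimal plain-Java sketch of that failure mode (not CarbonData code; the list name is hypothetical):

    import java.util.ArrayList;
    import java.util.List;

    public class EmptyListRepro {
        public static void main(String[] args) {
            // Hypothetical stand-in for the complex-dimension child list that
            // assignComplexOrdinal appears to index into; in this run it is
            // evidently empty when the merge task rebuilds SegmentProperties.
            List<String> complexChildDims = new ArrayList<>();
            // Throws java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
            // -- the same exception and message as in the trace above.
            System.out.println(complexChildDims.get(0));
        }
    }

If that reading is right, the lookup in assignComplexOrdinal needs a guard for dimensions that have no complex children.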

> Auto Compaction Failing in CarbonData Loading
> ---------------------------------------------
>
>                 Key: CARBONDATA-1421
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1421
>             Project: CarbonData
>          Issue Type: Bug
>          Components: data-load
>    Affects Versions: 1.2.0
>            Reporter: Pallavi Singh
>             Fix For: 1.2.0
>
>
> I ran the create query followed by multiple load queries and the 
> auto-compaction is failing.
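> Auto-compaction for these loads is enabled through carbon.properties; a minimal sketch of that setup (illustrative values; carbon.compaction.level.threshold is shown at its CarbonData default, not necessarily what this run used):
>
>     # carbon.properties -- trigger compaction automatically after each load
>     carbon.enable.auto.load.merge=true
>     # start minor compaction once 4 level-1 segments (or 3 level-2 segments) exist
>     carbon.compaction.level.threshold=4,3
>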
> 0: jdbc:hive2://localhost:10000> LOAD DATA inpath 'hdfs://localhost:54310/data/4000_UniqData.csv' INTO table uniqdata options('DELIMITER'=',', 'FILEHEADER'='CUST_ID, CUST_NAME, ACTIVE_EMUI_VERSION, DOB, DOJ, BIGINT_COLUMN1, BIGINT_COLUMN2, DECIMAL_COLUMN1, DECIMAL_COLUMN2, Double_COLUMN1, Double_COLUMN2, INTEGER_COLUMN1');
> +---------+--+
> | Result  |
> +---------+--+
> +---------+--+
> No rows selected (1.183 seconds)
> 0: jdbc:hive2://localhost:10000> LOAD DATA inpath 'hdfs://localhost:54310/data/5000_UniqData.csv' INTO table uniqdata options('DELIMITER'=',', 'FILEHEADER'='CUST_ID, CUST_NAME, ACTIVE_EMUI_VERSION, DOB, DOJ, BIGINT_COLUMN1, BIGINT_COLUMN2, DECIMAL_COLUMN1, DECIMAL_COLUMN2, Double_COLUMN1, Double_COLUMN2, INTEGER_COLUMN1');
> Error: java.lang.Exception: Dataload is success. Auto-Compaction has failed. Please check logs. (state=,code=0)
> 0: jdbc:hive2://localhost:10000> LOAD DATA inpath 'hdfs://localhost:54310/data/7000_UniqData.csv' INTO table uniqdata options('DELIMITER'=',', 'FILEHEADER'='CUST_ID, CUST_NAME, ACTIVE_EMUI_VERSION, DOB, DOJ, BIGINT_COLUMN1, BIGINT_COLUMN2, DECIMAL_COLUMN1, DECIMAL_COLUMN2, Double_COLUMN1, Double_COLUMN2, INTEGER_COLUMN1');
> Error: java.lang.Exception: Dataload is success. Auto-Compaction has failed. Please check logs. (state=,code=0)
> 0: jdbc:hive2://localhost:10000> show segments for table uniqdata;
> +--------------------+----------+--------------------------+--------------------------+--+
> | SegmentSequenceId  |  Status  |     Load Start Time      |      Load End Time       |
> +--------------------+----------+--------------------------+--------------------------+--+
> | 4                  | Success  | 2017-08-29 10:37:13.053  | 2017-08-29 10:37:13.888  |
> | 3                  | Success  | 2017-08-29 10:36:57.851  | 2017-08-29 10:36:59.08   |
> | 2                  | Success  | 2017-08-29 10:36:49.439  | 2017-08-29 10:36:50.373  |
> | 1                  | Success  | 2017-08-29 10:36:37.365  | 2017-08-29 10:36:38.768  |
> | 0                  | Success  | 2017-08-29 10:36:21.011  | 2017-08-29 10:36:26.1    |
> +--------------------+----------+--------------------------+--------------------------+--+
> 5 rows selected (0.099 seconds)
> 0: jdbc:hive2://localhost:10000> LOAD DATA inpath 'hdfs://localhost:54310/data/7000_UniqData.csv' INTO table uniqdata options('DELIMITER'=',', 'FILEHEADER'='CUST_ID, CUST_NAME, ACTIVE_EMUI_VERSION, DOB, DOJ, BIGINT_COLUMN1, BIGINT_COLUMN2, DECIMAL_COLUMN1, DECIMAL_COLUMN2, Double_COLUMN1, Double_COLUMN2, INTEGER_COLUMN1');
> Error: java.lang.Exception: Dataload is success. Auto-Compaction has failed. Please check logs. (state=,code=0)
> 0: jdbc:hive2://localhost:10000> show segments for table uniqdata;
> +--------------------+----------+--------------------------+--------------------------+--+
> | SegmentSequenceId  |  Status  |     Load Start Time      |      Load End Time       |
> +--------------------+----------+--------------------------+--------------------------+--+
> | 5                  | Success  | 2017-08-29 10:38:15.727  | 2017-08-29 10:38:16.548  |
> | 4                  | Success  | 2017-08-29 10:37:13.053  | 2017-08-29 10:37:13.888  |
> | 3                  | Success  | 2017-08-29 10:36:57.851  | 2017-08-29 10:36:59.08   |
> | 2                  | Success  | 2017-08-29 10:36:49.439  | 2017-08-29 10:36:50.373  |
> | 1                  | Success  | 2017-08-29 10:36:37.365  | 2017-08-29 10:36:38.768  |
> | 0                  | Success  | 2017-08-29 10:36:21.011  | 2017-08-29 10:36:26.1    |
> +--------------------+----------+--------------------------+--------------------------+--+


