See <https://builds.apache.org/job/carbondata-master-spark-2.2/871/display/redirect?page=changes>

Changes:

[jacky.likun] [CARBONDATA-2539]Fix mv classcast exception issue

------------------------------------------
[...truncated 83.78 MB...]
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
18/08/07 08:07:40 ERROR BypassMergeSortShuffleWriter: Error while deleting file /tmp/blockmgr-f9d21bb2-92d4-40c7-be53-b1d72b277171/3f/temp_shuffle_e51dfb34-0e2c-4514-ac70-41eca1bade11
18/08/07 08:07:40 ERROR Executor: Exception in task 0.0 in stage 170.0 (TID 388)
java.io.FileNotFoundException: /tmp/blockmgr-f9d21bb2-92d4-40c7-be53-b1d72b277171/3f/temp_shuffle_e51dfb34-0e2c-4514-ac70-41eca1bade11 (No such file or directory)
        at java.io.FileOutputStream.open0(Native Method)
        at java.io.FileOutputStream.open(FileOutputStream.java:270)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
        at org.apache.spark.storage.DiskBlockObjectWriter.initialize(DiskBlockObjectWriter.scala:103)
        at org.apache.spark.storage.DiskBlockObjectWriter.open(DiskBlockObjectWriter.scala:116)
        at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:237)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
        at org.apache.spark.scheduler.Task.run(Task.scala:108)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
18/08/07 08:07:40 ERROR TaskSetManager: Task 0 in stage 170.0 failed 1 times; aborting job
- LuceneDataMapExample *** FAILED ***
  org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 170.0 failed 1 times, most recent failure: Lost task 0.0 in stage 170.0 (TID 388, localhost, executor driver): java.io.FileNotFoundException: /tmp/blockmgr-f9d21bb2-92d4-40c7-be53-b1d72b277171/3f/temp_shuffle_e51dfb34-0e2c-4514-ac70-41eca1bade11 (No such file or directory)
   at java.io.FileOutputStream.open0(Native Method)
   at java.io.FileOutputStream.open(FileOutputStream.java:270)
   at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
   at org.apache.spark.storage.DiskBlockObjectWriter.initialize(DiskBlockObjectWriter.scala:103)
   at org.apache.spark.storage.DiskBlockObjectWriter.open(DiskBlockObjectWriter.scala:116)
   at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:237)
   at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151)
   at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
   at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
   at org.apache.spark.scheduler.Task.run(Task.scala:108)
   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:
  at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1517)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1505)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1504)
  at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
  at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1504)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
  at scala.Option.foreach(Option.scala:257)
  at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)
  ...
  Cause: java.io.FileNotFoundException: /tmp/blockmgr-f9d21bb2-92d4-40c7-be53-b1d72b277171/3f/temp_shuffle_e51dfb34-0e2c-4514-ac70-41eca1bade11 (No such file or directory)
  at java.io.FileOutputStream.open0(Native Method)
  at java.io.FileOutputStream.open(FileOutputStream.java:270)
  at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
  at org.apache.spark.storage.DiskBlockObjectWriter.initialize(DiskBlockObjectWriter.scala:103)
  at org.apache.spark.storage.DiskBlockObjectWriter.open(DiskBlockObjectWriter.scala:116)
  at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:237)
  at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151)
  at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
  at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
  at org.apache.spark.scheduler.Task.run(Task.scala:108)
  ...
18/08/07 08:07:40 AUDIT CarbonCreateTableCommand: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [origin_table]
18/08/07 08:07:40 AUDIT CarbonCreateTableCommand: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [origin_table]
18/08/07 08:07:41 AUDIT CarbonDataRDDFactory$: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.origin_table
18/08/07 08:07:41 ERROR DataLoadExecutor: [Executor task launch worker for task 389][partitionID:table;queryID:2530312221156261] Data Load is partially success for table origin_table
18/08/07 08:07:41 AUDIT CarbonDataRDDFactory$: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.origin_table
18/08/07 08:07:41 AUDIT MergeIndexEventListener: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Load post status event-listener called for merge index
18/08/07 08:07:41 AUDIT CarbonDataRDDFactory$: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.origin_table
18/08/07 08:07:41 ERROR DataLoadExecutor: [Executor task launch worker for task 391][partitionID:table;queryID:2530312536634430] Data Load is partially success for table origin_table
18/08/07 08:07:41 AUDIT CarbonDataRDDFactory$: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.origin_table
18/08/07 08:07:41 AUDIT MergeIndexEventListener: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Load post status event-listener called for merge index
18/08/07 08:07:41 AUDIT CarbonDataRDDFactory$: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.origin_table
18/08/07 08:07:42 ERROR DataLoadExecutor: [Executor task launch worker for task 393][partitionID:table;queryID:2530312913464497] Data Load is partially success for table origin_table
18/08/07 08:07:42 AUDIT CarbonDataRDDFactory$: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.origin_table
18/08/07 08:07:42 AUDIT MergeIndexEventListener: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Load post status event-listener called for merge index
18/08/07 08:07:42 AUDIT CarbonDataRDDFactory$: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.origin_table
18/08/07 08:07:42 ERROR DataLoadExecutor: [Executor task launch worker for task 395][partitionID:table;queryID:2530313268041752] Data Load is partially success for table origin_table
18/08/07 08:07:42 AUDIT CarbonDataRDDFactory$: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.origin_table
18/08/07 08:07:42 AUDIT MergeIndexEventListener: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Load post status event-listener called for merge index
+--------+
|count(1)|
+--------+
|      40|
+--------+

18/08/07 08:07:42 AUDIT CarbonCreateTableCommand: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [external_table]
18/08/07 08:07:42 AUDIT CarbonCreateTableCommand: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [external_table]
+--------+
|count(1)|
+--------+
|      40|
+--------+

18/08/07 08:07:42 AUDIT CarbonDataRDDFactory$: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.origin_table
18/08/07 08:07:43 ERROR DataLoadExecutor: [Executor task launch worker for task 401][partitionID:table;queryID:2530313949122406] Data Load is partially success for table origin_table
18/08/07 08:07:43 AUDIT CarbonDataRDDFactory$: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.origin_table
18/08/07 08:07:43 AUDIT MergeIndexEventListener: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Load post status event-listener called for merge index
18/08/07 08:07:43 AUDIT CarbonDataRDDFactory$: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.origin_table
18/08/07 08:07:43 ERROR DataLoadExecutor: [Executor task launch worker for task 403][partitionID:table;queryID:2530314315634043] Data Load is partially success for table origin_table
18/08/07 08:07:43 AUDIT CarbonDataRDDFactory$: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.origin_table
18/08/07 08:07:43 AUDIT MergeIndexEventListener: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Load post status event-listener called for merge index
+--------+
|count(1)|
+--------+
|      60|
+--------+

18/08/07 08:07:43 AUDIT CarbonDropTableCommand: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [origin_table] under database [default]
18/08/07 08:07:43 AUDIT CarbonDropTableCommand: [asf930.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [origin_table] under database [default]
- ExternalTableExample
Run completed in 1 minute, 45 seconds.
Total number of tests run: 17
Suites: completed 2, aborted 0
Tests: succeeded 11, failed 6, canceled 0, ignored 0, pending 0
*** 6 TESTS FAILED ***
[JENKINS] Recording test results
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache CarbonData :: Parent
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache CarbonData :: Parent ........................ SUCCESS [ 12.631 s]
[INFO] Apache CarbonData :: Common ........................ SUCCESS [ 20.781 s]
[INFO] Apache CarbonData :: Core .......................... SUCCESS [03:23 min]
[INFO] Apache CarbonData :: Processing .................... SUCCESS [ 45.676 s]
[INFO] Apache CarbonData :: Hadoop ........................ SUCCESS [ 31.859 s]
[INFO] Apache CarbonData :: Streaming ..................... SUCCESS [ 49.956 s]
[INFO] Apache CarbonData :: Spark Common .................. SUCCESS [01:47 min]
[INFO] Apache CarbonData :: Store SDK ..................... SUCCESS [01:13 min]
[INFO] Apache CarbonData :: Search ........................ SUCCESS [ 45.184 s]
[INFO] Apache CarbonData :: Lucene Index DataMap .......... SUCCESS [ 24.206 s]
[INFO] Apache CarbonData :: Bloom Index DataMap ........... SUCCESS [ 19.071 s]
[INFO] Apache CarbonData :: Spark2 ........................ SUCCESS [24:45 min]
[INFO] Apache CarbonData :: Spark Common Test ............. SUCCESS [  01:41 h]
[INFO] Apache CarbonData :: DataMap Examples .............. SUCCESS [ 14.385 s]
[INFO] Apache CarbonData :: Assembly ...................... SUCCESS [ 21.689 s]
[INFO] Apache CarbonData :: Hive .......................... SUCCESS [ 48.812 s]
[INFO] Apache CarbonData :: presto ........................ SUCCESS [01:52 min]
[INFO] Apache CarbonData :: Spark2 Examples ............... FAILURE [02:49 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:24 h
[INFO] Finished at: 2018-08-07T15:08:05+00:00
[INFO] Final Memory: 178M/1700M
[INFO] ------------------------------------------------------------------------
Waiting for Jenkins to finish collecting data
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project carbondata-examples-spark2: There are test failures -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project carbondata-examples-spark2: There are test failures
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
        at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
        at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
        at org.jvnet.hudson.maven3.launcher.Maven33Launcher.main(Maven33Launcher.java:129)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launchStandard(Launcher.java:330)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:238)
        at jenkins.maven3.agent.Maven33Main.launch(Maven33Main.java:176)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at hudson.maven.Maven3Builder.call(Maven3Builder.java:139)
        at hudson.maven.Maven3Builder.call(Maven3Builder.java:70)
        at hudson.remoting.UserRequest.perform(UserRequest.java:212)
        at hudson.remoting.UserRequest.perform(UserRequest.java:54)
        at hudson.remoting.Request$2.run(Request.java:369)
        at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.maven.plugin.MojoFailureException: There are test failures
        at org.scalatest.tools.maven.TestMojo.execute(TestMojo.java:107)
        at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
        ... 31 more
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :carbondata-examples-spark2

[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/processing/pom.xml> to org.apache.carbondata/carbondata-processing/1.5.0-SNAPSHOT/carbondata-processing-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/processing/target/carbondata-processing-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-processing/1.5.0-SNAPSHOT/carbondata-processing-1.5.0-SNAPSHOT.jar
[Fast Archiver] No artifacts from carbondata-master-spark-2.2 » Apache CarbonData :: Processing #872 to compare, so performing full copy of artifacts
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/datamap/examples/pom.xml> to org.apache.carbondata/carbondata-datamap-examples/1.5.0-SNAPSHOT/carbondata-datamap-examples-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/datamap/examples/target/carbondata-datamap-examples-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-datamap-examples/1.5.0-SNAPSHOT/carbondata-datamap-examples-1.5.0-SNAPSHOT.jar
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/store/search/pom.xml> to org.apache.carbondata/carbondata-search/1.5.0-SNAPSHOT/carbondata-search-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/store/search/target/carbondata-search-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-search/1.5.0-SNAPSHOT/carbondata-search-1.5.0-SNAPSHOT.jar
[Fast Archiver] No artifacts from carbondata-master-spark-2.2 » Apache CarbonData :: Search #872 to compare, so performing full copy of artifacts
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/streaming/pom.xml> to org.apache.carbondata/carbondata-streaming/1.5.0-SNAPSHOT/carbondata-streaming-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/streaming/target/carbondata-streaming-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-streaming/1.5.0-SNAPSHOT/carbondata-streaming-1.5.0-SNAPSHOT.jar
[Fast Archiver] No artifacts from carbondata-master-spark-2.2 » Apache CarbonData :: Streaming #872 to compare, so performing full copy of artifacts
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/integration/spark-common/pom.xml> to org.apache.carbondata/carbondata-spark-common/1.5.0-SNAPSHOT/carbondata-spark-common-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/integration/spark-common/target/carbondata-spark-common-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-spark-common/1.5.0-SNAPSHOT/carbondata-spark-common-1.5.0-SNAPSHOT.jar
[Fast Archiver] No artifacts from carbondata-master-spark-2.2 » Apache CarbonData :: Spark Common #872 to compare, so performing full copy of artifacts
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/datamap/lucene/pom.xml> to org.apache.carbondata/carbondata-lucene/1.5.0-SNAPSHOT/carbondata-lucene-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/datamap/lucene/target/carbondata-lucene-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-lucene/1.5.0-SNAPSHOT/carbondata-lucene-1.5.0-SNAPSHOT.jar
[Fast Archiver] No artifacts from carbondata-master-spark-2.2 » Apache CarbonData :: Lucene Index DataMap #872 to compare, so performing full copy of artifacts
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/integration/hive/pom.xml> to org.apache.carbondata/carbondata-hive/1.5.0-SNAPSHOT/carbondata-hive-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/integration/hive/target/carbondata-hive-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-hive/1.5.0-SNAPSHOT/carbondata-hive-1.5.0-SNAPSHOT.jar
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/common/pom.xml> to org.apache.carbondata/carbondata-common/1.5.0-SNAPSHOT/carbondata-common-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/common/target/carbondata-common-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-common/1.5.0-SNAPSHOT/carbondata-common-1.5.0-SNAPSHOT.jar
[Fast Archiver] No artifacts from carbondata-master-spark-2.2 » Apache CarbonData :: Common #872 to compare, so performing full copy of artifacts
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/examples/spark2/pom.xml> to org.apache.carbondata/carbondata-examples-spark2/1.5.0-SNAPSHOT/carbondata-examples-spark2-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/integration/spark2/pom.xml> to org.apache.carbondata/carbondata-spark2/1.5.0-SNAPSHOT/carbondata-spark2-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/integration/spark2/target/carbondata-spark2-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-spark2/1.5.0-SNAPSHOT/carbondata-spark2-1.5.0-SNAPSHOT.jar
[Fast Archiver] No artifacts from carbondata-master-spark-2.2 » Apache CarbonData :: Spark2 #872 to compare, so performing full copy of artifacts
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/datamap/bloom/pom.xml> to org.apache.carbondata/carbondata-bloom/1.5.0-SNAPSHOT/carbondata-bloom-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/datamap/bloom/target/carbondata-bloom-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-bloom/1.5.0-SNAPSHOT/carbondata-bloom-1.5.0-SNAPSHOT.jar
[Fast Archiver] No artifacts from carbondata-master-spark-2.2 » Apache CarbonData :: Bloom Index DataMap #872 to compare, so performing full copy of artifacts
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/assembly/pom.xml> to org.apache.carbondata/carbondata-assembly/1.5.0-SNAPSHOT/carbondata-assembly-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/pom.xml> to org.apache.carbondata/carbondata-parent/1.5.0-SNAPSHOT/carbondata-parent-1.5.0-SNAPSHOT.pom
[Fast Archiver] No artifacts from carbondata-master-spark-2.2 » Apache CarbonData :: Parent #872 to compare, so performing full copy of artifacts
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/core/pom.xml> to org.apache.carbondata/carbondata-core/1.5.0-SNAPSHOT/carbondata-core-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/core/target/carbondata-core-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-core/1.5.0-SNAPSHOT/carbondata-core-1.5.0-SNAPSHOT.jar
[Fast Archiver] No artifacts from carbondata-master-spark-2.2 » Apache CarbonData :: Core #872 to compare, so performing full copy of artifacts
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/hadoop/pom.xml> to org.apache.carbondata/carbondata-hadoop/1.5.0-SNAPSHOT/carbondata-hadoop-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/hadoop/target/carbondata-hadoop-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-hadoop/1.5.0-SNAPSHOT/carbondata-hadoop-1.5.0-SNAPSHOT.jar
[Fast Archiver] No artifacts from carbondata-master-spark-2.2 » Apache CarbonData :: Hadoop #872 to compare, so performing full copy of artifacts
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/integration/presto/pom.xml> to org.apache.carbondata/carbondata-presto/1.5.0-SNAPSHOT/carbondata-presto-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/integration/presto/target/carbondata-presto-1.5.0-SNAPSHOT.zip> to org.apache.carbondata/carbondata-presto/1.5.0-SNAPSHOT/carbondata-presto-1.5.0-SNAPSHOT.zip
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/integration/presto/target/carbondata-presto-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-presto/1.5.0-SNAPSHOT/carbondata-presto-1.5.0-SNAPSHOT.jar
[Fast Archiver] Compressed 104.23 MB of artifacts by 94.9% relative to #870
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/integration/spark-common-test/pom.xml> to org.apache.carbondata/carbondata-spark-common-test/1.5.0-SNAPSHOT/carbondata-spark-common-test-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/integration/spark-common-test/target/carbondata-spark-common-test-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-spark-common-test/1.5.0-SNAPSHOT/carbondata-spark-common-test-1.5.0-SNAPSHOT.jar
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/store/sdk/pom.xml> to org.apache.carbondata/carbondata-store-sdk/1.5.0-SNAPSHOT/carbondata-store-sdk-1.5.0-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/carbondata-master-spark-2.2/ws/store/sdk/target/carbondata-store-sdk-1.5.0-SNAPSHOT.jar> to org.apache.carbondata/carbondata-store-sdk/1.5.0-SNAPSHOT/carbondata-store-sdk-1.5.0-SNAPSHOT.jar
[Fast Archiver] No artifacts from carbondata-master-spark-2.2 » Apache CarbonData :: Store SDK #872 to compare, so performing full copy of artifacts
Sending e-mails to: commits@carbondata.apache.org
channel stopped
Not sending mail to unregistered user jacky.li...@qq.com
