[ https://issues.apache.org/jira/browse/HIVE-9136?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14250734#comment-14250734 ]
Hive QA commented on HIVE-9136:
-------------------------------

{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12687851/HIVE-9136.1-spark.patch

Test results: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-SPARK-Build/567/testReport
Console output: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-SPARK-Build/567/console
Test logs: http://ec2-50-18-27-0.us-west-1.compute.amazonaws.com/logs/PreCommit-HIVE-SPARK-Build-567/

Messages:
{noformat}
**** This message was trimmed, see log for full details ****
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkPlan.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkPlanGenerator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkRecordHandler.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkReduceRecordHandler.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkTask.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/SparkJobMonitor.java
patching file ql/src/java/org/apache/hadoop/hive/ql/log/PerfLogger.java
patching file ql/src/java/org/apache/hadoop/hive/ql/parse/spark/SparkCompiler.java
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hive-ptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven -Phadoop-2
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] Hive
[INFO] Hive Shims Common
[INFO] Hive Shims 0.20S
[INFO] Hive Shims 0.23
[INFO] Hive Shims Scheduler
[INFO] Hive Shims
[INFO] Hive Common
[INFO] Hive Serde
[INFO] Hive Metastore
[INFO] Hive Ant Utilities
[INFO] Spark Remote Client
[INFO] Hive Query Language
[INFO] Hive Service
[INFO] Hive Accumulo Handler
[INFO] Hive JDBC
[INFO] Hive Beeline
[INFO] Hive CLI
[INFO] Hive Contrib
[INFO] Hive HBase Handler
[INFO] Hive HCatalog
[INFO] Hive HCatalog Core
[INFO] Hive HCatalog Pig Adapter
[INFO] Hive HCatalog Server Extensions
[INFO] Hive HCatalog Webhcat Java Client
[INFO] Hive HCatalog Webhcat
[INFO] Hive HCatalog Streaming
[INFO] Hive HWI
[INFO] Hive ODBC
[INFO] Hive Shims Aggregator
[INFO] Hive TestUtils
[INFO] Hive Packaging
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive 0.15.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-spark-source (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-no-snapshots) @ hive ---
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive ---
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-spark-source/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-spark-source/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-spark-source/target/tmp/conf
     [copy] Copying 10 files to /data/hive-ptest/working/apache-svn-spark-source/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive ---
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive ---
[INFO] Installing /data/hive-ptest/working/apache-svn-spark-source/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive/0.15.0-SNAPSHOT/hive-0.15.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Common 0.15.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-spark-source/shims/common (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-no-snapshots) @ hive-shims-common ---
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-shims-common ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive-shims-common ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-spark-source/shims/common/src/main/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common ---
[INFO] Compiling 29 source files to /data/hive-ptest/working/apache-svn-spark-source/shims/common/target/classes
[WARNING] /data/hive-ptest/working/apache-svn-spark-source/shims/common/src/main/java/org/apache/hadoop/fs/ProxyFileSystem.java: /data/hive-ptest/working/apache-svn-spark-source/shims/common/src/main/java/org/apache/hadoop/fs/ProxyFileSystem.java uses or overrides a deprecated API.
[WARNING] /data/hive-ptest/working/apache-svn-spark-source/shims/common/src/main/java/org/apache/hadoop/fs/ProxyFileSystem.java: Recompile with -Xlint:deprecation for details.
[WARNING] /data/hive-ptest/working/apache-svn-spark-source/shims/common/src/main/java/org/apache/hadoop/hive/thrift/DBTokenStore.java: Some input files use unchecked or unsafe operations.
[WARNING] /data/hive-ptest/working/apache-svn-spark-source/shims/common/src/main/java/org/apache/hadoop/hive/thrift/DBTokenStore.java: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hive-shims-common ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-spark-source/shims/common/src/test/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-spark-source/shims/common/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-spark-source/shims/common/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-spark-source/shims/common/target/tmp/conf
     [copy] Copying 10 files to /data/hive-ptest/working/apache-svn-spark-source/shims/common/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-spark-source/shims/common/target/hive-shims-common-0.15.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-shims-common ---
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common ---
[INFO] Installing /data/hive-ptest/working/apache-svn-spark-source/shims/common/target/hive-shims-common-0.15.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.15.0-SNAPSHOT/hive-shims-common-0.15.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-spark-source/shims/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.15.0-SNAPSHOT/hive-shims-common-0.15.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.20S 0.15.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
Downloading: http://repository.apache.org/snapshots/org/apache/hive/shims/hive-shims-common-secure/0.15.0-SNAPSHOT/maven-metadata.xml
Downloading: http://ec2-50-18-79-139.us-west-1.compute.amazonaws.com/data/spark_2.10-1.2-SNAPSHOT/org/apache/hive/shims/hive-shims-common-secure/0.15.0-SNAPSHOT/maven-metadata.xml
[WARNING] Could not transfer metadata org.apache.hive.shims:hive-shims-common-secure:0.15.0-SNAPSHOT/maven-metadata.xml from/to apache.snapshots (http://repository.apache.org/snapshots): Connection to http://repository.apache.org refused
[WARNING] Failure to transfer org.apache.hive.shims:hive-shims-common-secure:0.15.0-SNAPSHOT/maven-metadata.xml from http://repository.apache.org/snapshots was cached in the local repository, resolution will not be reattempted until the update interval of apache.snapshots has elapsed or updates are forced.
Original error: Could not transfer metadata org.apache.hive.shims:hive-shims-common-secure:0.15.0-SNAPSHOT/maven-metadata.xml from/to apache.snapshots (http://repository.apache.org/snapshots): Connection to http://repository.apache.org refused
Downloading: http://ec2-50-18-79-139.us-west-1.compute.amazonaws.com/data/spark_2.10-1.2-SNAPSHOT/org/apache/hive/shims/hive-shims-common-secure/0.15.0-SNAPSHOT/hive-shims-common-secure-0.15.0-SNAPSHOT.pom
Downloading: http://repository.apache.org/snapshots/org/apache/hive/shims/hive-shims-common-secure/0.15.0-SNAPSHOT/hive-shims-common-secure-0.15.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Hive .............................................. SUCCESS [3.051s]
[INFO] Hive Shims Common ................................. SUCCESS [3.944s]
[INFO] Hive Shims 0.20S .................................. FAILURE [2:06.352s]
[INFO] Hive Shims 0.23 ................................... SKIPPED
[INFO] Hive Shims Scheduler .............................. SKIPPED
[INFO] Hive Shims ........................................ SKIPPED
[INFO] Hive Common ....................................... SKIPPED
[INFO] Hive Serde ........................................ SKIPPED
[INFO] Hive Metastore .................................... SKIPPED
[INFO] Hive Ant Utilities ................................ SKIPPED
[INFO] Spark Remote Client ............................... SKIPPED
[INFO] Hive Query Language ............................... SKIPPED
[INFO] Hive Service ...................................... SKIPPED
[INFO] Hive Accumulo Handler ............................. SKIPPED
[INFO] Hive JDBC ......................................... SKIPPED
[INFO] Hive Beeline ...................................... SKIPPED
[INFO] Hive CLI .......................................... SKIPPED
[INFO] Hive Contrib ...................................... SKIPPED
[INFO] Hive HBase Handler ................................ SKIPPED
[INFO] Hive HCatalog ..................................... SKIPPED
[INFO] Hive HCatalog Core ................................ SKIPPED
[INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
[INFO] Hive HCatalog Server Extensions ................... SKIPPED
[INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
[INFO] Hive HCatalog Webhcat ............................. SKIPPED
[INFO] Hive HCatalog Streaming ........................... SKIPPED
[INFO] Hive HWI .......................................... SKIPPED
[INFO] Hive ODBC ......................................... SKIPPED
[INFO] Hive Shims Aggregator ............................. SKIPPED
[INFO] Hive TestUtils .................................... SKIPPED
[INFO] Hive Packaging .................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:14.345s
[INFO] Finished at: Wed Dec 17 17:54:15 EST 2014
[INFO] Final Memory: 41M/333M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project hive-shims-0.20S: Could not resolve dependencies for project org.apache.hive.shims:hive-shims-0.20S:jar:0.15.0-SNAPSHOT: Failed to collect dependencies for [org.apache.hive.shims:hive-shims-common-secure:jar:0.15.0-SNAPSHOT (compile), org.apache.hadoop:hadoop-core:jar:1.2.1 (compile?), org.apache.hadoop:hadoop-test:jar:1.2.1 (compile?), org.slf4j:slf4j-api:jar:1.7.5 (compile), org.slf4j:slf4j-log4j12:jar:1.7.5 (compile)]: Failed to read artifact descriptor for org.apache.hive.shims:hive-shims-common-secure:jar:0.15.0-SNAPSHOT: Could not transfer artifact org.apache.hive.shims:hive-shims-common-secure:pom:0.15.0-SNAPSHOT from/to apache.snapshots (http://repository.apache.org/snapshots): Connection to http://repository.apache.org refused: Connection timed out -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-shims-0.20S
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12687851 - PreCommit-HIVE-SPARK-Build

> Profile query compiler [Spark Branch]
> -------------------------------------
>
>                 Key: HIVE-9136
>                 URL: https://issues.apache.org/jira/browse/HIVE-9136
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>    Affects Versions: spark-branch
>            Reporter: Brock Noland
>            Assignee: Chao
>         Attachments: HIVE-9136.1-spark.patch, HIVE-9136.1.patch
>
> We should put some performance counters around the compiler and evaluate how
> long it takes to compile a query in Spark versus the other execution
> frameworks. Query 28 is a good one to use for testing.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
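For context on the quoted issue description ("put some performance counters around the compiler"), the following is only a minimal illustrative sketch of what such instrumentation could look like using Hive's existing org.apache.hadoop.hive.ql.log.PerfLogger; it is not the contents of HIVE-9136.1-spark.patch, and the class name SparkCompilerTiming and the stage name SPARK_GENERATE_TASK_TREE are hypothetical placeholders.

{code:java}
package org.apache.hadoop.hive.ql.parse.spark;

import org.apache.hadoop.hive.ql.log.PerfLogger;

// Illustrative sketch only: wraps one compiler phase with PerfLogger
// begin/end markers so its wall-clock time shows up in the perf log and
// can be compared across execution engines (e.g. for TPC-DS query 28).
public class SparkCompilerTiming {

  private static final String CLASS_NAME = SparkCompilerTiming.class.getName();
  // Hypothetical counter name; the actual patch may use different stage names.
  private static final String SPARK_GENERATE_TASK_TREE = "SparkCompiler.generateTaskTree";

  private final PerfLogger perfLogger = PerfLogger.getPerfLogger();

  public void timedGenerateTaskTree(Runnable generateTaskTree) {
    perfLogger.PerfLogBegin(CLASS_NAME, SPARK_GENERATE_TASK_TREE);
    try {
      // The compiler phase being measured.
      generateTaskTree.run();
    } finally {
      // Records and logs the elapsed time for this phase.
      perfLogger.PerfLogEnd(CLASS_NAME, SPARK_GENERATE_TASK_TREE);
    }
  }
}
{code}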