Thanks Cheng.

Here is the error message after a fresh build.

$ mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.5.0 -Phive -DskipTests clean package
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM .......................... SUCCESS [19.117s]
[INFO] Spark Project Core ................................ SUCCESS [11:24.009s]
[INFO] Spark Project Bagel ............................... SUCCESS [1:09.498s]
[INFO] Spark Project GraphX .............................. SUCCESS [3:41.113s]
[INFO] Spark Project Streaming ........................... SUCCESS [4:25.378s]
[INFO] Spark Project ML Library .......................... SUCCESS [5:43.323s]
[INFO] Spark Project Tools ............................... SUCCESS [44.647s]
[INFO] Spark Project Catalyst ............................ SUCCESS [4:48.658s]
[INFO] Spark Project SQL ................................. SUCCESS [4:56.966s]
[INFO] Spark Project Hive ................................ SUCCESS [3:45.269s]
[INFO] Spark Project REPL ................................ SUCCESS [2:11.617s]
[INFO] Spark Project YARN Parent POM ..................... SUCCESS [6.723s]
[INFO] Spark Project YARN Stable API ..................... SUCCESS [2:20.860s]
[INFO] Spark Project Hive Thrift Server .................. SUCCESS [1:15.231s]
[INFO] Spark Project Assembly ............................ SUCCESS [1:41.245s]
[INFO] Spark Project External Twitter .................... SUCCESS [50.839s]
[INFO] Spark Project External Kafka ...................... SUCCESS [1:15.888s]
[INFO] Spark Project External Flume Sink ................. SUCCESS [57.807s]
[INFO] Spark Project External Flume ...................... SUCCESS [1:26.589s]
[INFO] Spark Project External ZeroMQ ..................... SUCCESS [54.361s]
[INFO] Spark Project External MQTT ....................... SUCCESS [53.901s]
[INFO] Spark Project Examples ............................ SUCCESS [2:39.407s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------

spark-sql> use mydb;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:302)
        at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:272)
        at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
        at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
        at org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
        at org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:360)
        at org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:360)
        at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
        at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:103)
        at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:98)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:58)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:291)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:226)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
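For what it's worth, from what I've read this "Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient" failure usually means the Spark SQL CLI cannot reach the Hive metastore, most often because hive-site.xml is missing from Spark's conf directory (or the metastore it points at is unreachable). A minimal sketch of the relevant setting, with a placeholder host/port rather than values from my cluster:

<configuration>
  <!-- Remote metastore: tell Spark SQL where the running Hive metastore
       service is. "metastore-host:9083" below is a placeholder. -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>

I understand this file needs to be copied into $SPARK_HOME/conf so the CLI picks it up; if the metastore is JDBC-backed instead, the javax.jdo.option.Connection* properties would need to be reachable from the CLI host. Happy to be corrected if the cause here is something else.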

On Tue, Oct 7, 2014 at 6:19 AM, Cheng Lian <lian.cs....@gmail.com> wrote:
> The build command should be correct. What exact error did you encounter when
> trying Spark 1.1 + Hive 0.12 + Hadoop 2.5.0?
>
>
