Minor correction: there was a typo in the command line below

hive-thirftserver should be hive-thriftserver
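
That is, the SBT command for the master branch should read:

    ./sbt/sbt -Phive-thriftserver clean assembly/assembly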

Cheers

On Thu, Aug 7, 2014 at 6:49 PM, Cheng Lian <lian.cs....@gmail.com> wrote:

> Things have changed a bit in the master branch, and the SQL programming
> guide in the master branch actually doesn’t apply to branch-1.0-jdbc.
>
> In branch-1.0-jdbc, Hive Thrift server and Spark SQL CLI are included in
> the hive profile and are thus not enabled by default. You need to either
>
>    - pass -Phive to Maven to enable it, or
>    - use SPARK_HIVE=true ./sbt/sbt assembly
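>
>    For example, on branch-1.0-jdbc a full build with the Hive bits enabled
>    would look roughly like one of the following (the -Dhadoop.version flag
>    is just an illustration for your Hadoop 1.0.4 setup; drop it if the
>    default works for you):
>
>        mvn -Phive -Dhadoop.version=1.0.4 -DskipTests clean package
>        SPARK_HIVE=true ./sbt/sbt assembly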
>
> In the most recent master branch, however, the Hive Thrift server and Spark
> SQL CLI have been moved into a separate hive-thriftserver profile, and our
> SBT build file now delegates to Maven. So, to build the master branch, you
> can either
>
>    - ./sbt/sbt -Phive-thirftserver clean assembly/assembly, or
>    - mvn -Phive-thriftserver clean package -DskipTests
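>
>    Once the assembly is built with that profile, the class named in your
>    error should be inside the assembly jar, and you should be able to run
>    the CLI and Beeline roughly like this (the start-thriftserver.sh script
>    name and the default port 10000 are from memory, so double-check them
>    against your checkout):
>
>        ./bin/spark-sql
>        ./sbin/start-thriftserver.sh
>        ./bin/beeline -u jdbc:hive2://localhost:10000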
>
> On Fri, Aug 8, 2014 at 6:12 AM, ajatix <a...@sigmoidanalytics.com> wrote:
>
> Hi
>>
>> I wish to migrate from Shark to the spark-sql shell, but I am facing some
>> difficulties setting it up.
>>
>> I cloned the "branch-1.0-jdbc" to test out the spark-sql shell, but I am
>> unable to run it after building the source.
>>
>> I've tried two methods of building (with Hadoop 1.0.4): sbt/sbt assembly,
>> and mvn -DskipTests clean package -X. Both build successfully, but when I
>> run bin/spark-sql, I get the following error:
>>
>> Exception in thread "main" java.lang.ClassNotFoundException:
>> org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>         at java.lang.Class.forName0(Native Method)
>>         at java.lang.Class.forName(Class.java:270)
>>         at
>> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:311)
>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:73)
>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>> and when I run bin/beeline, I get this:
>> Error: Could not find or load main class org.apache.hive.beeline.BeeLine
>>
>> bin/spark-shell works fine. Is there something else I have to add to the
>> build parameters?
>> According to this -
>> https://github.com/apache/spark/blob/master/docs/sql-programming-guide.md,
>> I
>> tried rebuilding with -Phive-thriftserver, but it failed to detect the
>> library while building.
>>
>> Thanks and Regards
>> Ajay
>>
>>
>>
>
