What command did you use to build Spark 1.5.0?

bq. Export 'SPARK_HIVE=true' and run build/sbt assembly

Please follow the above.
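
Roughly, from the root of the Spark source tree, that looks like the lines below (just a sketch; recent Spark versions typically enable Hive through the -Phive and -Phive-thriftserver build profiles rather than the environment variable):

export SPARK_HIVE=true    # ask the build to bundle Hive support into the assembly
build/sbt assembly        # or: build/sbt -Phive -Phive-thriftserver assembly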

BTW, 1.5.1 has been released, which is more stable.

Please use 1.5.1.
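
Once you have a Hive-enabled assembly (and the MySQL connector jar on the driver classpath, e.g. passed with --driver-class-path or --jars when launching pyspark), the HiveContext in your snippet should initialize. A minimal sanity check from the pyspark shell, assuming the shell's pre-created SparkContext sc:

from pyspark.sql import HiveContext
# sc is the SparkContext the pyspark shell creates for you
sqlHiveContext = HiveContext(sc)   # this is the call that currently fails
sqlHiveContext.sql("show tables").show()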

Cheers

On Mon, Oct 5, 2015 at 9:25 AM, cherah30 <ahmed.cher...@gmail.com> wrote:

> I work with Spark 1.5 on Windows 7, with Anaconda and pyspark. Everything
> worked fine until I wanted to test the connection to my MySQL database, so I
> started following this guide:
> https://spark.apache.org/docs/latest/sql-programming-guide.html#jdbc-to-other-databases
> Everything is set up (JDBC driver, etc.).
>
> To start playing with it, I just wanted to connect to my MySQL database and
> retrieve data from a table.
>
> Here is my code:
>
> from pyspark.sql import HiveContext
> # sc is the SparkContext created by the pyspark shell
> sqlHiveContext = HiveContext(sc)
> # read one MySQL table over JDBC into a DataFrame
> df_mysql = sqlHiveContext.read.format("jdbc").options(
>     url="jdbc:mysql://localhost:3306/my_bdd_name",
>     driver="com.mysql.jdbc.Driver",
>     dbtable="bdd_My_table_nameXX",
>     user="my_id",
>     password="my_pw").load()
>
> And here is the exception message:
> Exception: ("You must build Spark with Hive. Export 'SPARK_HIVE=true' and
> run build/sbt assembly", Py4JJavaError(u'An error occurred while calling
> None.org.apache.spark.sql.hive.HiveContext.\n', JavaObject id=o28)).
>
> Do you have any idea what to do?
>