Re: Exception: "You must build Spark with Hive. Export 'SPARK_HIVE=true' and run build/sbt assembly"

2015-10-05 Thread Ted Yu
In the tar ball, do you see any classes from the spark-hive module?

From the error message, I don't think so.

Cheers
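For a binary distribution you can check the assembly jar directly. A minimal sketch (jar files are plain zip archives; the jar path in the usage note is an assumption, adjust it to your install's lib/ directory):

```python
import zipfile

def names_have_hive(names):
    # A Spark assembly built with Hive support ships classes under
    # org/apache/spark/sql/hive/ (e.g. HiveContext).
    return any(n.startswith("org/apache/spark/sql/hive/") for n in names)

def jar_has_hive(jar_path):
    # Jars are zip archives, so the stdlib zipfile module can list their entries.
    with zipfile.ZipFile(jar_path) as jar:
        return names_have_hive(jar.namelist())
```

For example, `jar_has_hive("lib/spark-assembly-1.5.0-hadoop2.6.0.jar")` returning True would mean the Hive classes are present in the tarball.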

On Mon, Oct 5, 2015 at 11:16 AM, Ahmed Cheriat <ahmed.cher...@gmail.com>
wrote:

> Thanks Ted for your reply.
> Well, it's a standalone Spark build, "spark-1.5.0-bin-hadoop2.6" (Windows 7).
> To launch Spark I use the command prompt (DOS):
> bin\pyspark --jars "my_path_to_mysql_jdbc.jar"
>
> This command starts a PySpark notebook without errors.
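As an aside, starting without errors does not by itself prove the driver jar was picked up. A quick sketch for checking, from inside bin/pyspark, that the MySQL driver passed with --jars is visible to the JVM (driver_visible is a hypothetical helper; com.mysql.jdbc.Driver is the standard Connector/J class name):

```python
def driver_visible(jvm, driver_class="com.mysql.jdbc.Driver"):
    # Ask the JVM behind the Py4J gateway to load the driver class;
    # a ClassNotFoundException on the Java side surfaces here as an exception.
    try:
        jvm.java.lang.Class.forName(driver_class)
        return True
    except Exception:
        return False
```

Inside the PySpark shell, `driver_visible(sc._jvm)` should return True when the jar is on the driver classpath.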


Exception: "You must build Spark with Hive. Export 'SPARK_HIVE=true' and run build/sbt assembly"

2015-10-05 Thread cherah30
I work with Spark 1.5 on Windows 7, with Anaconda and PySpark. Everything
works fine until I want to test the connection to my MySQL database, so I
started from the documentation:
https://spark.apache.org/docs/latest/sql-programming-guide.html#jdbc-to-other-databases
Everything is set up (JDBC driver, etc.).

To start, I just want to connect to my MySQL database and retrieve data
from a table.

Here is my code:

from pyspark.sql import HiveContext
sqlHiveContext = HiveContext(sc)  # sc is the SparkContext created by bin/pyspark
df_mysql = sqlHiveContext.read.format("jdbc").options(
    url="jdbc:mysql://localhost:3306/my_bdd_name",
    driver="com.mysql.jdbc.Driver",
    dbtable="bdd_My_table_nameXX",
    user="my_id",
    password="my_pw").load()

And here is the exception message:
Exception: ("You must build Spark with Hive. Export 'SPARK_HIVE=true' and
run build/sbt assembly", Py4JJavaError(u'An error occurred while calling
None.org.apache.spark.sql.hive.HiveContext.\n', JavaObject id=o28)).

Do you have an idea of what to do?
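For what it's worth, a plain JDBC read does not require Hive at all, so one workaround with a prebuilt Spark is to use SQLContext instead of HiveContext. A sketch under that assumption (the host, database, table, and credentials are the placeholders from the post; the helpers are hypothetical):

```python
def mysql_jdbc_options(host, port, database, table, user, password):
    # Gather the JDBC options in one dict so they are easy to inspect and reuse.
    return {
        "url": "jdbc:mysql://%s:%d/%s" % (host, port, database),
        "driver": "com.mysql.jdbc.Driver",
        "dbtable": table,
        "user": user,
        "password": password,
    }

def read_mysql_table(sql_context, **kwargs):
    # sql_context can be a plain pyspark.sql.SQLContext; the jdbc data
    # source does not need Hive support.
    return sql_context.read.format("jdbc").options(**mysql_jdbc_options(**kwargs)).load()
```

Inside bin/pyspark, where sc already exists: from pyspark.sql import SQLContext; sqlContext = SQLContext(sc); then read_mysql_table(sqlContext, host="localhost", port=3306, database="my_bdd_name", table="bdd_My_table_nameXX", user="my_id", password="my_pw") returns the DataFrame.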




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Exception-You-must-build-Spark-with-Hive-Export-SPARK-HIVE-true-and-run-build-sbt-assembly-tp24928.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Exception: "You must build Spark with Hive. Export 'SPARK_HIVE=true' and run build/sbt assembly"

2015-10-05 Thread Ted Yu
What command did you use to build Spark 1.5.0 ?

bq. Export 'SPARK_HIVE=true' and run build/sbt assembly

Please follow the above.

BTW, 1.5.1 has been released and is more stable.

Please use 1.5.1.

Cheers

On Mon, Oct 5, 2015 at 9:25 AM, cherah30 <ahmed.cher...@gmail.com> wrote:

> [original message quoted in full above; trimmed]