Sounds like a driver issue, as Spark can prepare the DataFrame but cannot collect it.

In general, as long as you have the correct JDBC drivers it should work.

Have you modified spark-defaults.conf and added the driver list there?

For example, here I have both the Oracle and Sybase IQ drivers:

spark.driver.extraClassPath /home/hduser/jars/jconn4.jar:/home/hduser/jars/ojdbc6.jar

You also need to refer to these jar files when you start spark-shell:

spark-shell --master spark://<IP_ADDR> \
  --jars /home/hduser/jars/ojdbc6.jar,/home/hduser/jars/jconn4.jar,/home/hduser/jars/junit-4.12.jar
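On the NotSerializableException itself: beyond getting the jar on the classpath, it can also help to pass the JDBC driver class explicitly in the read options, so Spark does not ship the vendor's connection Properties (which in this trace contain the non-serializable com.sap.db.jdbc.topology.Host) to the executors. A rough sketch only, assuming the SAP HANA driver class name com.sap.db.jdbc.Driver and the URL from the original post; this needs a running spark-shell with the HANA jar available:

```scala
// Sketch, not tested against HANA: uses the DataFrameReader JDBC API
// instead of the older sqlContext.load("jdbc", Map(...)) form, and passes
// user/password as separate options rather than embedding them in the URL.
val df = sqlContext.read
  .format("jdbc")
  .option("url", "jdbc:sap://172.26.52.54:30015/?databaseName=system")
  .option("driver", "com.sap.db.jdbc.Driver") // assumed HANA driver class name
  .option("dbtable", "SYSTEM.TEST1")
  .option("user", "SYSTEM")
  .option("password", "Saphana123")
  .load()

df.show()
```

Whether this avoids the Host serialization problem depends on the HANA driver version, but it is worth trying before chasing the classpath further.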

HTH

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 29 March 2016 at 16:15, Ted Yu <yuzhih...@gmail.com> wrote:

> As the error said, com.sap.db.jdbc.topology.Host is not serializable.
>
> Maybe post question on Sap Hana mailing list (if any) ?
>
> On Tue, Mar 29, 2016 at 7:54 AM, reena upadhyay <
> reena.upadh...@impetus.co.in> wrote:
>
>> I am trying to execute a query using Spark SQL on SAP HANA from the Spark
>> shell. I am able to create the data frame object. On calling any action on
>> the data frame object, I am getting *java.io.NotSerializableException*.
>>
>> Steps I followed after adding saphana driver jar in spark class path.
>>
>> 1. Start spark-shell
>> 2. val df = sqlContext.load("jdbc", Map("url" ->
>> "jdbc:sap://
>> 172.26.52.54:30015/?databaseName=system&user=SYSTEM&password=Saphana123",
>> "dbtable" -> "SYSTEM.TEST1"));
>> 3. df.show();
>>
>> *I get the below exception on calling any action on the dataframe object.*
>>
>> *org.apache.spark.SparkException: Job aborted due to stage failure: Task
>> not
>> serializable: java.io.NotSerializableException:
>> com.sap.db.jdbc.topology.Host*
>> Serialization stack:
>>         - object not serializable (class: com.sap.db.jdbc.topology.Host,
>> value:
>> 172.26.52.54:30015)
>>         - writeObject data (class: java.util.ArrayList)
>>         - object (class java.util.ArrayList, [172.26.52.54:30015])
>>         - writeObject data (class: java.util.Hashtable)
>>         - object (class java.util.Properties, {dburl=jdbc:sap://
>> 172.26.52.54:30015,
>> user=SYSTEM, password=Saphana123,
>> url=jdbc:sap://172.26.52.54:30015/?system&user=SYSTEM&password=Saphana123
>> ,
>> dbtable=SYSTEM.TEST1, hostlist=[172.26.52.54:30015]})
>>
>>
>> Caused by: java.io.NotSerializableException: com.sap.db.jdbc.topology.Host
>> Serialization stack:
>>         - object not serializable (class: com.sap.db.jdbc.topology.Host,
>> value:
>> 172.26.52.54:30015)
>>         - writeObject data (class: java.util.ArrayList)
>>         - object (class java.util.ArrayList, [172.26.52.54:30015])
>>         - writeObject data (class: java.util.Hashtable)
>>         - object (class java.util.Properties, {dburl=jdbc:sap://
>> 172.26.52.54:30015,
>> user=SYSTEM, password=Saphana123,
>> url=jdbc:sap://172.26.52.54:30015/?system&user=SYSTEM&password=Saphana123
>> ,
>> dbtable=SYSTEM.TEST1, hostlist=[172.26.52.54:30015]})
>>
>>
>> Appreciate help on this.
>> Thank you
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-execute-query-on-SAPHANA-using-SPARK-tp26628.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>
