Finally got it working.
I was trying to access Hive using the JDBC driver, the same way I was
accessing Teradata.
It took me some time to figure out that the default sqlContext created by
Spark supports Hive, and that it uses the hive-site.xml in the Spark conf
folder to access Hive.
I had to use my
Regards,
Ishwardeep
*From:* ankitjindal [via Apache Spark User List]
[mailto:ml-node+s1001560n22766...@n3.nabble.com]
*Sent:* Tuesday, May 5, 2015 5:00 PM
*To:* Ishwardeep Singh
*Subject:* RE: Unable to join table across data sources using sparkSQL
Just check the schema of both the tables using frame.printSchema.
From: … [via Apache Spark User List]
[mailto:ml-node+s1001560n22762...@n3.nabble.com]
Sent: Tuesday, May 5, 2015 1:26 PM
To: Ishwardeep Singh
Subject: Re: Unable to join table across data sources using sparkSQL
Hi,
I was doing the same, but with a file in Hadoop as a temp table and one
table in SQL Server, and I succeeded.