You might try pointing your Spark session at the Hive metastore via (note the
metastore URI needs the thrift:// scheme):

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

val conf = new SparkConf()
conf.set("hive.metastore.uris", "thrift://your.thrift.server:9083")

val sparkSession = SparkSession.builder()
  .config(conf)
  .enableHiveSupport()
  .getOrCreate()
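With Hive support enabled, tables registered in that metastore can then be
queried directly, without going through JDBC at all. A minimal sketch, borrowing
the table name from the thread below for illustration:

```scala
// Assumes the metastore URI above is reachable and that
// work_base.some_table_with_data is registered in it.
val df = sparkSession.sql("SELECT * FROM work_base.some_table_with_data LIMIT 10")
df.show()

// Equivalent, without writing SQL:
sparkSession.table("work_base.some_table_with_data").limit(10).show()
```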




. . . . . . . . . . . . . . . . . . . . . . . . . . .

Richard Moorhead
Software Engineer
richard.moorh...@c2fo.com

C2FO: The World's Market for Working Capital®





________________________________
From: Даша Ковальчук <dashakovalchu...@gmail.com>
Sent: Thursday, June 8, 2017 12:30 PM
To: ayan guha
Cc: user@spark.apache.org
Subject: Re: [Spark Core] Does spark support read from remote Hive server via 
JDBC

The result is count = 0.

2017-06-08 19:42 GMT+03:00 ayan guha 
<guha.a...@gmail.com>:
What is the result of test.count()?

On Fri, 9 Jun 2017 at 1:41 am, Даша Ковальчук 
<dashakovalchu...@gmail.com> wrote:
Thanks for your reply!
Yes, I tried this solution and had the same result. Maybe you have another 
solution or maybe I can execute query in another way on remote cluster?


2017-06-08 18:10 GMT+03:00 Vadim Semenov 
<vadim.seme...@datadoghq.com>:
Have you tried running a query? Something like:

```
test.select("*").limit(10).show()
```

On Thu, Jun 8, 2017 at 4:16 AM, Даша Ковальчук 
<dashakovalchu...@gmail.com> wrote:
Hi guys,

I need to execute Hive queries on a remote Hive server from Spark, but for some 
reason I receive only column names (without data).
The data is available in the table; I checked it via HUE and a Java JDBC 
connection.

Here is my code example:
val test = spark.read
  .format("jdbc")
  .option("url", "jdbc:hive2://remote.hive.server:10000/work_base")
  .option("user", "user")
  .option("password", "password")
  .option("dbtable", "some_table_with_data")
  .option("driver", "org.apache.hive.jdbc.HiveDriver")
  .load()
test.show()
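[Editor's note: a known cause of this exact column-names-only symptom is that
Spark's default JDBC dialect wraps column names in double quotes, which
HiveServer2 parses as string literals, so every returned row is just the literal
column names. A hedged sketch of a workaround, assuming Spark 2.x's
`JdbcDialect` API, registers backtick quoting for hive2 URLs:]

```scala
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

// Sketch only: Hive understands backtick-quoted identifiers, not the
// double-quoted ones Spark's generic JDBC dialect emits in its SELECTs.
object HiveDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean =
    url.startsWith("jdbc:hive2")

  override def quoteIdentifier(colName: String): String =
    s"`$colName`"
}

// Register before calling spark.read.format("jdbc")...load() as above.
JdbcDialects.registerDialect(HiveDialect)
```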


Scala version: 2.11
Spark version: 2.1.0, i also tried 2.1.1
Hive version: CDH 5.7 Hive 1.1.1
Hive JDBC version: 1.1.1

This problem also occurs with later Hive versions.
I didn't find anything in the mailing list archives or on StackOverflow.
Could you please help me with this issue, or help me find the correct way to 
query a remote Hive from Spark?

Thanks in advance!


--
Best Regards,
Ayan Guha
