[
https://issues.apache.org/jira/browse/SPARK-21063?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16130472#comment-16130472
]
Varene Olivier edited comment on SPARK-21063 at 8/17/17 2:25 PM:
-
Hi,
I am experiencing the same issue with Spark 2.2.0
and HiveServer2 over HTTP transport.
{code}
// transform a Scala Map[String,String] into java.util.Properties (needed by the JDBC driver)
implicit def map2Properties(map: Map[String, String]): java.util.Properties = {
  map.foldLeft(new java.util.Properties) { case (props, (k, v)) =>
    props.put(k, v)
    props
  }
}

val url = "jdbc:hive2://remote.server:10001/myDatabase;transportMode=http;httpPath=cliservice"
// from "org.apache.hive" % "hive-jdbc" % "1.2.2"
val driver = "org.apache.hive.jdbc.HiveDriver"
val user = "myRemoteUser"
val password = "myRemotePassword"
val table = "myNonEmptyTable"
val props = Map("user" -> user, "password" -> password, "driver" -> driver)

val d = spark.read.jdbc(url, table, props)
println(d.count)
{code}
returns:
{code}
0
{code}
even though my table is not empty.
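A sketch of a possible workaround, assuming the empty result comes from Spark's generic JDBC dialect quoting column names with double quotes (which Hive parses as string literals rather than identifiers): register a custom dialect that quotes identifiers with backticks before calling {{spark.read.jdbc}}. The {{HiveDialect}} object name is my own, not from this ticket, and this is untested against the cluster above:
{code}
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

// Hypothetical dialect (not part of Spark): backticks are Hive's
// identifier-quoting syntax, so generated SELECTs reference real columns.
object HiveDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:hive2")
  override def quoteIdentifier(colName: String): String = s"`$colName`"
}

JdbcDialects.registerDialect(HiveDialect)

// then the same read as above
val d = spark.read.jdbc(url, table, props)
println(d.count)
{code}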
> Spark returns an empty result from remote hadoop cluster
> ---
>
> Key: SPARK-21063
> URL: https://issues.apache.org/jira/browse/SPARK-21063
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, SQL
> Affects Versions: 2.1.0, 2.1.1
> Reporter: Peter Bykov
>
> Spark returns an empty result when querying a remote Hadoop cluster.
> All firewall settings have been removed.
> Querying via JDBC works properly with the hive-jdbc driver from version 1.1.1.
> Code snippet is:
> {code:java}
> val spark = SparkSession.builder
>   .appName("RemoteSparkTest")
>   .master("local")
>   .getOrCreate()
> 
> val df = spark.read
>   .option("url", "jdbc:hive2://remote.hive.local:1/default")
>   .option("user", "user")
>   .option("password", "pass")
>   .option("dbtable", "test_table")
>   .option("driver", "org.apache.hive.jdbc.HiveDriver")
>   .format("jdbc")
>   .load()
>
> df.show()
> {code}
> Result:
> {noformat}
> +---+
> |test_table.test_col|
> +---+
> +---+
> {noformat}
> All manipulations like:
> {code:java}
> df.select("*").show()
> {code}
> return an empty result too.
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org